Apr 18 02:43:30.056379 ip-10-0-140-103 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 18 02:43:30.056395 ip-10-0-140-103 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 18 02:43:30.056405 ip-10-0-140-103 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 18 02:43:30.056661 ip-10-0-140-103 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 18 02:43:40.130191 ip-10-0-140-103 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 18 02:43:40.130211 ip-10-0-140-103 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot adc9add1d200466aa43c9f6a77707c7e --
Apr 18 02:45:40.357945 ip-10-0-140-103 systemd[1]: Starting Kubernetes Kubelet...
Apr 18 02:45:40.985853 ip-10-0-140-103 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 18 02:45:40.985853 ip-10-0-140-103 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 18 02:45:40.985853 ip-10-0-140-103 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 18 02:45:40.985853 ip-10-0-140-103 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 18 02:45:40.985853 ip-10-0-140-103 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 18 02:45:40.987679 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.987566 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 18 02:45:40.990637 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990611 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 18 02:45:40.990681 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990642 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 18 02:45:40.990681 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990646 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 18 02:45:40.990681 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990650 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 18 02:45:40.990681 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990653 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 18 02:45:40.990681 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990656 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 18 02:45:40.990681 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990660 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 18 02:45:40.990681 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990663 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 18 02:45:40.990681 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990666 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 18 02:45:40.990681 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990669 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 18 02:45:40.990681 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990679 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 18 02:45:40.990681 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990683 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 18 02:45:40.990681 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990686 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990689 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990693 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990696 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990699 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990702 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990704 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990707 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990710 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990713 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990715 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990718 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990721 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990723 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990726 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990728 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990731 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990734 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990737 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 18 02:45:40.990968 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990739 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990742 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990744 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990748 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990750 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990753 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990756 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990758 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990760 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990763 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990765 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990768 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990771 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990773 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990776 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990779 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990782 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990785 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990788 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990790 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 18 02:45:40.991410 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990793 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990795 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990798 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990800 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990803 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990805 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990808 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990810 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990813 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990816 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990818 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990821 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990823 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990826 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990828 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990831 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990833 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990836 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990839 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990844 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 18 02:45:40.991953 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990848 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990851 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990854 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990857 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990859 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990863 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990867 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990872 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990874 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990878 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990880 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990883 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990885 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990888 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.990890 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991880 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991887 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991890 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991893 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991897 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 18 02:45:40.992496 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991900 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991903 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991905 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991908 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991911 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991914 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991918 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991921 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991924 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991926 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991928 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991931 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991934 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991937 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991941 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991944 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991947 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991950 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991953 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 18 02:45:40.993012 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991957 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991960 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991963 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991965 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991968 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991971 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991973 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991976 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991979 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991981 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991985 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991988 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991990 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991992 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991995 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.991997 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992000 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992002 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992005 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 18 02:45:40.993750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992007 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992010 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992013 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992015 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992018 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992021 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992024 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992027 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992029 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992032 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992034 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992036 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992042 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992045 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992047 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992050 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992053 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992055 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992058 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992060 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992063 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 18 02:45:40.994320 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992066 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992069 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992071 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992074 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992077 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992079 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992081 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992084 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992086 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992089 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992091 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992094 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992097 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992100 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992102 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992105 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992107 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992111 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992114 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992117 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 18 02:45:40.994861 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992119 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.992122 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992196 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992204 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992210 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992215 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992220 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992224 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992229 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992234 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992237 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992241 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992245 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992248 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992251 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992254 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992257 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992260 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992263 2575 flags.go:64] FLAG: --cloud-config=""
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992266 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992269 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992272 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992276 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992279 2575 flags.go:64] FLAG: --config-dir=""
Apr 18 02:45:40.995352 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992282 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992285 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992289 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992292 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992295 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992299 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992302 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992305 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992308 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992311 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992314 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992318 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992322 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992325 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992329 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992332 2575 flags.go:64] FLAG: --enable-server="true"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992335 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992339 2575 flags.go:64] FLAG: --event-burst="100"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992342 2575 flags.go:64] FLAG: --event-qps="50"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992345 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992349 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992352 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992356 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992359 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992362 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 18 02:45:40.995952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992366 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992369 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992372 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992375 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992378 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992381 2575 flags.go:64] FLAG:
--fail-cgroupv1="false" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992384 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992387 2575 flags.go:64] FLAG: --feature-gates="" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992391 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992394 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992398 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992401 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992404 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992407 2575 flags.go:64] FLAG: --help="false" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992410 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-140-103.ec2.internal" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992414 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992416 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992419 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992423 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 
02:45:40.992427 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992430 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992433 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992436 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 18 02:45:40.996581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992439 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992442 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992445 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992448 2575 flags.go:64] FLAG: --kube-reserved="" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992451 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992454 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992457 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992460 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992463 2575 flags.go:64] FLAG: --lock-file="" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992465 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992468 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 18 02:45:40.997184 
ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992471 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992477 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992480 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992482 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992492 2575 flags.go:64] FLAG: --logging-format="text" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992496 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992500 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992503 2575 flags.go:64] FLAG: --manifest-url="" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992507 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992512 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992515 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992519 2575 flags.go:64] FLAG: --max-pods="110" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992523 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992526 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 18 02:45:40.997184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992529 2575 
flags.go:64] FLAG: --memory-manager-policy="None" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992532 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992536 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992539 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992542 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992551 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992555 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992558 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992561 2575 flags.go:64] FLAG: --pod-cidr="" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992564 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992570 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992573 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992576 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992579 2575 flags.go:64] FLAG: --port="10250" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 
02:45:40.992582 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992586 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f4e70e9e01179264" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992589 2575 flags.go:64] FLAG: --qos-reserved="" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992592 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992595 2575 flags.go:64] FLAG: --register-node="true" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992598 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992601 2575 flags.go:64] FLAG: --register-with-taints="" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992605 2575 flags.go:64] FLAG: --registry-burst="10" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992608 2575 flags.go:64] FLAG: --registry-qps="5" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992611 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992614 2575 flags.go:64] FLAG: --reserved-memory="" Apr 18 02:45:40.997814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992618 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992621 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992623 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992626 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 18 02:45:40.998421 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:45:40.992643 2575 flags.go:64] FLAG: --runonce="false" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992646 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992650 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992652 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992655 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992661 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992664 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992668 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992671 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992674 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992678 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992681 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992684 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992687 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992690 2575 
flags.go:64] FLAG: --system-cgroups="" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992693 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992698 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992701 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992704 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992707 2575 flags.go:64] FLAG: --tls-min-version="" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992710 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 18 02:45:40.998421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992713 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992716 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992719 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992722 2575 flags.go:64] FLAG: --v="2" Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992726 2575 flags.go:64] FLAG: --version="false" Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992730 2575 flags.go:64] FLAG: --vmodule="" Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992734 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.992738 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 18 02:45:40.999033 ip-10-0-140-103 
kubenswrapper[2575]: W0418 02:45:40.993459 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993478 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993483 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993487 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993491 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993497 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993502 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993507 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993514 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993518 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993522 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993527 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993531 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 18 
02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993540 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 18 02:45:40.999033 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993545 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993550 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993554 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993558 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993562 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993566 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993571 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993575 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993579 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993584 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993649 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993656 2575 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993669 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993674 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993679 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993705 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993735 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993742 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993748 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993753 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 18 02:45:40.999559 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993759 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993764 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993769 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993775 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 18 02:45:41.000074 
ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993780 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993787 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993792 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993802 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993807 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993811 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993816 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993821 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993826 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993831 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993835 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993839 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993844 2575 feature_gate.go:328] unrecognized feature gate: 
NewOLMWebhookProviderOpenshiftServiceCA Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993852 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993860 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993865 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 18 02:45:41.000074 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993875 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993879 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993883 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993888 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993895 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993900 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993905 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993909 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993915 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 18 02:45:41.000653 
ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993921 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993928 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993933 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993943 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.993948 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994012 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994161 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994173 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994178 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994183 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994189 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 18 02:45:41.000653 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994193 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 18 02:45:41.001150 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994198 2575 feature_gate.go:328] unrecognized feature 
gate: ClusterVersionOperatorConfiguration Apr 18 02:45:41.001150 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994202 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 18 02:45:41.001150 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994206 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 18 02:45:41.001150 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994211 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 18 02:45:41.001150 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994215 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 18 02:45:41.001150 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994219 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 18 02:45:41.001150 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994226 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 18 02:45:41.001150 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994233 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 18 02:45:41.001150 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994237 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 18 02:45:41.001150 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994242 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 18 02:45:41.001150 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:40.994247 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 18 02:45:41.001150 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:40.995242 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 18 02:45:41.002144 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.002024 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 18 02:45:41.002183 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.002147 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 18 02:45:41.002214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002196 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 18 02:45:41.002214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002202 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 18 02:45:41.002214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002206 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 18 02:45:41.002214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002209 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 18 02:45:41.002214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002212 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 18 02:45:41.002214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002214 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002217 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002221 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002224 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002226 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002229 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002231 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002236 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002241 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002244 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002246 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002249 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002252 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002255 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002258 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002261 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002264 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002266 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002269 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 18 02:45:41.002365 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002272 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002274 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002277 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002279 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002282 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002284 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002288 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002291 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002293 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002296 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002298 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002301 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002303 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002307 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002309 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002312 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002315 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002318 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002321 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002323 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 18 02:45:41.002851 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002326 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002328 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002331 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002333 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002336 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002339 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002342 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002344 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002347 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002349 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002352 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002355 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002357 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002360 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002363 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002366 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002370 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002374 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002377 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 18 02:45:41.003339 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002380 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002383 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002386 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002389 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002392 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002394 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002398 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002401 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002404 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002407 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002409 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002412 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002415 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002417 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002420 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002423 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002426 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002428 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002431 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002433 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 18 02:45:41.003822 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002436 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 18 02:45:41.004313 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002438 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 18 02:45:41.004313 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002441 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 18 02:45:41.004313 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.002447 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 18 02:45:41.004313 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002548 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 18 02:45:41.004313 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002552 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 18 02:45:41.004313 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002556 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 18 02:45:41.004313 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002559 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 18 02:45:41.004313 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002562 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 18 02:45:41.004313 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002565 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 18 02:45:41.004313 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002567 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 18 02:45:41.004313 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002570 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 18 02:45:41.004313 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002573 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 18 02:45:41.004313 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002575 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 18 02:45:41.004313 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002578 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 18 02:45:41.004313 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002581 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 18 02:45:41.004313 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002585 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002588 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002591 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002596 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002599 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002602 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002604 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002607 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002609 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002612 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002615 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002618 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002620 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002623 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002625 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002642 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002646 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002649 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002651 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 18 02:45:41.004742 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002654 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002657 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002660 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002663 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002665 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002668 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002670 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002673 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002676 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002678 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002681 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002683 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002686 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002689 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002691 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002694 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002697 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002700 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002702 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002705 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 18 02:45:41.005223 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002707 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002710 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002712 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002715 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002718 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002720 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002723 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002727 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002729 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002732 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002735 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002737 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002740 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002743 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002745 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002748 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002750 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002754 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002758 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002761 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 18 02:45:41.005734 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002764 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 18 02:45:41.006214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002767 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 18 02:45:41.006214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002770 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 18 02:45:41.006214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002772 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 18 02:45:41.006214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002775 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 18 02:45:41.006214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002778 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 18 02:45:41.006214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002780 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 18 02:45:41.006214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002783 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 18 02:45:41.006214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002785 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 18 02:45:41.006214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002788 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 18 02:45:41.006214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002791 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 18 02:45:41.006214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002793 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 18 02:45:41.006214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002795 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 18 02:45:41.006214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002798 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 18 02:45:41.006214 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:41.002800 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 18 02:45:41.006214 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.002806 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 18 02:45:41.006587 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.003520 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 18 02:45:41.008742 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.008728 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 18 02:45:41.009757 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.009720 2575 server.go:1019] "Starting client certificate rotation"
Apr 18 02:45:41.009892 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.009824 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 18 02:45:41.009892 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.009874 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 18 02:45:41.036331 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.036294 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 18 02:45:41.040308 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.040279 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 18 02:45:41.052836 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.052810 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 18 02:45:41.058373 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.058352 2575 log.go:25] "Validated CRI v1 image API"
Apr 18 02:45:41.060294 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.060272 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 18 02:45:41.062827 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.062808 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 18 02:45:41.066835 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.066811 2575 fs.go:135] Filesystem UUIDs: map[1f02cf1b-a4ca-4abf-b9a2-9b96a7a9ab2a:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 a324aa3c-3cb2-416e-8607-1b76deb91c4e:/dev/nvme0n1p3]
Apr 18 02:45:41.066907 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.066835 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 18 02:45:41.072506 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.072362 2575 manager.go:217] Machine: {Timestamp:2026-04-18 02:45:41.070695579 +0000 UTC m=+0.577158056 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098687 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f948d793ac7e5600b67fcebd0057a SystemUUID:ec2f948d-793a-c7e5-600b-67fcebd0057a BootID:adc9add1-d200-466a-a43c-9f6a77707c7e Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:49:89:0f:7a:b7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:49:89:0f:7a:b7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:76:25:c2:ef:56:98 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 18 02:45:41.073038 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.073026 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 18 02:45:41.073140 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.073128 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 18 02:45:41.074856 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.074825 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 18 02:45:41.075012 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.074860 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-103.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 18 02:45:41.075063 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.075018 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 18 02:45:41.075063 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.075027 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 18 02:45:41.075063 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.075039
2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 18 02:45:41.076133 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.076121 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 18 02:45:41.077509 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.077495 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 18 02:45:41.077643 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.077622 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 18 02:45:41.079823 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.079812 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 18 02:45:41.079861 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.079828 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 18 02:45:41.079861 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.079840 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 18 02:45:41.079861 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.079850 2575 kubelet.go:397] "Adding apiserver pod source" Apr 18 02:45:41.079861 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.079861 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 18 02:45:41.081120 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.081107 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 18 02:45:41.081172 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.081126 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 18 02:45:41.084655 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.084617 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 18 02:45:41.086073 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:45:41.086060 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 18 02:45:41.087838 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.087819 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 18 02:45:41.087838 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.087839 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 18 02:45:41.087964 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.087845 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 18 02:45:41.087964 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.087850 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 18 02:45:41.087964 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.087856 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 18 02:45:41.087964 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.087862 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 18 02:45:41.087964 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.087868 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 18 02:45:41.087964 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.087873 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 18 02:45:41.087964 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.087880 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 18 02:45:41.087964 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.087886 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 18 02:45:41.087964 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.087897 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 18 
02:45:41.087964 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.087906 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 18 02:45:41.088824 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.088813 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 18 02:45:41.088824 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.088824 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 18 02:45:41.092413 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.092394 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 18 02:45:41.092552 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.092432 2575 server.go:1295] "Started kubelet" Apr 18 02:45:41.092552 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.092521 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 18 02:45:41.093225 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.093112 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 18 02:45:41.093328 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.093239 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 18 02:45:41.093346 ip-10-0-140-103 systemd[1]: Started Kubernetes Kubelet. 
Apr 18 02:45:41.093893 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.093750 2575 server.go:317] "Adding debug handlers to kubelet server"
Apr 18 02:45:41.094052 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.094034 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-103.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 18 02:45:41.094113 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.094026 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-103.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 18 02:45:41.097155 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.097126 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 18 02:45:41.097473 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.097447 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 18 02:45:41.101772 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.101746 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 18 02:45:41.102299 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.102281 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 18 02:45:41.103115 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.103093 2575 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 18 02:45:41.103218 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.103132 2575 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 18 02:45:41.103218 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.103145 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-103.ec2.internal\" not found"
Apr 18 02:45:41.103357 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.103224 2575 reconstruct.go:97] "Volume reconstruction finished"
Apr 18 02:45:41.103357 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.103234 2575 reconciler.go:26] "Reconciler: start to sync state"
Apr 18 02:45:41.103357 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.103239 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 18 02:45:41.103357 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.103093 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 18 02:45:41.103517 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.103476 2575 factory.go:55] Registering systemd factory
Apr 18 02:45:41.103517 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.103493 2575 factory.go:223] Registration of the systemd container factory successfully
Apr 18 02:45:41.103795 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.103774 2575 factory.go:153] Registering CRI-O factory
Apr 18 02:45:41.103795 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.103796 2575 factory.go:223] Registration of the crio container factory successfully
Apr 18 02:45:41.103916 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.103889 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 18 02:45:41.103916 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.103908 2575 factory.go:103] Registering Raw factory
Apr 18 02:45:41.104012 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.103932 2575 manager.go:1196] Started watching for new ooms in manager
Apr 18 02:45:41.104012 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.103985 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lbc4d"
Apr 18 02:45:41.104329 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.104313 2575 manager.go:319] Starting recovery of all containers
Apr 18 02:45:41.104470 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.103501 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-103.ec2.internal.18a7537541471100 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-103.ec2.internal,UID:ip-10-0-140-103.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-103.ec2.internal,},FirstTimestamp:2026-04-18 02:45:41.092405504 +0000 UTC m=+0.598867772,LastTimestamp:2026-04-18 02:45:41.092405504 +0000 UTC m=+0.598867772,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-103.ec2.internal,}"
Apr 18 02:45:41.111289 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.111263 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lbc4d"
Apr 18 02:45:41.111384 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.111293 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 18 02:45:41.111457 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.111430 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-103.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 18 02:45:41.115228 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.115210 2575 manager.go:324] Recovery completed
Apr 18 02:45:41.116479 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.116448 2575 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 18 02:45:41.119396 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.119383 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 18 02:45:41.123158 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.123143 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 18 02:45:41.123220 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.123171 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 18 02:45:41.123220 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.123182 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasSufficientPID"
Apr 18 02:45:41.123704 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.123689 2575 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 18 02:45:41.123704 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.123701 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 18 02:45:41.123808 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.123718 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 18 02:45:41.125593 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.125527 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-103.ec2.internal.18a75375431c4ef3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-103.ec2.internal,UID:ip-10-0-140-103.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-103.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-103.ec2.internal,},FirstTimestamp:2026-04-18 02:45:41.123157747 +0000 UTC m=+0.629620011,LastTimestamp:2026-04-18 02:45:41.123157747 +0000 UTC m=+0.629620011,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-103.ec2.internal,}"
Apr 18 02:45:41.126739 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.126723 2575 policy_none.go:49] "None policy: Start"
Apr 18 02:45:41.126739 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.126741 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 18 02:45:41.126838 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.126751 2575 state_mem.go:35] "Initializing new in-memory state store"
Apr 18 02:45:41.161322 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.161304 2575 manager.go:341] "Starting Device Plugin manager"
Apr 18 02:45:41.185174 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.161409 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 18 02:45:41.185174 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.161425 2575 server.go:85] "Starting device plugin registration server"
Apr 18 02:45:41.185174 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.161706 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 18 02:45:41.185174 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.161716 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 18 02:45:41.185174 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.161803 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 18 02:45:41.185174 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.161875 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 18 02:45:41.185174 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.161884 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 18 02:45:41.185174 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.162812 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 18 02:45:41.185174 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.162845 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-103.ec2.internal\" not found"
Apr 18 02:45:41.185174 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.169550 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 18 02:45:41.185174 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.170711 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 18 02:45:41.185174 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.170734 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 18 02:45:41.185174 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.170754 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 18 02:45:41.185174 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.170762 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 18 02:45:41.185174 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.170794 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 18 02:45:41.185174 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.174548 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 18 02:45:41.262547 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.262472 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 18 02:45:41.265658 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.265627 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 18 02:45:41.265734 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.265675 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 18 02:45:41.265734 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.265690 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasSufficientPID"
Apr 18 02:45:41.265734 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.265722 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-103.ec2.internal"
Apr 18 02:45:41.271466 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.271446 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-103.ec2.internal"]
Apr 18 02:45:41.271523 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.271515 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 18 02:45:41.273519 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.273501 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-103.ec2.internal"
Apr 18 02:45:41.273567 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.273525 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-103.ec2.internal\": node \"ip-10-0-140-103.ec2.internal\" not found"
Apr 18 02:45:41.274786 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.274769 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 18 02:45:41.274884 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.274801 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 18 02:45:41.274884 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.274814 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasSufficientPID"
Apr 18 02:45:41.277058 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.277043 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 18 02:45:41.277190 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.277174 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal"
Apr 18 02:45:41.277250 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.277210 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 18 02:45:41.277837 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.277818 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 18 02:45:41.277962 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.277850 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 18 02:45:41.277962 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.277819 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 18 02:45:41.277962 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.277862 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasSufficientPID"
Apr 18 02:45:41.277962 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.277885 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 18 02:45:41.277962 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.277902 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasSufficientPID"
Apr 18 02:45:41.280205 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.280189 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-103.ec2.internal"
Apr 18 02:45:41.280286 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.280221 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 18 02:45:41.280892 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.280876 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasSufficientMemory"
Apr 18 02:45:41.280964 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.280901 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasNoDiskPressure"
Apr 18 02:45:41.280964 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.280913 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeHasSufficientPID"
Apr 18 02:45:41.287871 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.287854 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-103.ec2.internal\" not found"
Apr 18 02:45:41.293192 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.293177 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-103.ec2.internal\" not found" node="ip-10-0-140-103.ec2.internal"
Apr 18 02:45:41.297065 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.297050 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-103.ec2.internal\" not found" node="ip-10-0-140-103.ec2.internal"
Apr 18 02:45:41.388084 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.388041 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-103.ec2.internal\" not found"
Apr 18 02:45:41.404431 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.404406 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/881cc9f87b49496f030449959b9b3b21-config\") pod \"kube-apiserver-proxy-ip-10-0-140-103.ec2.internal\" (UID: \"881cc9f87b49496f030449959b9b3b21\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-103.ec2.internal"
Apr 18 02:45:41.404528 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.404439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e2bcc8811e31056862bf836a7191d135-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal\" (UID: \"e2bcc8811e31056862bf836a7191d135\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal"
Apr 18 02:45:41.404528 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.404463 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2bcc8811e31056862bf836a7191d135-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal\" (UID: \"e2bcc8811e31056862bf836a7191d135\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal"
Apr 18 02:45:41.488548 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.488507 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-103.ec2.internal\" not found"
Apr 18 02:45:41.504836 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.504810 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e2bcc8811e31056862bf836a7191d135-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal\" (UID: \"e2bcc8811e31056862bf836a7191d135\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal"
Apr 18 02:45:41.504888 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.504842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2bcc8811e31056862bf836a7191d135-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal\" (UID: \"e2bcc8811e31056862bf836a7191d135\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal"
Apr 18 02:45:41.504888 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.504864 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/881cc9f87b49496f030449959b9b3b21-config\") pod \"kube-apiserver-proxy-ip-10-0-140-103.ec2.internal\" (UID: \"881cc9f87b49496f030449959b9b3b21\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-103.ec2.internal"
Apr 18 02:45:41.504955 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.504918 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/881cc9f87b49496f030449959b9b3b21-config\") pod \"kube-apiserver-proxy-ip-10-0-140-103.ec2.internal\" (UID: \"881cc9f87b49496f030449959b9b3b21\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-103.ec2.internal"
Apr 18 02:45:41.504955 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.504944 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e2bcc8811e31056862bf836a7191d135-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal\" (UID: \"e2bcc8811e31056862bf836a7191d135\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal"
Apr 18 02:45:41.505014 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.504918 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2bcc8811e31056862bf836a7191d135-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal\" (UID: \"e2bcc8811e31056862bf836a7191d135\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal"
Apr 18 02:45:41.589305 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.589248 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-103.ec2.internal\" not found"
Apr 18 02:45:41.595438 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.595409 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal"
Apr 18 02:45:41.599761 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:41.599725 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-103.ec2.internal"
Apr 18 02:45:41.689684 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.689622 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-103.ec2.internal\" not found"
Apr 18 02:45:41.790157 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.790127 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-103.ec2.internal\" not found"
Apr 18 02:45:41.890690 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.890590 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-103.ec2.internal\" not found"
Apr 18 02:45:41.991122 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:41.991081 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-103.ec2.internal\" not found"
Apr 18 02:45:42.009324 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.009299 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 18 02:45:42.009461 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.009443 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 18 02:45:42.083550 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:42.083515 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod881cc9f87b49496f030449959b9b3b21.slice/crio-396b4024b6ce89f20e93b485ffa82384f7f962e419768fd4682e554b6f857d19 WatchSource:0}: Error finding container 396b4024b6ce89f20e93b485ffa82384f7f962e419768fd4682e554b6f857d19: Status 404 returned error can't find the container with id 396b4024b6ce89f20e93b485ffa82384f7f962e419768fd4682e554b6f857d19
Apr 18 02:45:42.083936 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:42.083909 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2bcc8811e31056862bf836a7191d135.slice/crio-f050a32b646d5e6fb4c2d96d7c328b5dd80c54ce9f68ea58a9f6fa429096bbef WatchSource:0}: Error finding container f050a32b646d5e6fb4c2d96d7c328b5dd80c54ce9f68ea58a9f6fa429096bbef: Status 404 returned error can't find the container with id f050a32b646d5e6fb4c2d96d7c328b5dd80c54ce9f68ea58a9f6fa429096bbef
Apr 18 02:45:42.088815 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.088794 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 18 02:45:42.091743 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:42.091726 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-103.ec2.internal\" not found"
Apr 18 02:45:42.101324 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.101297 2575 reflector.go:430]
"Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 18 02:45:42.101837 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.101823 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 18 02:45:42.102996 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.102976 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal" Apr 18 02:45:42.111023 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.111002 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 18 02:45:42.113717 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.113693 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-17 02:40:41 +0000 UTC" deadline="2027-10-18 23:03:06.458797718 +0000 UTC" Apr 18 02:45:42.113780 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.113717 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13172h17m24.345083189s" Apr 18 02:45:42.115988 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.115973 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 18 02:45:42.117820 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.117806 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-103.ec2.internal" Apr 18 02:45:42.125702 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.125684 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is 
recommended: [must not contain dots]" Apr 18 02:45:42.132327 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.132304 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-hc8bp" Apr 18 02:45:42.141363 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.141301 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-hc8bp" Apr 18 02:45:42.173764 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.173533 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal" event={"ID":"e2bcc8811e31056862bf836a7191d135","Type":"ContainerStarted","Data":"f050a32b646d5e6fb4c2d96d7c328b5dd80c54ce9f68ea58a9f6fa429096bbef"} Apr 18 02:45:42.174505 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.174477 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-103.ec2.internal" event={"ID":"881cc9f87b49496f030449959b9b3b21","Type":"ContainerStarted","Data":"396b4024b6ce89f20e93b485ffa82384f7f962e419768fd4682e554b6f857d19"} Apr 18 02:45:42.580462 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.580383 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 18 02:45:42.673000 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:42.672840 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 18 02:45:43.081724 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.081694 2575 apiserver.go:52] "Watching apiserver" Apr 18 02:45:43.105053 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.105027 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 18 02:45:43.105459 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.105434 2575 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["openshift-image-registry/node-ca-kbc68","openshift-multus/multus-ckptk","openshift-multus/network-metrics-daemon-l8m94","openshift-network-diagnostics/network-check-target-kd89b","openshift-ovn-kubernetes/ovnkube-node-lsz2j","kube-system/kube-apiserver-proxy-ip-10-0-140-103.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws","openshift-cluster-node-tuning-operator/tuned-wv9n8","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal","openshift-multus/multus-additional-cni-plugins-rcpsj","openshift-network-operator/iptables-alerter-lb7px","kube-system/konnectivity-agent-c22bt"] Apr 18 02:45:43.115881 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.115855 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kbc68" Apr 18 02:45:43.118131 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.118097 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wff7q\"" Apr 18 02:45:43.118471 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.118449 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 18 02:45:43.118595 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.118483 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 18 02:45:43.118595 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.118513 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 18 02:45:43.121980 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.121958 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-ckptk" Apr 18 02:45:43.124313 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.124184 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qb6fj\"" Apr 18 02:45:43.124313 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.124200 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 18 02:45:43.124313 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.124288 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 18 02:45:43.124498 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.124362 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 18 02:45:43.124498 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.124403 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 18 02:45:43.126303 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.126189 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:45:43.126303 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:43.126264 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8m94" podUID="06d32427-f8ec-4151-bb49-8eaef8308f79" Apr 18 02:45:43.129538 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.129490 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:45:43.129646 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:43.129575 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd89b" podUID="bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce" Apr 18 02:45:43.131718 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.131699 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.133087 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.133070 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws" Apr 18 02:45:43.134667 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.134647 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.135624 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.135187 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tqqp8\"" Apr 18 02:45:43.135624 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.135200 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 18 02:45:43.135624 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.135222 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 18 02:45:43.135624 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.135251 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 18 02:45:43.135624 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.135320 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 18 02:45:43.135624 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.135357 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 18 02:45:43.135624 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.135366 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 18 02:45:43.135624 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.135468 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 18 02:45:43.135624 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.135488 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 18 02:45:43.135624 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.135546 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-6mdf7\"" Apr 18 02:45:43.136125 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.135813 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 18 02:45:43.136176 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.136160 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rcpsj" Apr 18 02:45:43.136775 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.136754 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 18 02:45:43.136775 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.136769 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 18 02:45:43.136930 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.136809 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fqngg\"" Apr 18 02:45:43.138192 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.138171 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-lb7px" Apr 18 02:45:43.138592 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.138566 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 18 02:45:43.138703 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.138663 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 18 02:45:43.138762 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.138711 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-klbqc\"" Apr 18 02:45:43.140251 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.140233 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-c22bt" Apr 18 02:45:43.140790 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.140772 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lv478\"" Apr 18 02:45:43.141396 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.140782 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 18 02:45:43.141479 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.141427 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 18 02:45:43.141479 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.140811 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 18 02:45:43.142047 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.142016 2575 certificate_manager.go:715] "Certificate rotation deadline determined" 
logger="kubernetes.io/kubelet-serving" expiration="2028-04-17 02:40:42 +0000 UTC" deadline="2028-01-02 10:43:27.071545146 +0000 UTC" Apr 18 02:45:43.142135 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.142050 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14983h57m43.929501162s" Apr 18 02:45:43.142698 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.142675 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gsrlj\"" Apr 18 02:45:43.142979 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.142960 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 18 02:45:43.143249 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.143233 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 18 02:45:43.204376 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.204352 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 18 02:45:43.212452 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212423 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-etc-selinux\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws" Apr 18 02:45:43.212589 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212466 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-system-cni-dir\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " 
pod="openshift-multus/multus-additional-cni-plugins-rcpsj" Apr 18 02:45:43.212589 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212505 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/681e4af6-1800-4683-b967-ed0bb5c1f635-agent-certs\") pod \"konnectivity-agent-c22bt\" (UID: \"681e4af6-1800-4683-b967-ed0bb5c1f635\") " pod="kube-system/konnectivity-agent-c22bt" Apr 18 02:45:43.212589 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212532 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-modprobe-d\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.212589 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212575 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-kubernetes\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.212806 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212611 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/503428d7-6339-49da-8858-3fa5446cf42d-tmp\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.212806 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212657 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-cnibin\") pod 
\"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj" Apr 18 02:45:43.212806 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212683 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs\") pod \"network-metrics-daemon-l8m94\" (UID: \"06d32427-f8ec-4151-bb49-8eaef8308f79\") " pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:45:43.212806 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212705 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5888\" (UniqueName: \"kubernetes.io/projected/06d32427-f8ec-4151-bb49-8eaef8308f79-kube-api-access-s5888\") pod \"network-metrics-daemon-l8m94\" (UID: \"06d32427-f8ec-4151-bb49-8eaef8308f79\") " pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:45:43.212806 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212731 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-var-lib-openvswitch\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.212806 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212764 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-cni-binary-copy\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj" Apr 18 02:45:43.212806 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212791 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-socket-dir\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws" Apr 18 02:45:43.212806 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212807 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-registration-dir\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws" Apr 18 02:45:43.213151 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212826 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj" Apr 18 02:45:43.213151 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212850 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-var-lib-kubelet\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk" Apr 18 02:45:43.213151 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212888 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-slash\") pod \"ovnkube-node-lsz2j\" (UID: 
\"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.213151 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212910 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-sys-fs\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws" Apr 18 02:45:43.213151 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212925 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-sysconfig\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.213151 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212940 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-run\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.213151 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212964 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f3505841-3e2c-4c80-b017-3f4e6f96512d-iptables-alerter-script\") pod \"iptables-alerter-lb7px\" (UID: \"f3505841-3e2c-4c80-b017-3f4e6f96512d\") " pod="openshift-network-operator/iptables-alerter-lb7px" Apr 18 02:45:43.213151 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212982 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-kubelet\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.213151 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.212998 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-cni-bin\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.213151 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213028 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79a997b9-1bc8-4876-8383-936b5403b5e5-ovnkube-config\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.213151 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-sysctl-d\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.213151 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213069 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a03893d3-b1b4-4eac-93ff-24aa692c02ec-serviceca\") pod \"node-ca-kbc68\" (UID: \"a03893d3-b1b4-4eac-93ff-24aa692c02ec\") " pod="openshift-image-registry/node-ca-kbc68"
Apr 18 02:45:43.213151 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213088 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/681e4af6-1800-4683-b967-ed0bb5c1f635-konnectivity-ca\") pod \"konnectivity-agent-c22bt\" (UID: \"681e4af6-1800-4683-b967-ed0bb5c1f635\") " pod="kube-system/konnectivity-agent-c22bt"
Apr 18 02:45:43.213151 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213139 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fb81bf64-9170-43bc-a130-f32b09124d60-cni-binary-copy\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.213718 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213168 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-multus-socket-dir-parent\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.213718 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213198 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-run-openvswitch\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.213718 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-node-log\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.213718 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213246 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-systemd\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.213718 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213279 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-host\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.213718 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213307 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtzld\" (UniqueName: \"kubernetes.io/projected/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-kube-api-access-rtzld\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj"
Apr 18 02:45:43.213718 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxxdr\" (UniqueName: \"kubernetes.io/projected/fb81bf64-9170-43bc-a130-f32b09124d60-kube-api-access-jxxdr\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.213718 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213356 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-lib-modules\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.213718 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213388 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/503428d7-6339-49da-8858-3fa5446cf42d-etc-tuned\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.213718 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213414 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl5v5\" (UniqueName: \"kubernetes.io/projected/f3505841-3e2c-4c80-b017-3f4e6f96512d-kube-api-access-xl5v5\") pod \"iptables-alerter-lb7px\" (UID: \"f3505841-3e2c-4c80-b017-3f4e6f96512d\") " pod="openshift-network-operator/iptables-alerter-lb7px"
Apr 18 02:45:43.213718 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213437 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqqnz\" (UniqueName: \"kubernetes.io/projected/7812fd91-413a-4b89-aecd-dcfe89d98dd7-kube-api-access-zqqnz\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws"
Apr 18 02:45:43.213718 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213480 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-run-k8s-cni-cncf-io\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.213718 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213508 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-multus-conf-dir\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.213718 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213557 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.213718 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79a997b9-1bc8-4876-8383-936b5403b5e5-ovn-node-metrics-cert\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.213718 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213659 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws"
Apr 18 02:45:43.214312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213682 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-sys\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") "
pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.214312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj"
Apr 18 02:45:43.214312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213731 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-multus-cni-dir\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.214312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213753 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-run-ovn\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.214312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213790 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/79a997b9-1bc8-4876-8383-936b5403b5e5-ovnkube-script-lib\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.214312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-sysctl-conf\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.214312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-hostroot\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.214312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213860 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-run-multus-certs\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.214312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213881 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-log-socket\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.214312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213902 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-cni-netd\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.214312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213925 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79a997b9-1bc8-4876-8383-936b5403b5e5-env-overrides\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.214312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213948 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-var-lib-kubelet\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.214312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.213979 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-system-cni-dir\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.214312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214015 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-cnibin\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.214312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214038 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-run-netns\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.214312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214063 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-var-lib-cni-bin\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.214312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214084 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-run-systemd\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.215093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214099 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-etc-openvswitch\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.215093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214116 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-device-dir\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws"
Apr 18 02:45:43.215093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214152 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-run-netns\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.215093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214187 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.215093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214220 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95q5g\" (UniqueName: \"kubernetes.io/projected/a03893d3-b1b4-4eac-93ff-24aa692c02ec-kube-api-access-95q5g\") pod \"node-ca-kbc68\" (UID: \"a03893d3-b1b4-4eac-93ff-24aa692c02ec\") " pod="openshift-image-registry/node-ca-kbc68"
Apr 18 02:45:43.215093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214248 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj"
Apr 18 02:45:43.215093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214286 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3505841-3e2c-4c80-b017-3f4e6f96512d-host-slash\") pod \"iptables-alerter-lb7px\" (UID: \"f3505841-3e2c-4c80-b017-3f4e6f96512d\") " pod="openshift-network-operator/iptables-alerter-lb7px"
Apr 18 02:45:43.215093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214310 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\"
(UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-os-release\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.215093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-var-lib-cni-multus\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.215093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214364 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhdj4\" (UniqueName: \"kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4\") pod \"network-check-target-kd89b\" (UID: \"bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce\") " pod="openshift-network-diagnostics/network-check-target-kd89b"
Apr 18 02:45:43.215093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214387 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a03893d3-b1b4-4eac-93ff-24aa692c02ec-host\") pod \"node-ca-kbc68\" (UID: \"a03893d3-b1b4-4eac-93ff-24aa692c02ec\") " pod="openshift-image-registry/node-ca-kbc68"
Apr 18 02:45:43.215093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214410 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-os-release\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj"
Apr 18 02:45:43.215093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214446 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fb81bf64-9170-43bc-a130-f32b09124d60-multus-daemon-config\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.215093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214487 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-etc-kubernetes\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.215093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214525 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-systemd-units\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.215093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214551 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjxml\" (UniqueName: \"kubernetes.io/projected/503428d7-6339-49da-8858-3fa5446cf42d-kube-api-access-gjxml\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.215822 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.214605 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl46l\" (UniqueName: \"kubernetes.io/projected/79a997b9-1bc8-4876-8383-936b5403b5e5-kube-api-access-kl46l\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.314866 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.314826 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-var-lib-kubelet\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.315041 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.314872 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-system-cni-dir\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.315041 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.314905 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-cnibin\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.315041 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.314946 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-run-netns\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.315180 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.315054 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-run-netns\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.315261 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.315226 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-cnibin\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.315325 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.314938 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-var-lib-kubelet\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.315364 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.314946 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-system-cni-dir\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.316087 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.315437 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-var-lib-cni-bin\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.316087 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.314971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-var-lib-cni-bin\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.316087 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.315499 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-run-systemd\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.316087 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.315533 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-etc-openvswitch\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.316087 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.315605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-device-dir\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws"
Apr 18 02:45:43.316087 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.315644 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-run-systemd\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.316087 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.315675 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-etc-openvswitch\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.316567 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.315755
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-device-dir\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws"
Apr 18 02:45:43.316567 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-run-netns\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.316567 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316264 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.316567 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316290 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95q5g\" (UniqueName: \"kubernetes.io/projected/a03893d3-b1b4-4eac-93ff-24aa692c02ec-kube-api-access-95q5g\") pod \"node-ca-kbc68\" (UID: \"a03893d3-b1b4-4eac-93ff-24aa692c02ec\") " pod="openshift-image-registry/node-ca-kbc68"
Apr 18 02:45:43.316567 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316322 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj"
Apr 18 02:45:43.316567 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316352 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3505841-3e2c-4c80-b017-3f4e6f96512d-host-slash\") pod \"iptables-alerter-lb7px\" (UID: \"f3505841-3e2c-4c80-b017-3f4e6f96512d\") " pod="openshift-network-operator/iptables-alerter-lb7px"
Apr 18 02:45:43.316567 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316384 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-os-release\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.316567 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-var-lib-cni-multus\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.316567 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316439 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhdj4\" (UniqueName: \"kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4\") pod \"network-check-target-kd89b\" (UID: \"bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce\") " pod="openshift-network-diagnostics/network-check-target-kd89b"
Apr 18 02:45:43.316567 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316469 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a03893d3-b1b4-4eac-93ff-24aa692c02ec-host\") pod \"node-ca-kbc68\" (UID: \"a03893d3-b1b4-4eac-93ff-24aa692c02ec\") " pod="openshift-image-registry/node-ca-kbc68"
Apr 18 02:45:43.316567 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316508 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-os-release\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj"
Apr 18 02:45:43.316567 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316538 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fb81bf64-9170-43bc-a130-f32b09124d60-multus-daemon-config\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.316567 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-etc-kubernetes\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.317213 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316593 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-systemd-units\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.317213 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316623 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjxml\" (UniqueName: \"kubernetes.io/projected/503428d7-6339-49da-8858-3fa5446cf42d-kube-api-access-gjxml\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.317213 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316667 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kl46l\" (UniqueName: \"kubernetes.io/projected/79a997b9-1bc8-4876-8383-936b5403b5e5-kube-api-access-kl46l\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.317213 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316698 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-etc-selinux\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws"
Apr 18 02:45:43.317213 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-system-cni-dir\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj"
Apr 18 02:45:43.317213 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316760 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/681e4af6-1800-4683-b967-ed0bb5c1f635-agent-certs\") pod \"konnectivity-agent-c22bt\" (UID: \"681e4af6-1800-4683-b967-ed0bb5c1f635\") " pod="kube-system/konnectivity-agent-c22bt"
Apr 18 02:45:43.317213 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316789 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-modprobe-d\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.317213 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316811 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-kubernetes\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.317213 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/503428d7-6339-49da-8858-3fa5446cf42d-tmp\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.317213 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316869 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-cnibin\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj"
Apr 18 02:45:43.317213 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs\") pod \"network-metrics-daemon-l8m94\" (UID: \"06d32427-f8ec-4151-bb49-8eaef8308f79\") " pod="openshift-multus/network-metrics-daemon-l8m94"
Apr 18 02:45:43.317213 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5888\" (UniqueName: \"kubernetes.io/projected/06d32427-f8ec-4151-bb49-8eaef8308f79-kube-api-access-s5888\") pod \"network-metrics-daemon-l8m94\" (UID: \"06d32427-f8ec-4151-bb49-8eaef8308f79\") " pod="openshift-multus/network-metrics-daemon-l8m94"
Apr 18 02:45:43.317213 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316957 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-var-lib-openvswitch\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.317213 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.316988 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-cni-binary-copy\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj"
Apr 18 02:45:43.317213 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317018 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-socket-dir\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws"
Apr 18 02:45:43.317213 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-registration-dir\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws"
Apr 18 02:45:43.317213
ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317086 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-var-lib-kubelet\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317141 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-slash\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317170 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-sys-fs\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317202 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-sysconfig\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317230 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-run\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317265 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f3505841-3e2c-4c80-b017-3f4e6f96512d-iptables-alerter-script\") pod \"iptables-alerter-lb7px\" (UID: \"f3505841-3e2c-4c80-b017-3f4e6f96512d\") " pod="openshift-network-operator/iptables-alerter-lb7px" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317295 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-kubelet\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317320 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-cni-bin\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317353 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79a997b9-1bc8-4876-8383-936b5403b5e5-ovnkube-config\") pod \"ovnkube-node-lsz2j\" (UID: 
\"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317384 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-sysctl-d\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a03893d3-b1b4-4eac-93ff-24aa692c02ec-serviceca\") pod \"node-ca-kbc68\" (UID: \"a03893d3-b1b4-4eac-93ff-24aa692c02ec\") " pod="openshift-image-registry/node-ca-kbc68" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317451 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/681e4af6-1800-4683-b967-ed0bb5c1f635-konnectivity-ca\") pod \"konnectivity-agent-c22bt\" (UID: \"681e4af6-1800-4683-b967-ed0bb5c1f635\") " pod="kube-system/konnectivity-agent-c22bt" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317477 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fb81bf64-9170-43bc-a130-f32b09124d60-cni-binary-copy\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317508 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-multus-socket-dir-parent\") pod \"multus-ckptk\" (UID: 
\"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-run-openvswitch\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317569 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-node-log\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317601 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-systemd\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.317943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317644 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-host\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.318730 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-system-cni-dir\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: 
\"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj" Apr 18 02:45:43.318730 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317744 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.318730 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-registration-dir\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws" Apr 18 02:45:43.318730 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317848 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-multus-socket-dir-parent\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk" Apr 18 02:45:43.318730 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317879 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-cni-bin\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.318730 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.317938 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-var-lib-kubelet\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk" Apr 18 02:45:43.318730 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.318032 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-slash\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.318730 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.318102 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-sys-fs\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws" Apr 18 02:45:43.318730 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.318158 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-sysconfig\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.318730 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.318375 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-cni-binary-copy\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj" Apr 18 02:45:43.319167 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.318916 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-etc-kubernetes\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk" Apr 18 02:45:43.319167 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.318929 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-sysctl-d\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.319167 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.318953 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fb81bf64-9170-43bc-a130-f32b09124d60-cni-binary-copy\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk" Apr 18 02:45:43.319167 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.318978 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-node-log\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.319167 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.318986 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-run-openvswitch\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.319167 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319099 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-run-netns\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.319167 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319132 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-modprobe-d\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.319167 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319156 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a03893d3-b1b4-4eac-93ff-24aa692c02ec-serviceca\") pod \"node-ca-kbc68\" (UID: \"a03893d3-b1b4-4eac-93ff-24aa692c02ec\") " pod="openshift-image-registry/node-ca-kbc68" Apr 18 02:45:43.319510 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319160 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-kubelet\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.319510 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319203 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/681e4af6-1800-4683-b967-ed0bb5c1f635-konnectivity-ca\") pod \"konnectivity-agent-c22bt\" (UID: \"681e4af6-1800-4683-b967-ed0bb5c1f635\") " pod="kube-system/konnectivity-agent-c22bt" Apr 18 02:45:43.319510 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319241 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-systemd\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.319510 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319267 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-var-lib-openvswitch\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.319510 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319231 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f3505841-3e2c-4c80-b017-3f4e6f96512d-iptables-alerter-script\") pod \"iptables-alerter-lb7px\" (UID: \"f3505841-3e2c-4c80-b017-3f4e6f96512d\") " pod="openshift-network-operator/iptables-alerter-lb7px" Apr 18 02:45:43.319510 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319329 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-systemd-units\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:45:43.319510 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:43.319335 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:45:43.319510 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a03893d3-b1b4-4eac-93ff-24aa692c02ec-host\") pod \"node-ca-kbc68\" (UID: \"a03893d3-b1b4-4eac-93ff-24aa692c02ec\") " 
pod="openshift-image-registry/node-ca-kbc68" Apr 18 02:45:43.319510 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319380 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-cnibin\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj" Apr 18 02:45:43.319510 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319455 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-run\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.319510 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319469 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-os-release\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj" Apr 18 02:45:43.319510 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:43.319499 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs podName:06d32427-f8ec-4151-bb49-8eaef8308f79 nodeName:}" failed. No retries permitted until 2026-04-18 02:45:43.819402291 +0000 UTC m=+3.325864566 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs") pod "network-metrics-daemon-l8m94" (UID: "06d32427-f8ec-4151-bb49-8eaef8308f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:45:43.320070 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319530 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtzld\" (UniqueName: \"kubernetes.io/projected/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-kube-api-access-rtzld\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj" Apr 18 02:45:43.320070 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319607 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-kubernetes\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.320070 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxxdr\" (UniqueName: \"kubernetes.io/projected/fb81bf64-9170-43bc-a130-f32b09124d60-kube-api-access-jxxdr\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk" Apr 18 02:45:43.320070 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319777 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-socket-dir\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws" Apr 18 02:45:43.320070 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:45:43.319853 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-lib-modules\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.320070 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319884 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj" Apr 18 02:45:43.320070 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319875 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 18 02:45:43.320070 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319886 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-lib-modules\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.320070 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319970 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/503428d7-6339-49da-8858-3fa5446cf42d-etc-tuned\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" Apr 18 02:45:43.320070 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.319975 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3505841-3e2c-4c80-b017-3f4e6f96512d-host-slash\") pod \"iptables-alerter-lb7px\" (UID: \"f3505841-3e2c-4c80-b017-3f4e6f96512d\") " pod="openshift-network-operator/iptables-alerter-lb7px" Apr 18 02:45:43.320070 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320001 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl5v5\" (UniqueName: \"kubernetes.io/projected/f3505841-3e2c-4c80-b017-3f4e6f96512d-kube-api-access-xl5v5\") pod \"iptables-alerter-lb7px\" (UID: \"f3505841-3e2c-4c80-b017-3f4e6f96512d\") " pod="openshift-network-operator/iptables-alerter-lb7px" Apr 18 02:45:43.320070 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqqnz\" (UniqueName: \"kubernetes.io/projected/7812fd91-413a-4b89-aecd-dcfe89d98dd7-kube-api-access-zqqnz\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws" Apr 18 02:45:43.320070 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-run-k8s-cni-cncf-io\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk" Apr 18 02:45:43.320653 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-multus-conf-dir\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk" Apr 18 02:45:43.320653 ip-10-0-140-103 kubenswrapper[2575]: I0418 
02:45:43.320151 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.320653 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320188 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.320653 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320221 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fb81bf64-9170-43bc-a130-f32b09124d60-multus-daemon-config\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.320653 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320234 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79a997b9-1bc8-4876-8383-936b5403b5e5-ovn-node-metrics-cert\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.320653 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320263 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws"
Apr 18 02:45:43.320653 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320285 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-sys\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.320653 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-host\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.320653 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320336 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj"
Apr 18 02:45:43.320653 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320378 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-multus-cni-dir\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.320653 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-run-ovn\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.320653 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/79a997b9-1bc8-4876-8383-936b5403b5e5-ovnkube-script-lib\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.320653 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320596 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-sysctl-conf\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.320653 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320597 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-var-lib-cni-multus\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.321370 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320039 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-os-release\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.321370 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320716 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-etc-sysctl-conf\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.321370 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320737 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-run-ovn\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.321370 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320750 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-run-k8s-cni-cncf-io\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.321370 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320761 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-hostroot\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.321370 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320800 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-run-multus-certs\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.321370 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320825 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-etc-selinux\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws"
Apr 18 02:45:43.321370 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-log-socket\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.321370 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320899 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-cni-netd\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.321370 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320902 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-hostroot\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.321370 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.320899 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79a997b9-1bc8-4876-8383-936b5403b5e5-ovnkube-config\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.321370 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.321053 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-multus-cni-dir\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.321370 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.321270 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/503428d7-6339-49da-8858-3fa5446cf42d-sys\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.321370 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.321269 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj"
Apr 18 02:45:43.321370 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.321325 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-host-run-multus-certs\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.322099 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.321404 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/79a997b9-1bc8-4876-8383-936b5403b5e5-ovnkube-script-lib\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.322099 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.321413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79a997b9-1bc8-4876-8383-936b5403b5e5-env-overrides\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.322099 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.321444 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-log-socket\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.322099 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.321464 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fb81bf64-9170-43bc-a130-f32b09124d60-multus-conf-dir\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.322099 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.321560 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7812fd91-413a-4b89-aecd-dcfe89d98dd7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws"
Apr 18 02:45:43.322099 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.321588 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/79a997b9-1bc8-4876-8383-936b5403b5e5-host-cni-netd\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.322099 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.322080 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79a997b9-1bc8-4876-8383-936b5403b5e5-env-overrides\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.322099 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.322093 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj"
Apr 18 02:45:43.323932 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.323900 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/503428d7-6339-49da-8858-3fa5446cf42d-tmp\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.324370 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.324296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/503428d7-6339-49da-8858-3fa5446cf42d-etc-tuned\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.324370 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.324312 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/681e4af6-1800-4683-b967-ed0bb5c1f635-agent-certs\") pod \"konnectivity-agent-c22bt\" (UID: \"681e4af6-1800-4683-b967-ed0bb5c1f635\") " pod="kube-system/konnectivity-agent-c22bt"
Apr 18 02:45:43.324501 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.324370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79a997b9-1bc8-4876-8383-936b5403b5e5-ovn-node-metrics-cert\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.327133 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:43.326862 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 18 02:45:43.327133 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:43.326884 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 18 02:45:43.327133 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:43.326899 2575 projected.go:194] Error preparing data for projected volume kube-api-access-lhdj4 for pod openshift-network-diagnostics/network-check-target-kd89b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:45:43.327133 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:43.326972 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4 podName:bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce nodeName:}" failed. No retries permitted until 2026-04-18 02:45:43.826951593 +0000 UTC m=+3.333413852 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-lhdj4" (UniqueName: "kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4") pod "network-check-target-kd89b" (UID: "bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:45:43.328294 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.328229 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5888\" (UniqueName: \"kubernetes.io/projected/06d32427-f8ec-4151-bb49-8eaef8308f79-kube-api-access-s5888\") pod \"network-metrics-daemon-l8m94\" (UID: \"06d32427-f8ec-4151-bb49-8eaef8308f79\") " pod="openshift-multus/network-metrics-daemon-l8m94"
Apr 18 02:45:43.329413 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.329370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxxdr\" (UniqueName: \"kubernetes.io/projected/fb81bf64-9170-43bc-a130-f32b09124d60-kube-api-access-jxxdr\") pod \"multus-ckptk\" (UID: \"fb81bf64-9170-43bc-a130-f32b09124d60\") " pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.329790 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.329771 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtzld\" (UniqueName: \"kubernetes.io/projected/34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b-kube-api-access-rtzld\") pod \"multus-additional-cni-plugins-rcpsj\" (UID: \"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b\") " pod="openshift-multus/multus-additional-cni-plugins-rcpsj"
Apr 18 02:45:43.330165 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.330115 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl5v5\" (UniqueName: \"kubernetes.io/projected/f3505841-3e2c-4c80-b017-3f4e6f96512d-kube-api-access-xl5v5\") pod \"iptables-alerter-lb7px\" (UID: \"f3505841-3e2c-4c80-b017-3f4e6f96512d\") " pod="openshift-network-operator/iptables-alerter-lb7px"
Apr 18 02:45:43.330575 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.330549 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95q5g\" (UniqueName: \"kubernetes.io/projected/a03893d3-b1b4-4eac-93ff-24aa692c02ec-kube-api-access-95q5g\") pod \"node-ca-kbc68\" (UID: \"a03893d3-b1b4-4eac-93ff-24aa692c02ec\") " pod="openshift-image-registry/node-ca-kbc68"
Apr 18 02:45:43.330788 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.330745 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjxml\" (UniqueName: \"kubernetes.io/projected/503428d7-6339-49da-8858-3fa5446cf42d-kube-api-access-gjxml\") pod \"tuned-wv9n8\" (UID: \"503428d7-6339-49da-8858-3fa5446cf42d\") " pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.331529 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.331508 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqqnz\" (UniqueName: \"kubernetes.io/projected/7812fd91-413a-4b89-aecd-dcfe89d98dd7-kube-api-access-zqqnz\") pod \"aws-ebs-csi-driver-node-49jws\" (UID: \"7812fd91-413a-4b89-aecd-dcfe89d98dd7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws"
Apr 18 02:45:43.332009 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.331936 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl46l\" (UniqueName: \"kubernetes.io/projected/79a997b9-1bc8-4876-8383-936b5403b5e5-kube-api-access-kl46l\") pod \"ovnkube-node-lsz2j\" (UID: \"79a997b9-1bc8-4876-8383-936b5403b5e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.379455 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.379426 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 18 02:45:43.427645 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.427603 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kbc68"
Apr 18 02:45:43.433254 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.433232 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ckptk"
Apr 18 02:45:43.444876 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.444855 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j"
Apr 18 02:45:43.451514 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.451485 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws"
Apr 18 02:45:43.458113 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.458093 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wv9n8"
Apr 18 02:45:43.465686 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.465668 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rcpsj"
Apr 18 02:45:43.474222 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.474203 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lb7px"
Apr 18 02:45:43.479729 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.479711 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-c22bt"
Apr 18 02:45:43.731454 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:43.731424 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod681e4af6_1800_4683_b967_ed0bb5c1f635.slice/crio-28035be5c290ea7cd66dfc5b4c8e93fb570b6e964711070cea5c8f71ea499619 WatchSource:0}: Error finding container 28035be5c290ea7cd66dfc5b4c8e93fb570b6e964711070cea5c8f71ea499619: Status 404 returned error can't find the container with id 28035be5c290ea7cd66dfc5b4c8e93fb570b6e964711070cea5c8f71ea499619
Apr 18 02:45:43.732500 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:43.732467 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3505841_3e2c_4c80_b017_3f4e6f96512d.slice/crio-3270b1484098ace8ed288eeadead08c7265e2ca6700da990b5cbde8bc7a8585a WatchSource:0}: Error finding container 3270b1484098ace8ed288eeadead08c7265e2ca6700da990b5cbde8bc7a8585a: Status 404 returned error can't find the container with id 3270b1484098ace8ed288eeadead08c7265e2ca6700da990b5cbde8bc7a8585a
Apr 18 02:45:43.734038 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:43.733891 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb81bf64_9170_43bc_a130_f32b09124d60.slice/crio-b269e12e09c11f781241f4cc36c73cda555ddb9dca3d3830bdb172090316d660 WatchSource:0}: Error finding container b269e12e09c11f781241f4cc36c73cda555ddb9dca3d3830bdb172090316d660: Status 404 returned error can't find the container with id b269e12e09c11f781241f4cc36c73cda555ddb9dca3d3830bdb172090316d660
Apr 18 02:45:43.757182 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:43.757158 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod503428d7_6339_49da_8858_3fa5446cf42d.slice/crio-642458d589649b18b70c2928cce6a59c70f87532f65901c13120119f37afec6a WatchSource:0}: Error finding container 642458d589649b18b70c2928cce6a59c70f87532f65901c13120119f37afec6a: Status 404 returned error can't find the container with id 642458d589649b18b70c2928cce6a59c70f87532f65901c13120119f37afec6a
Apr 18 02:45:43.758045 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:43.758019 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79a997b9_1bc8_4876_8383_936b5403b5e5.slice/crio-2e9d1acb0583883a5386021393b30910292e95aec067da8806a5fe4e987e9a68 WatchSource:0}: Error finding container 2e9d1acb0583883a5386021393b30910292e95aec067da8806a5fe4e987e9a68: Status 404 returned error can't find the container with id 2e9d1acb0583883a5386021393b30910292e95aec067da8806a5fe4e987e9a68
Apr 18 02:45:43.759687 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:43.759658 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34e9beab_b2dc_42ba_a4ca_f08fa85d4f7b.slice/crio-1f3334a6da4f46b7048551d74eeed7bfbfa917e486c99d45f82470e092c6d6d5 WatchSource:0}: Error finding container 1f3334a6da4f46b7048551d74eeed7bfbfa917e486c99d45f82470e092c6d6d5: Status 404 returned error can't find the container with id 1f3334a6da4f46b7048551d74eeed7bfbfa917e486c99d45f82470e092c6d6d5
Apr 18 02:45:43.760449 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:43.760382 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7812fd91_413a_4b89_aecd_dcfe89d98dd7.slice/crio-c72def6699b38bfa973476aa682177d47da170c78f8a2cfc40c42c58e8312015 WatchSource:0}: Error finding container c72def6699b38bfa973476aa682177d47da170c78f8a2cfc40c42c58e8312015: Status 404 returned error can't find the container with id c72def6699b38bfa973476aa682177d47da170c78f8a2cfc40c42c58e8312015
Apr 18 02:45:43.761189 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:45:43.761136 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda03893d3_b1b4_4eac_93ff_24aa692c02ec.slice/crio-f0f3db93f6a0891e050529a02e3b94f239c1c2d4a2f57b61f1336fbb79515563 WatchSource:0}: Error finding container f0f3db93f6a0891e050529a02e3b94f239c1c2d4a2f57b61f1336fbb79515563: Status 404 returned error can't find the container with id f0f3db93f6a0891e050529a02e3b94f239c1c2d4a2f57b61f1336fbb79515563
Apr 18 02:45:43.824914 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.824890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs\") pod \"network-metrics-daemon-l8m94\" (UID: \"06d32427-f8ec-4151-bb49-8eaef8308f79\") " pod="openshift-multus/network-metrics-daemon-l8m94"
Apr 18 02:45:43.825014 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:43.824990 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:45:43.825062 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:43.825033 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs podName:06d32427-f8ec-4151-bb49-8eaef8308f79 nodeName:}" failed. No retries permitted until 2026-04-18 02:45:44.825021975 +0000 UTC m=+4.331484226 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs") pod "network-metrics-daemon-l8m94" (UID: "06d32427-f8ec-4151-bb49-8eaef8308f79") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:45:43.925804 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:43.925774 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhdj4\" (UniqueName: \"kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4\") pod \"network-check-target-kd89b\" (UID: \"bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce\") " pod="openshift-network-diagnostics/network-check-target-kd89b"
Apr 18 02:45:43.925971 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:43.925932 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 18 02:45:43.925971 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:43.925954 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 18 02:45:43.925971 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:43.925963 2575 projected.go:194] Error preparing data for projected volume kube-api-access-lhdj4 for pod openshift-network-diagnostics/network-check-target-kd89b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:45:43.926075 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:43.926014 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4 podName:bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce nodeName:}" failed. No retries permitted until 2026-04-18 02:45:44.925997539 +0000 UTC m=+4.432459802 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-lhdj4" (UniqueName: "kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4") pod "network-check-target-kd89b" (UID: "bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:45:44.142824 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:44.142714 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-17 02:40:42 +0000 UTC" deadline="2027-12-31 17:50:32.229123068 +0000 UTC"
Apr 18 02:45:44.142824 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:44.142754 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14943h4m48.086373179s"
Apr 18 02:45:44.180763 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:44.180710 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-103.ec2.internal" event={"ID":"881cc9f87b49496f030449959b9b3b21","Type":"ContainerStarted","Data":"ca311f45f47c07cb0e2933ae5343e3d172bb32406b8e4a16ced8188063934beb"}
Apr 18 02:45:44.182300 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:44.182274 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kbc68" event={"ID":"a03893d3-b1b4-4eac-93ff-24aa692c02ec","Type":"ContainerStarted","Data":"f0f3db93f6a0891e050529a02e3b94f239c1c2d4a2f57b61f1336fbb79515563"}
Apr 18 02:45:44.183880 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:44.183855 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcpsj" event={"ID":"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b","Type":"ContainerStarted","Data":"1f3334a6da4f46b7048551d74eeed7bfbfa917e486c99d45f82470e092c6d6d5"}
Apr 18 02:45:44.185331 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:44.185309 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" event={"ID":"503428d7-6339-49da-8858-3fa5446cf42d","Type":"ContainerStarted","Data":"642458d589649b18b70c2928cce6a59c70f87532f65901c13120119f37afec6a"}
Apr 18 02:45:44.186947 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:44.186908 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ckptk" event={"ID":"fb81bf64-9170-43bc-a130-f32b09124d60","Type":"ContainerStarted","Data":"b269e12e09c11f781241f4cc36c73cda555ddb9dca3d3830bdb172090316d660"}
Apr 18 02:45:44.188804 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:44.188774 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lb7px" event={"ID":"f3505841-3e2c-4c80-b017-3f4e6f96512d","Type":"ContainerStarted","Data":"3270b1484098ace8ed288eeadead08c7265e2ca6700da990b5cbde8bc7a8585a"}
Apr 18 02:45:44.190226 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:44.190201 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-c22bt" event={"ID":"681e4af6-1800-4683-b967-ed0bb5c1f635","Type":"ContainerStarted","Data":"28035be5c290ea7cd66dfc5b4c8e93fb570b6e964711070cea5c8f71ea499619"}
Apr 18 02:45:44.191384 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:44.191356 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws" event={"ID":"7812fd91-413a-4b89-aecd-dcfe89d98dd7","Type":"ContainerStarted","Data":"c72def6699b38bfa973476aa682177d47da170c78f8a2cfc40c42c58e8312015"}
Apr 18 02:45:44.193597 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:44.193574 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" event={"ID":"79a997b9-1bc8-4876-8383-936b5403b5e5","Type":"ContainerStarted","Data":"2e9d1acb0583883a5386021393b30910292e95aec067da8806a5fe4e987e9a68"}
Apr 18 02:45:44.834163 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:44.834075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs\") pod \"network-metrics-daemon-l8m94\" (UID: \"06d32427-f8ec-4151-bb49-8eaef8308f79\") " pod="openshift-multus/network-metrics-daemon-l8m94"
Apr 18 02:45:44.834337 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:44.834221 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:45:44.834337 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:44.834284 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs podName:06d32427-f8ec-4151-bb49-8eaef8308f79 nodeName:}" failed. No retries permitted until 2026-04-18 02:45:46.834264569 +0000 UTC m=+6.340726828 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs") pod "network-metrics-daemon-l8m94" (UID: "06d32427-f8ec-4151-bb49-8eaef8308f79") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:45:44.934877 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:44.934840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhdj4\" (UniqueName: \"kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4\") pod \"network-check-target-kd89b\" (UID: \"bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce\") " pod="openshift-network-diagnostics/network-check-target-kd89b"
Apr 18 02:45:44.935047 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:44.934994 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 18 02:45:44.935047 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:44.935009 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 18 02:45:44.935047 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:44.935020 2575 projected.go:194] Error preparing data for projected volume kube-api-access-lhdj4 for pod openshift-network-diagnostics/network-check-target-kd89b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:45:44.935259 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:44.935071 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4 podName:bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce nodeName:}" failed. No retries permitted until 2026-04-18 02:45:46.935050328 +0000 UTC m=+6.441512585 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-lhdj4" (UniqueName: "kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4") pod "network-check-target-kd89b" (UID: "bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:45:45.172375 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:45.171800 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94"
Apr 18 02:45:45.172375 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:45.171845 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b"
Apr 18 02:45:45.172375 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:45.171946 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8m94" podUID="06d32427-f8ec-4151-bb49-8eaef8308f79"
Apr 18 02:45:45.172375 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:45.172065 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd89b" podUID="bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce"
Apr 18 02:45:45.204474 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:45.204434 2575 generic.go:358] "Generic (PLEG): container finished" podID="e2bcc8811e31056862bf836a7191d135" containerID="42bc25d5ed1710051cdafd7043a3459c706bd55c2e124839abd5fb40d384e99f" exitCode=0
Apr 18 02:45:45.205452 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:45.205421 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal" event={"ID":"e2bcc8811e31056862bf836a7191d135","Type":"ContainerDied","Data":"42bc25d5ed1710051cdafd7043a3459c706bd55c2e124839abd5fb40d384e99f"}
Apr 18 02:45:45.219023 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:45.217814 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-103.ec2.internal" podStartSLOduration=3.217796214 podStartE2EDuration="3.217796214s" podCreationTimestamp="2026-04-18 02:45:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:45:44.193211232 +0000 UTC m=+3.699673505" watchObservedRunningTime="2026-04-18 02:45:45.217796214 +0000 UTC m=+4.724258489"
Apr 18 02:45:46.209607 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:46.209560 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal" event={"ID":"e2bcc8811e31056862bf836a7191d135","Type":"ContainerStarted","Data":"84606d2186c24ec47b4d97c2b63b0e951b09519a46b09b83e9ffa920c8253cfe"}
Apr 18 02:45:46.852659 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:46.852110 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName:
\"kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs\") pod \"network-metrics-daemon-l8m94\" (UID: \"06d32427-f8ec-4151-bb49-8eaef8308f79\") " pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:45:46.852659 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:46.852249 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:45:46.852659 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:46.852312 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs podName:06d32427-f8ec-4151-bb49-8eaef8308f79 nodeName:}" failed. No retries permitted until 2026-04-18 02:45:50.852291807 +0000 UTC m=+10.358754061 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs") pod "network-metrics-daemon-l8m94" (UID: "06d32427-f8ec-4151-bb49-8eaef8308f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:45:46.953015 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:46.952976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhdj4\" (UniqueName: \"kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4\") pod \"network-check-target-kd89b\" (UID: \"bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce\") " pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:45:46.953202 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:46.953186 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 18 02:45:46.953276 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:46.953209 2575 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 18 02:45:46.953276 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:46.953223 2575 projected.go:194] Error preparing data for projected volume kube-api-access-lhdj4 for pod openshift-network-diagnostics/network-check-target-kd89b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 18 02:45:46.953390 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:46.953286 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4 podName:bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce nodeName:}" failed. No retries permitted until 2026-04-18 02:45:50.953265607 +0000 UTC m=+10.459727866 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-lhdj4" (UniqueName: "kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4") pod "network-check-target-kd89b" (UID: "bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 18 02:45:47.175024 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:47.174946 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:45:47.175170 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:47.175056 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kd89b" podUID="bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce" Apr 18 02:45:47.175503 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:47.175480 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:45:47.175687 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:47.175583 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8m94" podUID="06d32427-f8ec-4151-bb49-8eaef8308f79" Apr 18 02:45:49.174884 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:49.174850 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:45:49.175378 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:49.174975 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd89b" podUID="bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce" Apr 18 02:45:49.175378 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:49.175047 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:45:49.175378 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:49.175137 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8m94" podUID="06d32427-f8ec-4151-bb49-8eaef8308f79" Apr 18 02:45:50.883602 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:50.883562 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs\") pod \"network-metrics-daemon-l8m94\" (UID: \"06d32427-f8ec-4151-bb49-8eaef8308f79\") " pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:45:50.884104 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:50.883760 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:45:50.884104 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:50.883837 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs podName:06d32427-f8ec-4151-bb49-8eaef8308f79 nodeName:}" failed. No retries permitted until 2026-04-18 02:45:58.883816283 +0000 UTC m=+18.390278550 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs") pod "network-metrics-daemon-l8m94" (UID: "06d32427-f8ec-4151-bb49-8eaef8308f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:45:50.984436 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:50.984388 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhdj4\" (UniqueName: \"kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4\") pod \"network-check-target-kd89b\" (UID: \"bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce\") " pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:45:50.984641 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:50.984576 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 18 02:45:50.984641 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:50.984598 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 18 02:45:50.984641 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:50.984609 2575 projected.go:194] Error preparing data for projected volume kube-api-access-lhdj4 for pod openshift-network-diagnostics/network-check-target-kd89b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 18 02:45:50.984778 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:50.984685 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4 podName:bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce nodeName:}" failed. 
No retries permitted until 2026-04-18 02:45:58.984665743 +0000 UTC m=+18.491127994 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-lhdj4" (UniqueName: "kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4") pod "network-check-target-kd89b" (UID: "bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 18 02:45:51.172284 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:51.172212 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:45:51.172418 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:51.172331 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8m94" podUID="06d32427-f8ec-4151-bb49-8eaef8308f79" Apr 18 02:45:51.172987 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:51.172808 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:45:51.172987 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:51.172940 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kd89b" podUID="bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce" Apr 18 02:45:53.171675 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:53.171623 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:45:53.172132 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:53.171783 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8m94" podUID="06d32427-f8ec-4151-bb49-8eaef8308f79" Apr 18 02:45:53.172132 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:53.171924 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:45:53.172132 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:53.171996 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd89b" podUID="bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce" Apr 18 02:45:55.171178 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:55.171139 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:45:55.171614 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:55.171146 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:45:55.171614 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:55.171267 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd89b" podUID="bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce" Apr 18 02:45:55.171614 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:55.171369 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8m94" podUID="06d32427-f8ec-4151-bb49-8eaef8308f79" Apr 18 02:45:57.171389 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:57.171131 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:45:57.171830 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:57.171180 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:45:57.171830 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:57.171506 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l8m94" podUID="06d32427-f8ec-4151-bb49-8eaef8308f79" Apr 18 02:45:57.171830 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:57.171646 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd89b" podUID="bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce" Apr 18 02:45:57.552914 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:57.552849 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-103.ec2.internal" podStartSLOduration=15.552832972000001 podStartE2EDuration="15.552832972s" podCreationTimestamp="2026-04-18 02:45:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:45:46.222832575 +0000 UTC m=+5.729294850" watchObservedRunningTime="2026-04-18 02:45:57.552832972 +0000 UTC m=+17.059295237" Apr 18 02:45:57.553597 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:57.553569 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-66knh"] Apr 18 02:45:57.578174 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:57.578139 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:45:57.578332 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:57.578228 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-66knh" podUID="3797aadb-3c8b-4eda-9fbe-dce172b0710e" Apr 18 02:45:57.628993 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:57.628954 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret\") pod \"global-pull-secret-syncer-66knh\" (UID: \"3797aadb-3c8b-4eda-9fbe-dce172b0710e\") " pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:45:57.629166 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:57.629020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3797aadb-3c8b-4eda-9fbe-dce172b0710e-kubelet-config\") pod \"global-pull-secret-syncer-66knh\" (UID: \"3797aadb-3c8b-4eda-9fbe-dce172b0710e\") " pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:45:57.629166 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:57.629044 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3797aadb-3c8b-4eda-9fbe-dce172b0710e-dbus\") pod \"global-pull-secret-syncer-66knh\" (UID: \"3797aadb-3c8b-4eda-9fbe-dce172b0710e\") " pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:45:57.730367 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:57.730337 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret\") pod \"global-pull-secret-syncer-66knh\" (UID: \"3797aadb-3c8b-4eda-9fbe-dce172b0710e\") " pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:45:57.730554 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:57.730378 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" 
(UniqueName: \"kubernetes.io/host-path/3797aadb-3c8b-4eda-9fbe-dce172b0710e-kubelet-config\") pod \"global-pull-secret-syncer-66knh\" (UID: \"3797aadb-3c8b-4eda-9fbe-dce172b0710e\") " pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:45:57.730554 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:57.730394 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3797aadb-3c8b-4eda-9fbe-dce172b0710e-dbus\") pod \"global-pull-secret-syncer-66knh\" (UID: \"3797aadb-3c8b-4eda-9fbe-dce172b0710e\") " pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:45:57.730554 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:57.730479 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3797aadb-3c8b-4eda-9fbe-dce172b0710e-kubelet-config\") pod \"global-pull-secret-syncer-66knh\" (UID: \"3797aadb-3c8b-4eda-9fbe-dce172b0710e\") " pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:45:57.730554 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:57.730517 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 18 02:45:57.730775 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:57.730597 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3797aadb-3c8b-4eda-9fbe-dce172b0710e-dbus\") pod \"global-pull-secret-syncer-66knh\" (UID: \"3797aadb-3c8b-4eda-9fbe-dce172b0710e\") " pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:45:57.730775 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:57.730595 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret podName:3797aadb-3c8b-4eda-9fbe-dce172b0710e nodeName:}" failed. 
No retries permitted until 2026-04-18 02:45:58.230574521 +0000 UTC m=+17.737036773 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret") pod "global-pull-secret-syncer-66knh" (UID: "3797aadb-3c8b-4eda-9fbe-dce172b0710e") : object "kube-system"/"original-pull-secret" not registered Apr 18 02:45:58.233734 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:58.233620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret\") pod \"global-pull-secret-syncer-66knh\" (UID: \"3797aadb-3c8b-4eda-9fbe-dce172b0710e\") " pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:45:58.234158 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:58.233764 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 18 02:45:58.234158 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:58.233836 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret podName:3797aadb-3c8b-4eda-9fbe-dce172b0710e nodeName:}" failed. No retries permitted until 2026-04-18 02:45:59.23381427 +0000 UTC m=+18.740276534 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret") pod "global-pull-secret-syncer-66knh" (UID: "3797aadb-3c8b-4eda-9fbe-dce172b0710e") : object "kube-system"/"original-pull-secret" not registered Apr 18 02:45:58.940647 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:58.940602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs\") pod \"network-metrics-daemon-l8m94\" (UID: \"06d32427-f8ec-4151-bb49-8eaef8308f79\") " pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:45:58.940823 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:58.940766 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:45:58.940879 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:58.940847 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs podName:06d32427-f8ec-4151-bb49-8eaef8308f79 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:14.940822876 +0000 UTC m=+34.447285140 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs") pod "network-metrics-daemon-l8m94" (UID: "06d32427-f8ec-4151-bb49-8eaef8308f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:45:59.041420 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:59.041384 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhdj4\" (UniqueName: \"kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4\") pod \"network-check-target-kd89b\" (UID: \"bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce\") " pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:45:59.041574 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:59.041557 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 18 02:45:59.041611 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:59.041578 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 18 02:45:59.041611 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:59.041589 2575 projected.go:194] Error preparing data for projected volume kube-api-access-lhdj4 for pod openshift-network-diagnostics/network-check-target-kd89b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 18 02:45:59.041721 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:59.041667 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4 podName:bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce nodeName:}" failed. 
No retries permitted until 2026-04-18 02:46:15.041647853 +0000 UTC m=+34.548110126 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-lhdj4" (UniqueName: "kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4") pod "network-check-target-kd89b" (UID: "bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 18 02:45:59.171236 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:59.171196 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:45:59.171236 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:59.171227 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:45:59.171497 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:59.171196 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:45:59.171497 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:59.171334 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd89b" podUID="bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce" Apr 18 02:45:59.171497 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:59.171420 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8m94" podUID="06d32427-f8ec-4151-bb49-8eaef8308f79" Apr 18 02:45:59.171676 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:59.171553 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-66knh" podUID="3797aadb-3c8b-4eda-9fbe-dce172b0710e" Apr 18 02:45:59.242229 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:45:59.242139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret\") pod \"global-pull-secret-syncer-66knh\" (UID: \"3797aadb-3c8b-4eda-9fbe-dce172b0710e\") " pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:45:59.242615 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:59.242298 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 18 02:45:59.242615 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:45:59.242383 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret podName:3797aadb-3c8b-4eda-9fbe-dce172b0710e nodeName:}" failed. No retries permitted until 2026-04-18 02:46:01.242359705 +0000 UTC m=+20.748821967 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret") pod "global-pull-secret-syncer-66knh" (UID: "3797aadb-3c8b-4eda-9fbe-dce172b0710e") : object "kube-system"/"original-pull-secret" not registered Apr 18 02:46:01.172135 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.171865 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:46:01.173020 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.171934 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:46:01.173020 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:01.172170 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-66knh" podUID="3797aadb-3c8b-4eda-9fbe-dce172b0710e" Apr 18 02:46:01.173020 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.171972 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:46:01.173020 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:01.172266 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l8m94" podUID="06d32427-f8ec-4151-bb49-8eaef8308f79" Apr 18 02:46:01.173020 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:01.172357 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd89b" podUID="bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce" Apr 18 02:46:01.237136 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.237088 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" event={"ID":"79a997b9-1bc8-4876-8383-936b5403b5e5","Type":"ContainerStarted","Data":"26c2c0807f254ead2300e6dd104b93b1e6600305b6abbe8ad053d27e5bf10235"} Apr 18 02:46:01.237136 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.237140 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" event={"ID":"79a997b9-1bc8-4876-8383-936b5403b5e5","Type":"ContainerStarted","Data":"9dd0f1bf38045707a083c3b7071fe6e59fc895c293435e41751c950982d8e830"} Apr 18 02:46:01.237372 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.237154 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" event={"ID":"79a997b9-1bc8-4876-8383-936b5403b5e5","Type":"ContainerStarted","Data":"be6326035f828f955dc3c13c1df02376bdc53caa3f6a82a25cc71aeada89cb16"} Apr 18 02:46:01.237372 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.237165 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" event={"ID":"79a997b9-1bc8-4876-8383-936b5403b5e5","Type":"ContainerStarted","Data":"8497909fddd497ccb7a10c1123b8a0c11c7e2a6ad564f89c6c11918687ecc332"} Apr 18 02:46:01.237372 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:46:01.237173 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" event={"ID":"79a997b9-1bc8-4876-8383-936b5403b5e5","Type":"ContainerStarted","Data":"71f63d31008ca6df3a70f44283e261831992187699a7a729a89fc51c67037dbc"} Apr 18 02:46:01.237372 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.237181 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" event={"ID":"79a997b9-1bc8-4876-8383-936b5403b5e5","Type":"ContainerStarted","Data":"be142350acaadb0c69b3fc84c709e17ec9abca0abc5a82f30bf7a45c459aa6a7"} Apr 18 02:46:01.238478 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.238445 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kbc68" event={"ID":"a03893d3-b1b4-4eac-93ff-24aa692c02ec","Type":"ContainerStarted","Data":"9ae31ac2a8368c09991ca029cb52c78cc97f6e23c0edda11c0d22f376e76e4fe"} Apr 18 02:46:01.239949 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.239917 2575 generic.go:358] "Generic (PLEG): container finished" podID="34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b" containerID="77ca75a50c8472d069f896b3cddf150f13ebe4ae074740f8ac48ba0b6e8481e4" exitCode=0 Apr 18 02:46:01.240061 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.239997 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcpsj" event={"ID":"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b","Type":"ContainerDied","Data":"77ca75a50c8472d069f896b3cddf150f13ebe4ae074740f8ac48ba0b6e8481e4"} Apr 18 02:46:01.241488 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.241408 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" event={"ID":"503428d7-6339-49da-8858-3fa5446cf42d","Type":"ContainerStarted","Data":"601df1b7338b9cdd15ba489844c68c935ade0c70ddc4f377b301251e59531c2e"} Apr 18 02:46:01.243155 ip-10-0-140-103 kubenswrapper[2575]: 
I0418 02:46:01.243121 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ckptk" event={"ID":"fb81bf64-9170-43bc-a130-f32b09124d60","Type":"ContainerStarted","Data":"bebbaaeae8c895a9458461afc4d9d6d4db96f2eac84cd40482e57226559163c6"} Apr 18 02:46:01.244595 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.244575 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-c22bt" event={"ID":"681e4af6-1800-4683-b967-ed0bb5c1f635","Type":"ContainerStarted","Data":"b0ae7fc040c42e3e5b74a39eabd54ef9fa12049b5fef25412081c1c03282eaea"} Apr 18 02:46:01.245904 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.245882 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws" event={"ID":"7812fd91-413a-4b89-aecd-dcfe89d98dd7","Type":"ContainerStarted","Data":"4beaf73ab887e3bedc2c492bf30649136b6a43bbfa1eaa163a5308938d827760"} Apr 18 02:46:01.251026 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.250982 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kbc68" podStartSLOduration=3.6677931729999997 podStartE2EDuration="20.250969341s" podCreationTimestamp="2026-04-18 02:45:41 +0000 UTC" firstStartedPulling="2026-04-18 02:45:43.763364607 +0000 UTC m=+3.269826864" lastFinishedPulling="2026-04-18 02:46:00.34654077 +0000 UTC m=+19.853003032" observedRunningTime="2026-04-18 02:46:01.250788135 +0000 UTC m=+20.757250413" watchObservedRunningTime="2026-04-18 02:46:01.250969341 +0000 UTC m=+20.757431653" Apr 18 02:46:01.255353 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.255330 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret\") pod \"global-pull-secret-syncer-66knh\" (UID: \"3797aadb-3c8b-4eda-9fbe-dce172b0710e\") " 
pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:46:01.255456 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:01.255438 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 18 02:46:01.255580 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:01.255498 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret podName:3797aadb-3c8b-4eda-9fbe-dce172b0710e nodeName:}" failed. No retries permitted until 2026-04-18 02:46:05.255482488 +0000 UTC m=+24.761944740 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret") pod "global-pull-secret-syncer-66knh" (UID: "3797aadb-3c8b-4eda-9fbe-dce172b0710e") : object "kube-system"/"original-pull-secret" not registered Apr 18 02:46:01.261046 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.261007 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-c22bt" podStartSLOduration=3.649062566 podStartE2EDuration="20.260992258s" podCreationTimestamp="2026-04-18 02:45:41 +0000 UTC" firstStartedPulling="2026-04-18 02:45:43.734611697 +0000 UTC m=+3.241073951" lastFinishedPulling="2026-04-18 02:46:00.346541378 +0000 UTC m=+19.853003643" observedRunningTime="2026-04-18 02:46:01.260539939 +0000 UTC m=+20.767002215" watchObservedRunningTime="2026-04-18 02:46:01.260992258 +0000 UTC m=+20.767454533" Apr 18 02:46:01.275379 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.275330 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ckptk" podStartSLOduration=3.522010393 podStartE2EDuration="20.275317259s" podCreationTimestamp="2026-04-18 02:45:41 +0000 UTC" firstStartedPulling="2026-04-18 02:45:43.755446056 +0000 UTC m=+3.261908312" 
lastFinishedPulling="2026-04-18 02:46:00.50875291 +0000 UTC m=+20.015215178" observedRunningTime="2026-04-18 02:46:01.275112835 +0000 UTC m=+20.781575122" watchObservedRunningTime="2026-04-18 02:46:01.275317259 +0000 UTC m=+20.781779536" Apr 18 02:46:01.286818 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.286758 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-wv9n8" podStartSLOduration=3.699062985 podStartE2EDuration="20.286743761s" podCreationTimestamp="2026-04-18 02:45:41 +0000 UTC" firstStartedPulling="2026-04-18 02:45:43.758872063 +0000 UTC m=+3.265334329" lastFinishedPulling="2026-04-18 02:46:00.346552853 +0000 UTC m=+19.853015105" observedRunningTime="2026-04-18 02:46:01.286607606 +0000 UTC m=+20.793069880" watchObservedRunningTime="2026-04-18 02:46:01.286743761 +0000 UTC m=+20.793206058" Apr 18 02:46:01.512756 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:01.512733 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 18 02:46:02.172398 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:02.172260 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-18T02:46:01.512749529Z","UUID":"9d85808c-9025-48f1-8947-a79d60373e19","Handler":null,"Name":"","Endpoint":""} Apr 18 02:46:02.175892 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:02.175859 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 18 02:46:02.176039 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:02.175899 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 18 
02:46:02.249864 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:02.249815 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lb7px" event={"ID":"f3505841-3e2c-4c80-b017-3f4e6f96512d","Type":"ContainerStarted","Data":"6a7376c1c47e40dc72c2fbe21f8a7996a195739ae96310d01807f1bb66cf7ff3"} Apr 18 02:46:02.252152 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:02.252102 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws" event={"ID":"7812fd91-413a-4b89-aecd-dcfe89d98dd7","Type":"ContainerStarted","Data":"a68d8cfbdd57d3adef7941c49143eac7d56e340df567258367e5f21bb0175356"} Apr 18 02:46:02.262007 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:02.261946 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-lb7px" podStartSLOduration=4.649952053 podStartE2EDuration="21.26192808s" podCreationTimestamp="2026-04-18 02:45:41 +0000 UTC" firstStartedPulling="2026-04-18 02:45:43.734693425 +0000 UTC m=+3.241155681" lastFinishedPulling="2026-04-18 02:46:00.346669443 +0000 UTC m=+19.853131708" observedRunningTime="2026-04-18 02:46:02.260939802 +0000 UTC m=+21.767402076" watchObservedRunningTime="2026-04-18 02:46:02.26192808 +0000 UTC m=+21.768390355" Apr 18 02:46:03.171333 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:03.171295 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:46:03.171333 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:03.171321 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:46:03.171615 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:03.171295 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:46:03.171615 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:03.171414 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd89b" podUID="bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce" Apr 18 02:46:03.171615 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:03.171473 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-66knh" podUID="3797aadb-3c8b-4eda-9fbe-dce172b0710e" Apr 18 02:46:03.171615 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:03.171546 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l8m94" podUID="06d32427-f8ec-4151-bb49-8eaef8308f79" Apr 18 02:46:03.256680 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:03.256571 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws" event={"ID":"7812fd91-413a-4b89-aecd-dcfe89d98dd7","Type":"ContainerStarted","Data":"ad087afe170f8338ce34bb1e2019696ecf710dfbe034b06e14ab60841daa896b"} Apr 18 02:46:03.259967 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:03.259934 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" event={"ID":"79a997b9-1bc8-4876-8383-936b5403b5e5","Type":"ContainerStarted","Data":"2e62ae348804ab1873c5c1435d21fe572c6ccef339bfc19ca777f03b3c657636"} Apr 18 02:46:03.281477 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:03.281415 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-49jws" podStartSLOduration=3.624817753 podStartE2EDuration="22.281394417s" podCreationTimestamp="2026-04-18 02:45:41 +0000 UTC" firstStartedPulling="2026-04-18 02:45:43.763448487 +0000 UTC m=+3.269910742" lastFinishedPulling="2026-04-18 02:46:02.420025151 +0000 UTC m=+21.926487406" observedRunningTime="2026-04-18 02:46:03.281296696 +0000 UTC m=+22.787758973" watchObservedRunningTime="2026-04-18 02:46:03.281394417 +0000 UTC m=+22.787856692" Apr 18 02:46:04.875152 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:04.875119 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5nsl5"] Apr 18 02:46:04.935402 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:04.935362 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5nsl5" Apr 18 02:46:04.937995 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:04.937969 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-g2fpl\"" Apr 18 02:46:04.938152 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:04.938023 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 18 02:46:04.938152 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:04.938024 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 18 02:46:05.084615 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.084560 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/36d98916-52f7-469a-a004-8f19f9739057-hosts-file\") pod \"node-resolver-5nsl5\" (UID: \"36d98916-52f7-469a-a004-8f19f9739057\") " pod="openshift-dns/node-resolver-5nsl5" Apr 18 02:46:05.084802 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.084689 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22d69\" (UniqueName: \"kubernetes.io/projected/36d98916-52f7-469a-a004-8f19f9739057-kube-api-access-22d69\") pod \"node-resolver-5nsl5\" (UID: \"36d98916-52f7-469a-a004-8f19f9739057\") " pod="openshift-dns/node-resolver-5nsl5" Apr 18 02:46:05.084802 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.084775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/36d98916-52f7-469a-a004-8f19f9739057-tmp-dir\") pod \"node-resolver-5nsl5\" (UID: \"36d98916-52f7-469a-a004-8f19f9739057\") " pod="openshift-dns/node-resolver-5nsl5" Apr 18 02:46:05.171518 ip-10-0-140-103 kubenswrapper[2575]: I0418 
02:46:05.171489 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:46:05.171687 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.171487 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:46:05.171687 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:05.171591 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd89b" podUID="bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce" Apr 18 02:46:05.171802 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:05.171701 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-66knh" podUID="3797aadb-3c8b-4eda-9fbe-dce172b0710e" Apr 18 02:46:05.171802 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.171490 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:46:05.171892 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:05.171820 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l8m94" podUID="06d32427-f8ec-4151-bb49-8eaef8308f79" Apr 18 02:46:05.181082 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.181055 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-c22bt" Apr 18 02:46:05.181992 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.181972 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-c22bt" Apr 18 02:46:05.186052 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.186034 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/36d98916-52f7-469a-a004-8f19f9739057-hosts-file\") pod \"node-resolver-5nsl5\" (UID: \"36d98916-52f7-469a-a004-8f19f9739057\") " pod="openshift-dns/node-resolver-5nsl5" Apr 18 02:46:05.186108 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.186065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22d69\" (UniqueName: \"kubernetes.io/projected/36d98916-52f7-469a-a004-8f19f9739057-kube-api-access-22d69\") pod \"node-resolver-5nsl5\" (UID: \"36d98916-52f7-469a-a004-8f19f9739057\") " pod="openshift-dns/node-resolver-5nsl5" Apr 18 02:46:05.186151 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.186105 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/36d98916-52f7-469a-a004-8f19f9739057-tmp-dir\") pod \"node-resolver-5nsl5\" (UID: \"36d98916-52f7-469a-a004-8f19f9739057\") " pod="openshift-dns/node-resolver-5nsl5" Apr 18 02:46:05.186186 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.186147 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/36d98916-52f7-469a-a004-8f19f9739057-hosts-file\") pod \"node-resolver-5nsl5\" (UID: 
\"36d98916-52f7-469a-a004-8f19f9739057\") " pod="openshift-dns/node-resolver-5nsl5" Apr 18 02:46:05.186349 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.186332 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/36d98916-52f7-469a-a004-8f19f9739057-tmp-dir\") pod \"node-resolver-5nsl5\" (UID: \"36d98916-52f7-469a-a004-8f19f9739057\") " pod="openshift-dns/node-resolver-5nsl5" Apr 18 02:46:05.193836 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.193683 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22d69\" (UniqueName: \"kubernetes.io/projected/36d98916-52f7-469a-a004-8f19f9739057-kube-api-access-22d69\") pod \"node-resolver-5nsl5\" (UID: \"36d98916-52f7-469a-a004-8f19f9739057\") " pod="openshift-dns/node-resolver-5nsl5" Apr 18 02:46:05.245221 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.245128 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5nsl5" Apr 18 02:46:05.256848 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:46:05.256602 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36d98916_52f7_469a_a004_8f19f9739057.slice/crio-a9a85180d8148772b01f7a47f3845984565dbd5dcd7e4a23171c764326ab1305 WatchSource:0}: Error finding container a9a85180d8148772b01f7a47f3845984565dbd5dcd7e4a23171c764326ab1305: Status 404 returned error can't find the container with id a9a85180d8148772b01f7a47f3845984565dbd5dcd7e4a23171c764326ab1305 Apr 18 02:46:05.269561 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.268525 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" event={"ID":"79a997b9-1bc8-4876-8383-936b5403b5e5","Type":"ContainerStarted","Data":"46a54cb2d27b5f154c1e280d17ad0d4ae1fd481b51516e9933b561710b253844"} Apr 18 02:46:05.269561 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:46:05.269294 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:46:05.269561 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.269324 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:46:05.269561 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.269338 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:46:05.272142 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.271292 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5nsl5" event={"ID":"36d98916-52f7-469a-a004-8f19f9739057","Type":"ContainerStarted","Data":"a9a85180d8148772b01f7a47f3845984565dbd5dcd7e4a23171c764326ab1305"} Apr 18 02:46:05.287529 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.287237 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret\") pod \"global-pull-secret-syncer-66knh\" (UID: \"3797aadb-3c8b-4eda-9fbe-dce172b0710e\") " pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:46:05.287529 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:05.287406 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 18 02:46:05.287529 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:05.287488 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret podName:3797aadb-3c8b-4eda-9fbe-dce172b0710e nodeName:}" failed. No retries permitted until 2026-04-18 02:46:13.287467648 +0000 UTC m=+32.793929917 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret") pod "global-pull-secret-syncer-66knh" (UID: "3797aadb-3c8b-4eda-9fbe-dce172b0710e") : object "kube-system"/"original-pull-secret" not registered Apr 18 02:46:05.289225 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.289141 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:46:05.289323 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.289301 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:46:05.318243 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:05.318149 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" podStartSLOduration=7.672480346 podStartE2EDuration="24.318131373s" podCreationTimestamp="2026-04-18 02:45:41 +0000 UTC" firstStartedPulling="2026-04-18 02:45:43.763438045 +0000 UTC m=+3.269900315" lastFinishedPulling="2026-04-18 02:46:00.40908909 +0000 UTC m=+19.915551342" observedRunningTime="2026-04-18 02:46:05.293023612 +0000 UTC m=+24.799485887" watchObservedRunningTime="2026-04-18 02:46:05.318131373 +0000 UTC m=+24.824593649" Apr 18 02:46:06.273740 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:06.273704 2575 generic.go:358] "Generic (PLEG): container finished" podID="34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b" containerID="a9d0fa6f6299df671aebf33675621ff18ec9012f8e570d7138d483c4c0225433" exitCode=0 Apr 18 02:46:06.274286 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:06.273799 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcpsj" event={"ID":"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b","Type":"ContainerDied","Data":"a9d0fa6f6299df671aebf33675621ff18ec9012f8e570d7138d483c4c0225433"} Apr 18 
02:46:06.275160 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:06.275137 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5nsl5" event={"ID":"36d98916-52f7-469a-a004-8f19f9739057","Type":"ContainerStarted","Data":"1248142d67e7c11544da2b0ddefb83b9421ba3cbfcbe4398d5e984a1db863dbd"} Apr 18 02:46:07.161944 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:07.161695 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5nsl5" podStartSLOduration=3.161676323 podStartE2EDuration="3.161676323s" podCreationTimestamp="2026-04-18 02:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:46:06.307024329 +0000 UTC m=+25.813486602" watchObservedRunningTime="2026-04-18 02:46:07.161676323 +0000 UTC m=+26.668138599" Apr 18 02:46:07.162500 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:07.162485 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kd89b"] Apr 18 02:46:07.162620 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:07.162608 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:46:07.162742 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:07.162722 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kd89b" podUID="bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce" Apr 18 02:46:07.164708 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:07.164684 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l8m94"] Apr 18 02:46:07.164801 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:07.164795 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:46:07.164891 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:07.164874 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8m94" podUID="06d32427-f8ec-4151-bb49-8eaef8308f79" Apr 18 02:46:07.171839 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:07.171813 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:46:07.172006 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:07.171909 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-66knh" podUID="3797aadb-3c8b-4eda-9fbe-dce172b0710e" Apr 18 02:46:07.175884 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:07.175857 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-66knh"] Apr 18 02:46:07.279124 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:07.279084 2575 generic.go:358] "Generic (PLEG): container finished" podID="34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b" containerID="db72777743930345c6a209b3bcdb044ca9a6d3ab245689acdf8c4beaaff855ed" exitCode=0 Apr 18 02:46:07.279593 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:07.279174 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:46:07.279593 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:07.279201 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcpsj" event={"ID":"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b","Type":"ContainerDied","Data":"db72777743930345c6a209b3bcdb044ca9a6d3ab245689acdf8c4beaaff855ed"} Apr 18 02:46:07.279593 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:07.279264 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-66knh" podUID="3797aadb-3c8b-4eda-9fbe-dce172b0710e" Apr 18 02:46:08.012704 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:08.012670 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-c22bt" Apr 18 02:46:08.012833 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:08.012819 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 18 02:46:08.013258 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:08.013244 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-c22bt" Apr 18 02:46:08.283221 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:08.283188 2575 generic.go:358] "Generic (PLEG): container finished" podID="34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b" containerID="1be05042a182fc5ab2a7d0005cbe070db7ec4e27d183b755256eea0f8bf0deed" exitCode=0 Apr 18 02:46:08.283666 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:08.283267 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcpsj" event={"ID":"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b","Type":"ContainerDied","Data":"1be05042a182fc5ab2a7d0005cbe070db7ec4e27d183b755256eea0f8bf0deed"} Apr 18 02:46:09.171050 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:09.171017 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:46:09.171050 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:09.171035 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:46:09.171225 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:09.171072 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:46:09.171225 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:09.171164 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8m94" podUID="06d32427-f8ec-4151-bb49-8eaef8308f79" Apr 18 02:46:09.171512 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:09.171469 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-66knh" podUID="3797aadb-3c8b-4eda-9fbe-dce172b0710e" Apr 18 02:46:09.171642 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:09.171512 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd89b" podUID="bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce" Apr 18 02:46:11.172112 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:11.172076 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:46:11.172971 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:11.172175 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:46:11.172971 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:11.172207 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:46:11.172971 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:11.172216 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8m94" podUID="06d32427-f8ec-4151-bb49-8eaef8308f79" Apr 18 02:46:11.172971 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:11.172278 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-66knh" podUID="3797aadb-3c8b-4eda-9fbe-dce172b0710e" Apr 18 02:46:11.172971 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:11.172346 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd89b" podUID="bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce" Apr 18 02:46:13.171339 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.171045 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:46:13.171782 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.171048 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:46:13.171782 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:13.171389 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8m94" podUID="06d32427-f8ec-4151-bb49-8eaef8308f79" Apr 18 02:46:13.171782 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:13.171445 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kd89b" podUID="bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce" Apr 18 02:46:13.171782 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.171048 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:46:13.171782 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:13.171548 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-66knh" podUID="3797aadb-3c8b-4eda-9fbe-dce172b0710e" Apr 18 02:46:13.290925 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.290890 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-103.ec2.internal" event="NodeReady" Apr 18 02:46:13.291090 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.291060 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 18 02:46:13.324798 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.324766 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6f46fc6d96-m2499"] Apr 18 02:46:13.352161 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.352103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret\") pod \"global-pull-secret-syncer-66knh\" (UID: \"3797aadb-3c8b-4eda-9fbe-dce172b0710e\") " pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:46:13.352348 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:13.352279 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 18 02:46:13.352406 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:13.352349 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret podName:3797aadb-3c8b-4eda-9fbe-dce172b0710e nodeName:}" failed. No retries permitted until 2026-04-18 02:46:29.352329881 +0000 UTC m=+48.858792149 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret") pod "global-pull-secret-syncer-66knh" (UID: "3797aadb-3c8b-4eda-9fbe-dce172b0710e") : object "kube-system"/"original-pull-secret" not registered Apr 18 02:46:13.353919 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.353894 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-svnzk"] Apr 18 02:46:13.354084 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.354066 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.356494 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.356395 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jxh9b\"" Apr 18 02:46:13.356494 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.356439 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 18 02:46:13.356494 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.356442 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 18 02:46:13.356828 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.356807 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 18 02:46:13.363361 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.363346 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 18 02:46:13.375231 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.375209 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-96n6r"] 
Apr 18 02:46:13.375330 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.375245 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-svnzk" Apr 18 02:46:13.377679 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.377537 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 18 02:46:13.377679 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.377547 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 18 02:46:13.377679 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.377543 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-djfrv\"" Apr 18 02:46:13.397063 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.397043 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs"] Apr 18 02:46:13.397213 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.397196 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-96n6r" Apr 18 02:46:13.399925 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.399349 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 18 02:46:13.399925 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.399445 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-rld8r\"" Apr 18 02:46:13.399925 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.399544 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 18 02:46:13.417733 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.417705 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f46fc6d96-m2499"] Apr 18 02:46:13.417879 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.417808 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-znc79"] Apr 18 02:46:13.417879 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.417822 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs" Apr 18 02:46:13.420363 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.420302 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-xds92\"" Apr 18 02:46:13.420363 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.420313 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 18 02:46:13.420363 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.420344 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 18 02:46:13.435804 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.435783 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-svnzk"] Apr 18 02:46:13.435931 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.435818 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-96n6r"] Apr 18 02:46:13.435931 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.435832 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs"] Apr 18 02:46:13.435931 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.435845 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-znc79"] Apr 18 02:46:13.435931 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.435924 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-znc79" Apr 18 02:46:13.438361 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.438341 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 18 02:46:13.438361 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.438356 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dwvkt\"" Apr 18 02:46:13.438517 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.438393 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 18 02:46:13.438517 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.438428 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 18 02:46:13.453173 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.453148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-bound-sa-token\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.453333 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.453189 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk" Apr 18 02:46:13.453333 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.453222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6061e030-0942-41b4-b979-14994114dfae-registry-certificates\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.453333 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.453293 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6061e030-0942-41b4-b979-14994114dfae-image-registry-private-configuration\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.453333 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.453324 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.453520 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.453350 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70a58891-387a-4f51-a0bf-e2abf38cf891-config-volume\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk" Apr 18 02:46:13.453520 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.453450 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6061e030-0942-41b4-b979-14994114dfae-trusted-ca\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " 
pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.453520 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.453480 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxgdw\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-kube-api-access-nxgdw\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.453520 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.453508 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6061e030-0942-41b4-b979-14994114dfae-ca-trust-extracted\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.453738 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.453588 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/70a58891-387a-4f51-a0bf-e2abf38cf891-tmp-dir\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk" Apr 18 02:46:13.453738 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.453654 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6061e030-0942-41b4-b979-14994114dfae-installation-pull-secrets\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.453738 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.453684 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cphpl\" (UniqueName: \"kubernetes.io/projected/70a58891-387a-4f51-a0bf-e2abf38cf891-kube-api-access-cphpl\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk" Apr 18 02:46:13.554693 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.554654 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6061e030-0942-41b4-b979-14994114dfae-registry-certificates\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.554858 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.554724 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6061e030-0942-41b4-b979-14994114dfae-image-registry-private-configuration\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.554858 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.554754 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.554858 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.554779 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70a58891-387a-4f51-a0bf-e2abf38cf891-config-volume\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " 
pod="openshift-dns/dns-default-svnzk" Apr 18 02:46:13.554858 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.554815 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert\") pod \"ingress-canary-znc79\" (UID: \"6e67776e-f785-425c-9c6b-56b4c10fccaa\") " pod="openshift-ingress-canary/ingress-canary-znc79" Apr 18 02:46:13.554858 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.554852 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-bfhrs\" (UID: \"3c20ea1d-8a16-44d1-8cb9-0310ba00246c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs" Apr 18 02:46:13.555122 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.554886 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks8fq\" (UniqueName: \"kubernetes.io/projected/6e67776e-f785-425c-9c6b-56b4c10fccaa-kube-api-access-ks8fq\") pod \"ingress-canary-znc79\" (UID: \"6e67776e-f785-425c-9c6b-56b4c10fccaa\") " pod="openshift-ingress-canary/ingress-canary-znc79" Apr 18 02:46:13.555122 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.554932 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6061e030-0942-41b4-b979-14994114dfae-trusted-ca\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.555122 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.554959 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxgdw\" (UniqueName: 
\"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-kube-api-access-nxgdw\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.555122 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.554990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6061e030-0942-41b4-b979-14994114dfae-ca-trust-extracted\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.555122 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.555018 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bfhrs\" (UID: \"3c20ea1d-8a16-44d1-8cb9-0310ba00246c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs" Apr 18 02:46:13.555122 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.555047 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7fpn\" (UniqueName: \"kubernetes.io/projected/5c725ae7-337b-4312-8646-37533fccfd6a-kube-api-access-n7fpn\") pod \"network-check-source-8894fc9bd-96n6r\" (UID: \"5c725ae7-337b-4312-8646-37533fccfd6a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-96n6r" Apr 18 02:46:13.555122 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.555095 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/70a58891-387a-4f51-a0bf-e2abf38cf891-tmp-dir\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " 
pod="openshift-dns/dns-default-svnzk" Apr 18 02:46:13.555401 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.555139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6061e030-0942-41b4-b979-14994114dfae-installation-pull-secrets\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.555401 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.555165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cphpl\" (UniqueName: \"kubernetes.io/projected/70a58891-387a-4f51-a0bf-e2abf38cf891-kube-api-access-cphpl\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk" Apr 18 02:46:13.555401 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.555194 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-bound-sa-token\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.555401 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.555218 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk" Apr 18 02:46:13.555401 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:13.555376 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 18 02:46:13.555401 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.555383 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6061e030-0942-41b4-b979-14994114dfae-registry-certificates\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:13.555904 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.555435 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70a58891-387a-4f51-a0bf-e2abf38cf891-config-volume\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk" Apr 18 02:46:13.555904 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:13.555445 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls podName:70a58891-387a-4f51-a0bf-e2abf38cf891 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:14.055428314 +0000 UTC m=+33.561890582 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls") pod "dns-default-svnzk" (UID: "70a58891-387a-4f51-a0bf-e2abf38cf891") : secret "dns-default-metrics-tls" not found
Apr 18 02:46:13.555904 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:13.555490 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 18 02:46:13.555904 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:13.555505 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f46fc6d96-m2499: secret "image-registry-tls" not found
Apr 18 02:46:13.555904 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:13.555553 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls podName:6061e030-0942-41b4-b979-14994114dfae nodeName:}" failed. No retries permitted until 2026-04-18 02:46:14.055535693 +0000 UTC m=+33.561997959 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls") pod "image-registry-6f46fc6d96-m2499" (UID: "6061e030-0942-41b4-b979-14994114dfae") : secret "image-registry-tls" not found
Apr 18 02:46:13.555904 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.555737 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/70a58891-387a-4f51-a0bf-e2abf38cf891-tmp-dir\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk"
Apr 18 02:46:13.555904 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.555753 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6061e030-0942-41b4-b979-14994114dfae-ca-trust-extracted\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499"
Apr 18 02:46:13.556527 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.556496 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6061e030-0942-41b4-b979-14994114dfae-trusted-ca\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499"
Apr 18 02:46:13.559793 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.559774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6061e030-0942-41b4-b979-14994114dfae-installation-pull-secrets\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499"
Apr 18 02:46:13.559889 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.559775 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6061e030-0942-41b4-b979-14994114dfae-image-registry-private-configuration\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499"
Apr 18 02:46:13.564206 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.564183 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cphpl\" (UniqueName: \"kubernetes.io/projected/70a58891-387a-4f51-a0bf-e2abf38cf891-kube-api-access-cphpl\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk"
Apr 18 02:46:13.565008 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.564967 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-bound-sa-token\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499"
Apr 18 02:46:13.565535 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.565513 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxgdw\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-kube-api-access-nxgdw\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499"
Apr 18 02:46:13.656585 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.656547 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert\") pod \"ingress-canary-znc79\" (UID: \"6e67776e-f785-425c-9c6b-56b4c10fccaa\") " pod="openshift-ingress-canary/ingress-canary-znc79"
Apr 18 02:46:13.656585 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.656594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-bfhrs\" (UID: \"3c20ea1d-8a16-44d1-8cb9-0310ba00246c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs"
Apr 18 02:46:13.656848 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.656625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ks8fq\" (UniqueName: \"kubernetes.io/projected/6e67776e-f785-425c-9c6b-56b4c10fccaa-kube-api-access-ks8fq\") pod \"ingress-canary-znc79\" (UID: \"6e67776e-f785-425c-9c6b-56b4c10fccaa\") " pod="openshift-ingress-canary/ingress-canary-znc79"
Apr 18 02:46:13.656848 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:13.656732 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 18 02:46:13.656848 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:13.656799 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert podName:6e67776e-f785-425c-9c6b-56b4c10fccaa nodeName:}" failed. No retries permitted until 2026-04-18 02:46:14.156777938 +0000 UTC m=+33.663240212 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert") pod "ingress-canary-znc79" (UID: "6e67776e-f785-425c-9c6b-56b4c10fccaa") : secret "canary-serving-cert" not found
Apr 18 02:46:13.657010 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.656916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bfhrs\" (UID: \"3c20ea1d-8a16-44d1-8cb9-0310ba00246c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs"
Apr 18 02:46:13.657010 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.656945 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7fpn\" (UniqueName: \"kubernetes.io/projected/5c725ae7-337b-4312-8646-37533fccfd6a-kube-api-access-n7fpn\") pod \"network-check-source-8894fc9bd-96n6r\" (UID: \"5c725ae7-337b-4312-8646-37533fccfd6a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-96n6r"
Apr 18 02:46:13.657114 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:13.657062 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 18 02:46:13.657169 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:13.657137 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert podName:3c20ea1d-8a16-44d1-8cb9-0310ba00246c nodeName:}" failed. No retries permitted until 2026-04-18 02:46:14.15711899 +0000 UTC m=+33.663581246 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bfhrs" (UID: "3c20ea1d-8a16-44d1-8cb9-0310ba00246c") : secret "networking-console-plugin-cert" not found
Apr 18 02:46:13.657397 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.657375 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-bfhrs\" (UID: \"3c20ea1d-8a16-44d1-8cb9-0310ba00246c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs"
Apr 18 02:46:13.665536 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.665456 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7fpn\" (UniqueName: \"kubernetes.io/projected/5c725ae7-337b-4312-8646-37533fccfd6a-kube-api-access-n7fpn\") pod \"network-check-source-8894fc9bd-96n6r\" (UID: \"5c725ae7-337b-4312-8646-37533fccfd6a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-96n6r"
Apr 18 02:46:13.665670 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.665535 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks8fq\" (UniqueName: \"kubernetes.io/projected/6e67776e-f785-425c-9c6b-56b4c10fccaa-kube-api-access-ks8fq\") pod \"ingress-canary-znc79\" (UID: \"6e67776e-f785-425c-9c6b-56b4c10fccaa\") " pod="openshift-ingress-canary/ingress-canary-znc79"
Apr 18 02:46:13.708542 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:13.708462 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-96n6r"
Apr 18 02:46:14.061182 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:14.061143 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499"
Apr 18 02:46:14.061361 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:14.061279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk"
Apr 18 02:46:14.061361 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:14.061322 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 18 02:46:14.061361 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:14.061346 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f46fc6d96-m2499: secret "image-registry-tls" not found
Apr 18 02:46:14.061505 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:14.061391 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 18 02:46:14.061505 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:14.061415 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls podName:6061e030-0942-41b4-b979-14994114dfae nodeName:}" failed. No retries permitted until 2026-04-18 02:46:15.06139365 +0000 UTC m=+34.567855913 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls") pod "image-registry-6f46fc6d96-m2499" (UID: "6061e030-0942-41b4-b979-14994114dfae") : secret "image-registry-tls" not found
Apr 18 02:46:14.061505 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:14.061443 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls podName:70a58891-387a-4f51-a0bf-e2abf38cf891 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:15.061425851 +0000 UTC m=+34.567888103 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls") pod "dns-default-svnzk" (UID: "70a58891-387a-4f51-a0bf-e2abf38cf891") : secret "dns-default-metrics-tls" not found
Apr 18 02:46:14.162562 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:14.162524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert\") pod \"ingress-canary-znc79\" (UID: \"6e67776e-f785-425c-9c6b-56b4c10fccaa\") " pod="openshift-ingress-canary/ingress-canary-znc79"
Apr 18 02:46:14.162743 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:14.162577 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bfhrs\" (UID: \"3c20ea1d-8a16-44d1-8cb9-0310ba00246c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs"
Apr 18 02:46:14.162743 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:14.162694 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 18 02:46:14.162826 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:14.162767 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert podName:6e67776e-f785-425c-9c6b-56b4c10fccaa nodeName:}" failed. No retries permitted until 2026-04-18 02:46:15.162750099 +0000 UTC m=+34.669212351 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert") pod "ingress-canary-znc79" (UID: "6e67776e-f785-425c-9c6b-56b4c10fccaa") : secret "canary-serving-cert" not found
Apr 18 02:46:14.162826 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:14.162704 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 18 02:46:14.162910 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:14.162830 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert podName:3c20ea1d-8a16-44d1-8cb9-0310ba00246c nodeName:}" failed. No retries permitted until 2026-04-18 02:46:15.162817941 +0000 UTC m=+34.669280200 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bfhrs" (UID: "3c20ea1d-8a16-44d1-8cb9-0310ba00246c") : secret "networking-console-plugin-cert" not found
Apr 18 02:46:14.374153 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:14.374123 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-96n6r"]
Apr 18 02:46:14.377888 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:46:14.377857 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c725ae7_337b_4312_8646_37533fccfd6a.slice/crio-cd6394920947639f592ca39556123982d8260328df2d625d29bdbbb3d0679441 WatchSource:0}: Error finding container cd6394920947639f592ca39556123982d8260328df2d625d29bdbbb3d0679441: Status 404 returned error can't find the container with id cd6394920947639f592ca39556123982d8260328df2d625d29bdbbb3d0679441
Apr 18 02:46:14.969983 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:14.969900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs\") pod \"network-metrics-daemon-l8m94\" (UID: \"06d32427-f8ec-4151-bb49-8eaef8308f79\") " pod="openshift-multus/network-metrics-daemon-l8m94"
Apr 18 02:46:14.970152 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:14.970055 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:46:14.970152 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:14.970125 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs podName:06d32427-f8ec-4151-bb49-8eaef8308f79 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:46.97010633 +0000 UTC m=+66.476568587 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs") pod "network-metrics-daemon-l8m94" (UID: "06d32427-f8ec-4151-bb49-8eaef8308f79") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:46:15.070858 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.070650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk"
Apr 18 02:46:15.071418 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:15.070815 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 18 02:46:15.071418 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.070897 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499"
Apr 18 02:46:15.071418 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:15.070982 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls podName:70a58891-387a-4f51-a0bf-e2abf38cf891 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:17.070959719 +0000 UTC m=+36.577421974 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls") pod "dns-default-svnzk" (UID: "70a58891-387a-4f51-a0bf-e2abf38cf891") : secret "dns-default-metrics-tls" not found
Apr 18 02:46:15.071418 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:15.070993 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 18 02:46:15.071418 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:15.071008 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f46fc6d96-m2499: secret "image-registry-tls" not found
Apr 18 02:46:15.071418 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:15.071053 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls podName:6061e030-0942-41b4-b979-14994114dfae nodeName:}" failed. No retries permitted until 2026-04-18 02:46:17.071040197 +0000 UTC m=+36.577502449 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls") pod "image-registry-6f46fc6d96-m2499" (UID: "6061e030-0942-41b4-b979-14994114dfae") : secret "image-registry-tls" not found
Apr 18 02:46:15.071418 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.071101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhdj4\" (UniqueName: \"kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4\") pod \"network-check-target-kd89b\" (UID: \"bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce\") " pod="openshift-network-diagnostics/network-check-target-kd89b"
Apr 18 02:46:15.075606 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.075581 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhdj4\" (UniqueName: \"kubernetes.io/projected/bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce-kube-api-access-lhdj4\") pod \"network-check-target-kd89b\" (UID: \"bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce\") " pod="openshift-network-diagnostics/network-check-target-kd89b"
Apr 18 02:46:15.170986 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.170947 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94"
Apr 18 02:46:15.170986 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.170985 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66knh"
Apr 18 02:46:15.171324 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.170947 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b"
Apr 18 02:46:15.171463 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.171445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert\") pod \"ingress-canary-znc79\" (UID: \"6e67776e-f785-425c-9c6b-56b4c10fccaa\") " pod="openshift-ingress-canary/ingress-canary-znc79"
Apr 18 02:46:15.171520 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.171504 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bfhrs\" (UID: \"3c20ea1d-8a16-44d1-8cb9-0310ba00246c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs"
Apr 18 02:46:15.171639 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:15.171613 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 18 02:46:15.171763 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:15.171667 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 18 02:46:15.171763 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:15.171701 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert podName:3c20ea1d-8a16-44d1-8cb9-0310ba00246c nodeName:}" failed. No retries permitted until 2026-04-18 02:46:17.171683859 +0000 UTC m=+36.678146128 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bfhrs" (UID: "3c20ea1d-8a16-44d1-8cb9-0310ba00246c") : secret "networking-console-plugin-cert" not found
Apr 18 02:46:15.171763 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:15.171716 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert podName:6e67776e-f785-425c-9c6b-56b4c10fccaa nodeName:}" failed. No retries permitted until 2026-04-18 02:46:17.17171015 +0000 UTC m=+36.678172402 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert") pod "ingress-canary-znc79" (UID: "6e67776e-f785-425c-9c6b-56b4c10fccaa") : secret "canary-serving-cert" not found
Apr 18 02:46:15.173788 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.173768 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 18 02:46:15.174856 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.174599 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5nn98\""
Apr 18 02:46:15.174856 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.174607 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 18 02:46:15.174856 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.174607 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h2hff\""
Apr 18 02:46:15.190771 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.190753 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kd89b"
Apr 18 02:46:15.296910 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.296877 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-96n6r" event={"ID":"5c725ae7-337b-4312-8646-37533fccfd6a","Type":"ContainerStarted","Data":"cd6394920947639f592ca39556123982d8260328df2d625d29bdbbb3d0679441"}
Apr 18 02:46:15.299549 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.299524 2575 generic.go:358] "Generic (PLEG): container finished" podID="34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b" containerID="416c821df20885ea2b8a25fcf9bb0d75fb9537df1242b9071b4ee2c2f3073202" exitCode=0
Apr 18 02:46:15.299684 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.299580 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcpsj" event={"ID":"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b","Type":"ContainerDied","Data":"416c821df20885ea2b8a25fcf9bb0d75fb9537df1242b9071b4ee2c2f3073202"}
Apr 18 02:46:15.324337 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:15.324309 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kd89b"]
Apr 18 02:46:15.327476 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:46:15.327447 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd15b1f0_3b60_4649_8b1a_c95b3e5fc9ce.slice/crio-2c457a4a1271320a6e4dbdcc73ba76113e8f34ba91b3fa1259c8a1c6b0381fa5 WatchSource:0}: Error finding container 2c457a4a1271320a6e4dbdcc73ba76113e8f34ba91b3fa1259c8a1c6b0381fa5: Status 404 returned error can't find the container with id 2c457a4a1271320a6e4dbdcc73ba76113e8f34ba91b3fa1259c8a1c6b0381fa5
Apr 18 02:46:16.305518 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:16.305482 2575 generic.go:358] "Generic (PLEG): container finished" podID="34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b" containerID="f7bff1add060e50191b5b03429af5f701cee7bc5bef5101ec1b88a0ee1d70e1d" exitCode=0
Apr 18 02:46:16.306182 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:16.305564 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcpsj" event={"ID":"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b","Type":"ContainerDied","Data":"f7bff1add060e50191b5b03429af5f701cee7bc5bef5101ec1b88a0ee1d70e1d"}
Apr 18 02:46:16.307432 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:16.307335 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kd89b" event={"ID":"bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce","Type":"ContainerStarted","Data":"2c457a4a1271320a6e4dbdcc73ba76113e8f34ba91b3fa1259c8a1c6b0381fa5"}
Apr 18 02:46:17.089785 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:17.089744 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk"
Apr 18 02:46:17.089974 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:17.089812 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499"
Apr 18 02:46:17.089974 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:17.089933 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 18 02:46:17.090080 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:17.090008 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls podName:70a58891-387a-4f51-a0bf-e2abf38cf891 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:21.089991052 +0000 UTC m=+40.596453328 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls") pod "dns-default-svnzk" (UID: "70a58891-387a-4f51-a0bf-e2abf38cf891") : secret "dns-default-metrics-tls" not found
Apr 18 02:46:17.090080 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:17.089943 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 18 02:46:17.090080 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:17.090033 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f46fc6d96-m2499: secret "image-registry-tls" not found
Apr 18 02:46:17.090219 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:17.090089 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls podName:6061e030-0942-41b4-b979-14994114dfae nodeName:}" failed. No retries permitted until 2026-04-18 02:46:21.090073134 +0000 UTC m=+40.596535396 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls") pod "image-registry-6f46fc6d96-m2499" (UID: "6061e030-0942-41b4-b979-14994114dfae") : secret "image-registry-tls" not found
Apr 18 02:46:17.190438 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:17.190403 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert\") pod \"ingress-canary-znc79\" (UID: \"6e67776e-f785-425c-9c6b-56b4c10fccaa\") " pod="openshift-ingress-canary/ingress-canary-znc79"
Apr 18 02:46:17.190620 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:17.190470 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bfhrs\" (UID: \"3c20ea1d-8a16-44d1-8cb9-0310ba00246c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs"
Apr 18 02:46:17.190620 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:17.190553 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 18 02:46:17.190620 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:17.190609 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert podName:3c20ea1d-8a16-44d1-8cb9-0310ba00246c nodeName:}" failed. No retries permitted until 2026-04-18 02:46:21.190595609 +0000 UTC m=+40.697057861 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bfhrs" (UID: "3c20ea1d-8a16-44d1-8cb9-0310ba00246c") : secret "networking-console-plugin-cert" not found
Apr 18 02:46:17.190763 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:17.190553 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 18 02:46:17.190763 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:17.190684 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert podName:6e67776e-f785-425c-9c6b-56b4c10fccaa nodeName:}" failed. No retries permitted until 2026-04-18 02:46:21.19067151 +0000 UTC m=+40.697133766 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert") pod "ingress-canary-znc79" (UID: "6e67776e-f785-425c-9c6b-56b4c10fccaa") : secret "canary-serving-cert" not found
Apr 18 02:46:17.313377 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:17.313345 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcpsj" event={"ID":"34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b","Type":"ContainerStarted","Data":"51cd5c6dccf1868d44eb3e65d7c96e80befa5c926979247cfe0cb3c67d371f64"}
Apr 18 02:46:17.334808 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:17.334752 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rcpsj" podStartSLOduration=5.560890981 podStartE2EDuration="36.334730714s" podCreationTimestamp="2026-04-18 02:45:41 +0000 UTC" firstStartedPulling="2026-04-18 02:45:43.763455619 +0000 UTC m=+3.269917886" lastFinishedPulling="2026-04-18 02:46:14.537295367 +0000 UTC m=+34.043757619" observedRunningTime="2026-04-18 02:46:17.332767847 +0000 UTC m=+36.839230126" watchObservedRunningTime="2026-04-18 02:46:17.334730714 +0000 UTC m=+36.841192990"
Apr 18 02:46:18.317572 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:18.317365 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-96n6r" event={"ID":"5c725ae7-337b-4312-8646-37533fccfd6a","Type":"ContainerStarted","Data":"104ddc4944dea24fd27cf3f06b6620c699af122b1bd56251469f7e09875e3f6d"}
Apr 18 02:46:18.318763 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:18.318731 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kd89b" event={"ID":"bd15b1f0-3b60-4649-8b1a-c95b3e5fc9ce","Type":"ContainerStarted","Data":"d5c676662ceb451f24148720b7c8bf3d2eed56684dff03938935473a8c375356"}
Apr 18 02:46:18.319120 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:18.319097 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-kd89b"
Apr 18 02:46:18.332011 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:18.331968 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-96n6r" podStartSLOduration=32.061742887 podStartE2EDuration="35.331953068s" podCreationTimestamp="2026-04-18 02:45:43 +0000 UTC" firstStartedPulling="2026-04-18 02:46:14.515338601 +0000 UTC m=+34.021800868" lastFinishedPulling="2026-04-18 02:46:17.785548797 +0000 UTC m=+37.292011049" observedRunningTime="2026-04-18 02:46:18.330566819 +0000 UTC m=+37.837029096" watchObservedRunningTime="2026-04-18 02:46:18.331953068 +0000 UTC m=+37.838415338"
Apr 18 02:46:18.343927 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:18.343879 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-kd89b" podStartSLOduration=34.878917159 podStartE2EDuration="37.343867148s" podCreationTimestamp="2026-04-18 02:45:41 +0000 UTC" firstStartedPulling="2026-04-18 02:46:15.329402136 +0000 UTC m=+34.835864387" lastFinishedPulling="2026-04-18 02:46:17.794352124 +0000 UTC m=+37.300814376" observedRunningTime="2026-04-18 02:46:18.34342073 +0000 UTC m=+37.849883006" watchObservedRunningTime="2026-04-18 02:46:18.343867148 +0000 UTC m=+37.850329422"
Apr 18 02:46:21.122090 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:21.122052 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499"
Apr 18 02:46:21.122580 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:21.122228 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk"
Apr 18 02:46:21.122580 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:21.122244 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 18 02:46:21.122580 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:21.122270 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f46fc6d96-m2499: secret "image-registry-tls" not found
Apr 18 02:46:21.122580 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:21.122307 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 18 02:46:21.122580 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:21.122337 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls podName:6061e030-0942-41b4-b979-14994114dfae nodeName:}" failed. No retries permitted until 2026-04-18 02:46:29.122316119 +0000 UTC m=+48.628778385 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls") pod "image-registry-6f46fc6d96-m2499" (UID: "6061e030-0942-41b4-b979-14994114dfae") : secret "image-registry-tls" not found
Apr 18 02:46:21.122580 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:21.122355 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls podName:70a58891-387a-4f51-a0bf-e2abf38cf891 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:29.122346665 +0000 UTC m=+48.628808919 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls") pod "dns-default-svnzk" (UID: "70a58891-387a-4f51-a0bf-e2abf38cf891") : secret "dns-default-metrics-tls" not found
Apr 18 02:46:21.222513 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:21.222477 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bfhrs\" (UID: \"3c20ea1d-8a16-44d1-8cb9-0310ba00246c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs"
Apr 18 02:46:21.222688 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:21.222569 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert\") pod
\"ingress-canary-znc79\" (UID: \"6e67776e-f785-425c-9c6b-56b4c10fccaa\") " pod="openshift-ingress-canary/ingress-canary-znc79" Apr 18 02:46:21.222688 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:21.222618 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 18 02:46:21.222688 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:21.222663 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 18 02:46:21.222792 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:21.222709 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert podName:6e67776e-f785-425c-9c6b-56b4c10fccaa nodeName:}" failed. No retries permitted until 2026-04-18 02:46:29.222694576 +0000 UTC m=+48.729156841 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert") pod "ingress-canary-znc79" (UID: "6e67776e-f785-425c-9c6b-56b4c10fccaa") : secret "canary-serving-cert" not found Apr 18 02:46:21.222792 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:21.222729 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert podName:3c20ea1d-8a16-44d1-8cb9-0310ba00246c nodeName:}" failed. No retries permitted until 2026-04-18 02:46:29.222721658 +0000 UTC m=+48.729183912 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bfhrs" (UID: "3c20ea1d-8a16-44d1-8cb9-0310ba00246c") : secret "networking-console-plugin-cert" not found Apr 18 02:46:29.183928 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:29.183893 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk" Apr 18 02:46:29.184279 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:29.183946 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:29.184279 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:29.184043 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 18 02:46:29.184279 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:29.184103 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls podName:70a58891-387a-4f51-a0bf-e2abf38cf891 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:45.184087689 +0000 UTC m=+64.690549950 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls") pod "dns-default-svnzk" (UID: "70a58891-387a-4f51-a0bf-e2abf38cf891") : secret "dns-default-metrics-tls" not found Apr 18 02:46:29.184279 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:29.184100 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 18 02:46:29.184279 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:29.184117 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f46fc6d96-m2499: secret "image-registry-tls" not found Apr 18 02:46:29.184279 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:29.184140 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls podName:6061e030-0942-41b4-b979-14994114dfae nodeName:}" failed. No retries permitted until 2026-04-18 02:46:45.184133586 +0000 UTC m=+64.690595838 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls") pod "image-registry-6f46fc6d96-m2499" (UID: "6061e030-0942-41b4-b979-14994114dfae") : secret "image-registry-tls" not found Apr 18 02:46:29.284981 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:29.284948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert\") pod \"ingress-canary-znc79\" (UID: \"6e67776e-f785-425c-9c6b-56b4c10fccaa\") " pod="openshift-ingress-canary/ingress-canary-znc79" Apr 18 02:46:29.285106 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:29.285005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bfhrs\" (UID: \"3c20ea1d-8a16-44d1-8cb9-0310ba00246c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs" Apr 18 02:46:29.285144 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:29.285106 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 18 02:46:29.285144 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:29.285120 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 18 02:46:29.285206 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:29.285169 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert podName:6e67776e-f785-425c-9c6b-56b4c10fccaa nodeName:}" failed. No retries permitted until 2026-04-18 02:46:45.2851563 +0000 UTC m=+64.791618556 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert") pod "ingress-canary-znc79" (UID: "6e67776e-f785-425c-9c6b-56b4c10fccaa") : secret "canary-serving-cert" not found Apr 18 02:46:29.285206 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:29.285183 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert podName:3c20ea1d-8a16-44d1-8cb9-0310ba00246c nodeName:}" failed. No retries permitted until 2026-04-18 02:46:45.28517795 +0000 UTC m=+64.791640205 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bfhrs" (UID: "3c20ea1d-8a16-44d1-8cb9-0310ba00246c") : secret "networking-console-plugin-cert" not found Apr 18 02:46:29.385694 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:29.385662 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret\") pod \"global-pull-secret-syncer-66knh\" (UID: \"3797aadb-3c8b-4eda-9fbe-dce172b0710e\") " pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:46:29.388990 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:29.388953 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3797aadb-3c8b-4eda-9fbe-dce172b0710e-original-pull-secret\") pod \"global-pull-secret-syncer-66knh\" (UID: \"3797aadb-3c8b-4eda-9fbe-dce172b0710e\") " pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:46:29.597579 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:29.597551 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-66knh" Apr 18 02:46:29.707456 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:29.707426 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-66knh"] Apr 18 02:46:30.341437 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:30.341396 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-66knh" event={"ID":"3797aadb-3c8b-4eda-9fbe-dce172b0710e","Type":"ContainerStarted","Data":"8a74e8c8a27fdf9951b942a1b2bf5e8da9ca69d7064fae533c2a55b5f3fd3969"} Apr 18 02:46:34.354201 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:34.354152 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-66knh" event={"ID":"3797aadb-3c8b-4eda-9fbe-dce172b0710e","Type":"ContainerStarted","Data":"4501943f139f1ef9971fd198498681c371f17f491bd9c7d651769986001b0c3b"} Apr 18 02:46:34.367378 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:34.367329 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-66knh" podStartSLOduration=33.173960971 podStartE2EDuration="37.367312656s" podCreationTimestamp="2026-04-18 02:45:57 +0000 UTC" firstStartedPulling="2026-04-18 02:46:29.712394238 +0000 UTC m=+49.218856490" lastFinishedPulling="2026-04-18 02:46:33.905745918 +0000 UTC m=+53.412208175" observedRunningTime="2026-04-18 02:46:34.366834528 +0000 UTC m=+53.873296801" watchObservedRunningTime="2026-04-18 02:46:34.367312656 +0000 UTC m=+53.873774930" Apr 18 02:46:37.293095 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:37.293068 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lsz2j" Apr 18 02:46:45.210936 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:45.210890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk" Apr 18 02:46:45.211313 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:45.210955 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:46:45.211313 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:45.211048 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 18 02:46:45.211313 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:45.211066 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f46fc6d96-m2499: secret "image-registry-tls" not found Apr 18 02:46:45.211313 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:45.211045 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 18 02:46:45.211313 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:45.211129 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls podName:6061e030-0942-41b4-b979-14994114dfae nodeName:}" failed. No retries permitted until 2026-04-18 02:47:17.211111799 +0000 UTC m=+96.717574066 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls") pod "image-registry-6f46fc6d96-m2499" (UID: "6061e030-0942-41b4-b979-14994114dfae") : secret "image-registry-tls" not found Apr 18 02:46:45.211313 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:45.211144 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls podName:70a58891-387a-4f51-a0bf-e2abf38cf891 nodeName:}" failed. No retries permitted until 2026-04-18 02:47:17.211135935 +0000 UTC m=+96.717598190 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls") pod "dns-default-svnzk" (UID: "70a58891-387a-4f51-a0bf-e2abf38cf891") : secret "dns-default-metrics-tls" not found Apr 18 02:46:45.311722 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:45.311681 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert\") pod \"ingress-canary-znc79\" (UID: \"6e67776e-f785-425c-9c6b-56b4c10fccaa\") " pod="openshift-ingress-canary/ingress-canary-znc79" Apr 18 02:46:45.311908 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:45.311740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bfhrs\" (UID: \"3c20ea1d-8a16-44d1-8cb9-0310ba00246c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs" Apr 18 02:46:45.311908 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:45.311836 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 18 
02:46:45.311908 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:45.311868 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 18 02:46:45.311908 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:45.311899 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert podName:6e67776e-f785-425c-9c6b-56b4c10fccaa nodeName:}" failed. No retries permitted until 2026-04-18 02:47:17.311884868 +0000 UTC m=+96.818347119 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert") pod "ingress-canary-znc79" (UID: "6e67776e-f785-425c-9c6b-56b4c10fccaa") : secret "canary-serving-cert" not found Apr 18 02:46:45.312046 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:45.311917 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert podName:3c20ea1d-8a16-44d1-8cb9-0310ba00246c nodeName:}" failed. No retries permitted until 2026-04-18 02:47:17.311904859 +0000 UTC m=+96.818367111 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bfhrs" (UID: "3c20ea1d-8a16-44d1-8cb9-0310ba00246c") : secret "networking-console-plugin-cert" not found Apr 18 02:46:47.024268 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:47.024230 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs\") pod \"network-metrics-daemon-l8m94\" (UID: \"06d32427-f8ec-4151-bb49-8eaef8308f79\") " pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:46:47.027177 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:47.027152 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 18 02:46:47.035403 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:47.035376 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 18 02:46:47.035523 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:46:47.035457 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs podName:06d32427-f8ec-4151-bb49-8eaef8308f79 nodeName:}" failed. No retries permitted until 2026-04-18 02:47:51.035434564 +0000 UTC m=+130.541896817 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs") pod "network-metrics-daemon-l8m94" (UID: "06d32427-f8ec-4151-bb49-8eaef8308f79") : secret "metrics-daemon-secret" not found Apr 18 02:46:49.348616 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.348585 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb"] Apr 18 02:46:49.353845 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.353820 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" Apr 18 02:46:49.356172 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.356148 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 18 02:46:49.356172 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.356167 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 18 02:46:49.361361 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.357742 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 18 02:46:49.361361 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.358602 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 18 02:46:49.361361 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.360370 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb"] Apr 18 02:46:49.442410 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.442368 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50cd1245-4ac3-410e-ad5c-38df2947b265-tmp\") pod \"klusterlet-addon-workmgr-6d6779f76-mw5sb\" (UID: \"50cd1245-4ac3-410e-ad5c-38df2947b265\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" Apr 18 02:46:49.442595 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.442421 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj5gq\" (UniqueName: \"kubernetes.io/projected/50cd1245-4ac3-410e-ad5c-38df2947b265-kube-api-access-zj5gq\") pod \"klusterlet-addon-workmgr-6d6779f76-mw5sb\" (UID: \"50cd1245-4ac3-410e-ad5c-38df2947b265\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" Apr 18 02:46:49.442595 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.442516 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/50cd1245-4ac3-410e-ad5c-38df2947b265-klusterlet-config\") pod \"klusterlet-addon-workmgr-6d6779f76-mw5sb\" (UID: \"50cd1245-4ac3-410e-ad5c-38df2947b265\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" Apr 18 02:46:49.543091 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.543051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/50cd1245-4ac3-410e-ad5c-38df2947b265-klusterlet-config\") pod \"klusterlet-addon-workmgr-6d6779f76-mw5sb\" (UID: \"50cd1245-4ac3-410e-ad5c-38df2947b265\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" Apr 18 02:46:49.543251 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.543160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/50cd1245-4ac3-410e-ad5c-38df2947b265-tmp\") pod \"klusterlet-addon-workmgr-6d6779f76-mw5sb\" (UID: \"50cd1245-4ac3-410e-ad5c-38df2947b265\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" Apr 18 02:46:49.543251 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.543188 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zj5gq\" (UniqueName: \"kubernetes.io/projected/50cd1245-4ac3-410e-ad5c-38df2947b265-kube-api-access-zj5gq\") pod \"klusterlet-addon-workmgr-6d6779f76-mw5sb\" (UID: \"50cd1245-4ac3-410e-ad5c-38df2947b265\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" Apr 18 02:46:49.543620 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.543602 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50cd1245-4ac3-410e-ad5c-38df2947b265-tmp\") pod \"klusterlet-addon-workmgr-6d6779f76-mw5sb\" (UID: \"50cd1245-4ac3-410e-ad5c-38df2947b265\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" Apr 18 02:46:49.545615 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.545589 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/50cd1245-4ac3-410e-ad5c-38df2947b265-klusterlet-config\") pod \"klusterlet-addon-workmgr-6d6779f76-mw5sb\" (UID: \"50cd1245-4ac3-410e-ad5c-38df2947b265\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" Apr 18 02:46:49.550520 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.550494 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj5gq\" (UniqueName: \"kubernetes.io/projected/50cd1245-4ac3-410e-ad5c-38df2947b265-kube-api-access-zj5gq\") pod \"klusterlet-addon-workmgr-6d6779f76-mw5sb\" (UID: \"50cd1245-4ac3-410e-ad5c-38df2947b265\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" Apr 18 02:46:49.666966 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.666877 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" Apr 18 02:46:49.780938 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:49.780905 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb"] Apr 18 02:46:49.783992 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:46:49.783957 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50cd1245_4ac3_410e_ad5c_38df2947b265.slice/crio-e9fc2718684031362820256708c980777303ceee272db8e4e0b56c8ed72d2375 WatchSource:0}: Error finding container e9fc2718684031362820256708c980777303ceee272db8e4e0b56c8ed72d2375: Status 404 returned error can't find the container with id e9fc2718684031362820256708c980777303ceee272db8e4e0b56c8ed72d2375 Apr 18 02:46:50.324228 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:50.324198 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-kd89b" Apr 18 02:46:50.386008 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:50.385970 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" event={"ID":"50cd1245-4ac3-410e-ad5c-38df2947b265","Type":"ContainerStarted","Data":"e9fc2718684031362820256708c980777303ceee272db8e4e0b56c8ed72d2375"} Apr 18 02:46:54.397819 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:54.397779 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" 
event={"ID":"50cd1245-4ac3-410e-ad5c-38df2947b265","Type":"ContainerStarted","Data":"50db56e6c9b88606ee1125e9be30d6c2060a549fbcd4a932ff046cec20831a83"} Apr 18 02:46:54.398262 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:54.398072 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" Apr 18 02:46:54.399661 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:54.399621 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" Apr 18 02:46:54.413726 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:46:54.413686 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" podStartSLOduration=1.762582927 podStartE2EDuration="5.413674118s" podCreationTimestamp="2026-04-18 02:46:49 +0000 UTC" firstStartedPulling="2026-04-18 02:46:49.785750688 +0000 UTC m=+69.292212954" lastFinishedPulling="2026-04-18 02:46:53.43684189 +0000 UTC m=+72.943304145" observedRunningTime="2026-04-18 02:46:54.412872194 +0000 UTC m=+73.919334467" watchObservedRunningTime="2026-04-18 02:46:54.413674118 +0000 UTC m=+73.920136392" Apr 18 02:47:17.260867 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:47:17.260825 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk" Apr 18 02:47:17.260867 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:47:17.260874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls\") pod 
\"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:47:17.261315 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:47:17.260963 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 18 02:47:17.261315 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:47:17.260974 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f46fc6d96-m2499: secret "image-registry-tls" not found Apr 18 02:47:17.261315 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:47:17.260972 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 18 02:47:17.261315 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:47:17.261025 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls podName:6061e030-0942-41b4-b979-14994114dfae nodeName:}" failed. No retries permitted until 2026-04-18 02:48:21.261010244 +0000 UTC m=+160.767472497 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls") pod "image-registry-6f46fc6d96-m2499" (UID: "6061e030-0942-41b4-b979-14994114dfae") : secret "image-registry-tls" not found Apr 18 02:47:17.261315 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:47:17.261097 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls podName:70a58891-387a-4f51-a0bf-e2abf38cf891 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:21.261078355 +0000 UTC m=+160.767540611 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls") pod "dns-default-svnzk" (UID: "70a58891-387a-4f51-a0bf-e2abf38cf891") : secret "dns-default-metrics-tls" not found Apr 18 02:47:17.361607 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:47:17.361567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert\") pod \"ingress-canary-znc79\" (UID: \"6e67776e-f785-425c-9c6b-56b4c10fccaa\") " pod="openshift-ingress-canary/ingress-canary-znc79" Apr 18 02:47:17.361779 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:47:17.361619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bfhrs\" (UID: \"3c20ea1d-8a16-44d1-8cb9-0310ba00246c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs" Apr 18 02:47:17.361779 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:47:17.361713 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 18 02:47:17.361779 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:47:17.361771 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert podName:3c20ea1d-8a16-44d1-8cb9-0310ba00246c nodeName:}" failed. No retries permitted until 2026-04-18 02:48:21.361758242 +0000 UTC m=+160.868220494 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bfhrs" (UID: "3c20ea1d-8a16-44d1-8cb9-0310ba00246c") : secret "networking-console-plugin-cert" not found Apr 18 02:47:17.361885 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:47:17.361713 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 18 02:47:17.361885 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:47:17.361853 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert podName:6e67776e-f785-425c-9c6b-56b4c10fccaa nodeName:}" failed. No retries permitted until 2026-04-18 02:48:21.361837299 +0000 UTC m=+160.868299551 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert") pod "ingress-canary-znc79" (UID: "6e67776e-f785-425c-9c6b-56b4c10fccaa") : secret "canary-serving-cert" not found Apr 18 02:47:51.106418 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:47:51.106374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs\") pod \"network-metrics-daemon-l8m94\" (UID: \"06d32427-f8ec-4151-bb49-8eaef8308f79\") " pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:47:51.106896 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:47:51.106518 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 18 02:47:51.106896 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:47:51.106593 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs 
podName:06d32427-f8ec-4151-bb49-8eaef8308f79 nodeName:}" failed. No retries permitted until 2026-04-18 02:49:53.106578058 +0000 UTC m=+252.613040313 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs") pod "network-metrics-daemon-l8m94" (UID: "06d32427-f8ec-4151-bb49-8eaef8308f79") : secret "metrics-daemon-secret" not found Apr 18 02:48:16.366261 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:48:16.366213 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" podUID="6061e030-0942-41b4-b979-14994114dfae" Apr 18 02:48:16.384415 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:48:16.384389 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-svnzk" podUID="70a58891-387a-4f51-a0bf-e2abf38cf891" Apr 18 02:48:16.427761 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:48:16.427733 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs" podUID="3c20ea1d-8a16-44d1-8cb9-0310ba00246c" Apr 18 02:48:16.446148 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:48:16.446116 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-znc79" podUID="6e67776e-f785-425c-9c6b-56b4c10fccaa" Apr 18 02:48:16.549919 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:16.549888 2575 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-znc79" Apr 18 02:48:16.549919 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:16.549906 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:48:16.550122 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:16.549889 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-svnzk" Apr 18 02:48:16.550122 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:16.549888 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs" Apr 18 02:48:18.182806 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:48:18.182767 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-l8m94" podUID="06d32427-f8ec-4151-bb49-8eaef8308f79" Apr 18 02:48:18.779729 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:18.779699 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5nsl5_36d98916-52f7-469a-a004-8f19f9739057/dns-node-resolver/0.log" Apr 18 02:48:20.180436 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:20.180411 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kbc68_a03893d3-b1b4-4eac-93ff-24aa692c02ec/node-ca/0.log" Apr 18 02:48:21.331041 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:21.331013 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls\") pod \"image-registry-6f46fc6d96-m2499\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " 
pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:48:21.331423 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:21.331084 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk" Apr 18 02:48:21.331423 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:48:21.331156 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 18 02:48:21.331423 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:48:21.331180 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f46fc6d96-m2499: secret "image-registry-tls" not found Apr 18 02:48:21.331423 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:48:21.331237 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls podName:6061e030-0942-41b4-b979-14994114dfae nodeName:}" failed. No retries permitted until 2026-04-18 02:50:23.331221076 +0000 UTC m=+282.837683328 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls") pod "image-registry-6f46fc6d96-m2499" (UID: "6061e030-0942-41b4-b979-14994114dfae") : secret "image-registry-tls" not found Apr 18 02:48:21.331423 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:48:21.331163 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 18 02:48:21.331423 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:48:21.331291 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls podName:70a58891-387a-4f51-a0bf-e2abf38cf891 nodeName:}" failed. No retries permitted until 2026-04-18 02:50:23.331280806 +0000 UTC m=+282.837743057 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls") pod "dns-default-svnzk" (UID: "70a58891-387a-4f51-a0bf-e2abf38cf891") : secret "dns-default-metrics-tls" not found Apr 18 02:48:21.432017 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:21.431984 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bfhrs\" (UID: \"3c20ea1d-8a16-44d1-8cb9-0310ba00246c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs" Apr 18 02:48:21.432174 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:21.432057 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert\") pod \"ingress-canary-znc79\" (UID: \"6e67776e-f785-425c-9c6b-56b4c10fccaa\") " pod="openshift-ingress-canary/ingress-canary-znc79" Apr 18 
02:48:21.432174 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:48:21.432110 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 18 02:48:21.432174 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:48:21.432127 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 18 02:48:21.432174 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:48:21.432161 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert podName:3c20ea1d-8a16-44d1-8cb9-0310ba00246c nodeName:}" failed. No retries permitted until 2026-04-18 02:50:23.432148509 +0000 UTC m=+282.938610760 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bfhrs" (UID: "3c20ea1d-8a16-44d1-8cb9-0310ba00246c") : secret "networking-console-plugin-cert" not found Apr 18 02:48:21.432174 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:48:21.432175 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert podName:6e67776e-f785-425c-9c6b-56b4c10fccaa nodeName:}" failed. No retries permitted until 2026-04-18 02:50:23.432168279 +0000 UTC m=+282.938630531 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert") pod "ingress-canary-znc79" (UID: "6e67776e-f785-425c-9c6b-56b4c10fccaa") : secret "canary-serving-cert" not found Apr 18 02:48:23.127345 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.127303 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj"] Apr 18 02:48:23.131523 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.131499 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj" Apr 18 02:48:23.133837 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.133813 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 18 02:48:23.133990 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.133813 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 18 02:48:23.133990 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.133890 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-kfj77\"" Apr 18 02:48:23.134795 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.134780 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 18 02:48:23.134858 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.134830 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 18 02:48:23.137016 
ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.136992 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj"] Apr 18 02:48:23.226307 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.226271 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6"] Apr 18 02:48:23.229209 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.229188 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6" Apr 18 02:48:23.231429 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.231400 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-kj9wh\"" Apr 18 02:48:23.231541 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.231440 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 18 02:48:23.231748 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.231735 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 18 02:48:23.231815 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.231740 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 18 02:48:23.231863 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.231827 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 18 02:48:23.236048 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.236027 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6"] Apr 
18 02:48:23.242569 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.242547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b798241c-d3d5-4425-8f62-3533926b33a3-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-chlsj\" (UID: \"b798241c-d3d5-4425-8f62-3533926b33a3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj" Apr 18 02:48:23.242687 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.242657 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b798241c-d3d5-4425-8f62-3533926b33a3-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-chlsj\" (UID: \"b798241c-d3d5-4425-8f62-3533926b33a3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj" Apr 18 02:48:23.242687 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.242679 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gspr7\" (UniqueName: \"kubernetes.io/projected/b798241c-d3d5-4425-8f62-3533926b33a3-kube-api-access-gspr7\") pod \"kube-storage-version-migrator-operator-6769c5d45-chlsj\" (UID: \"b798241c-d3d5-4425-8f62-3533926b33a3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj" Apr 18 02:48:23.343470 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.343437 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzvf9\" (UniqueName: \"kubernetes.io/projected/8ec13592-85bc-4b5d-9c52-7666e8e83aad-kube-api-access-dzvf9\") pod \"service-ca-operator-d6fc45fc5-xk8d6\" (UID: \"8ec13592-85bc-4b5d-9c52-7666e8e83aad\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6" Apr 18 02:48:23.343621 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.343497 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ec13592-85bc-4b5d-9c52-7666e8e83aad-serving-cert\") pod \"service-ca-operator-d6fc45fc5-xk8d6\" (UID: \"8ec13592-85bc-4b5d-9c52-7666e8e83aad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6" Apr 18 02:48:23.343621 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.343549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b798241c-d3d5-4425-8f62-3533926b33a3-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-chlsj\" (UID: \"b798241c-d3d5-4425-8f62-3533926b33a3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj" Apr 18 02:48:23.343621 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.343602 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ec13592-85bc-4b5d-9c52-7666e8e83aad-config\") pod \"service-ca-operator-d6fc45fc5-xk8d6\" (UID: \"8ec13592-85bc-4b5d-9c52-7666e8e83aad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6" Apr 18 02:48:23.343745 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.343665 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b798241c-d3d5-4425-8f62-3533926b33a3-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-chlsj\" (UID: \"b798241c-d3d5-4425-8f62-3533926b33a3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj" Apr 18 02:48:23.343745 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:48:23.343686 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gspr7\" (UniqueName: \"kubernetes.io/projected/b798241c-d3d5-4425-8f62-3533926b33a3-kube-api-access-gspr7\") pod \"kube-storage-version-migrator-operator-6769c5d45-chlsj\" (UID: \"b798241c-d3d5-4425-8f62-3533926b33a3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj" Apr 18 02:48:23.344105 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.344088 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b798241c-d3d5-4425-8f62-3533926b33a3-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-chlsj\" (UID: \"b798241c-d3d5-4425-8f62-3533926b33a3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj" Apr 18 02:48:23.345906 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.345888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b798241c-d3d5-4425-8f62-3533926b33a3-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-chlsj\" (UID: \"b798241c-d3d5-4425-8f62-3533926b33a3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj" Apr 18 02:48:23.351046 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.351022 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gspr7\" (UniqueName: \"kubernetes.io/projected/b798241c-d3d5-4425-8f62-3533926b33a3-kube-api-access-gspr7\") pod \"kube-storage-version-migrator-operator-6769c5d45-chlsj\" (UID: \"b798241c-d3d5-4425-8f62-3533926b33a3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj" Apr 18 02:48:23.440571 ip-10-0-140-103 kubenswrapper[2575]: 
I0418 02:48:23.440499 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj" Apr 18 02:48:23.444302 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.444284 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzvf9\" (UniqueName: \"kubernetes.io/projected/8ec13592-85bc-4b5d-9c52-7666e8e83aad-kube-api-access-dzvf9\") pod \"service-ca-operator-d6fc45fc5-xk8d6\" (UID: \"8ec13592-85bc-4b5d-9c52-7666e8e83aad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6" Apr 18 02:48:23.444388 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.444344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ec13592-85bc-4b5d-9c52-7666e8e83aad-serving-cert\") pod \"service-ca-operator-d6fc45fc5-xk8d6\" (UID: \"8ec13592-85bc-4b5d-9c52-7666e8e83aad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6" Apr 18 02:48:23.444446 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.444418 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ec13592-85bc-4b5d-9c52-7666e8e83aad-config\") pod \"service-ca-operator-d6fc45fc5-xk8d6\" (UID: \"8ec13592-85bc-4b5d-9c52-7666e8e83aad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6" Apr 18 02:48:23.445051 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.445027 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ec13592-85bc-4b5d-9c52-7666e8e83aad-config\") pod \"service-ca-operator-d6fc45fc5-xk8d6\" (UID: \"8ec13592-85bc-4b5d-9c52-7666e8e83aad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6" Apr 18 02:48:23.446388 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:48:23.446360 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ec13592-85bc-4b5d-9c52-7666e8e83aad-serving-cert\") pod \"service-ca-operator-d6fc45fc5-xk8d6\" (UID: \"8ec13592-85bc-4b5d-9c52-7666e8e83aad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6" Apr 18 02:48:23.451430 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.451402 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzvf9\" (UniqueName: \"kubernetes.io/projected/8ec13592-85bc-4b5d-9c52-7666e8e83aad-kube-api-access-dzvf9\") pod \"service-ca-operator-d6fc45fc5-xk8d6\" (UID: \"8ec13592-85bc-4b5d-9c52-7666e8e83aad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6" Apr 18 02:48:23.538138 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.538109 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6" Apr 18 02:48:23.552005 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.551978 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj"] Apr 18 02:48:23.554785 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:48:23.554761 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb798241c_d3d5_4425_8f62_3533926b33a3.slice/crio-6e9289dcd03926d05ea1f923d858a88865d139db6502002dd0f20e8bdf775327 WatchSource:0}: Error finding container 6e9289dcd03926d05ea1f923d858a88865d139db6502002dd0f20e8bdf775327: Status 404 returned error can't find the container with id 6e9289dcd03926d05ea1f923d858a88865d139db6502002dd0f20e8bdf775327 Apr 18 02:48:23.562689 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.562663 2575 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj" event={"ID":"b798241c-d3d5-4425-8f62-3533926b33a3","Type":"ContainerStarted","Data":"6e9289dcd03926d05ea1f923d858a88865d139db6502002dd0f20e8bdf775327"} Apr 18 02:48:23.648110 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:23.648079 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6"] Apr 18 02:48:23.650659 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:48:23.650615 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ec13592_85bc_4b5d_9c52_7666e8e83aad.slice/crio-24b24e455625a8d669c325858cd28b625eb6810f86d7de55b970bc49708a9755 WatchSource:0}: Error finding container 24b24e455625a8d669c325858cd28b625eb6810f86d7de55b970bc49708a9755: Status 404 returned error can't find the container with id 24b24e455625a8d669c325858cd28b625eb6810f86d7de55b970bc49708a9755 Apr 18 02:48:24.567799 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:24.567763 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6" event={"ID":"8ec13592-85bc-4b5d-9c52-7666e8e83aad","Type":"ContainerStarted","Data":"24b24e455625a8d669c325858cd28b625eb6810f86d7de55b970bc49708a9755"} Apr 18 02:48:26.574346 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:26.574302 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj" event={"ID":"b798241c-d3d5-4425-8f62-3533926b33a3","Type":"ContainerStarted","Data":"0c6087d54cfd0d43f5357c2c4ce8dca07d3e52cea395eb4bf935c6705bacf821"} Apr 18 02:48:26.575668 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:26.575626 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6" event={"ID":"8ec13592-85bc-4b5d-9c52-7666e8e83aad","Type":"ContainerStarted","Data":"d8f77c2fe8208808bb2018d24de84c3d668befbea4aeeafa42497a8858409eaa"} Apr 18 02:48:26.589564 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:26.589511 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj" podStartSLOduration=1.171061427 podStartE2EDuration="3.589494719s" podCreationTimestamp="2026-04-18 02:48:23 +0000 UTC" firstStartedPulling="2026-04-18 02:48:23.556962835 +0000 UTC m=+163.063425091" lastFinishedPulling="2026-04-18 02:48:25.975396131 +0000 UTC m=+165.481858383" observedRunningTime="2026-04-18 02:48:26.587950979 +0000 UTC m=+166.094413255" watchObservedRunningTime="2026-04-18 02:48:26.589494719 +0000 UTC m=+166.095956995" Apr 18 02:48:26.601314 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:26.601268 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6" podStartSLOduration=1.27984086 podStartE2EDuration="3.601251616s" podCreationTimestamp="2026-04-18 02:48:23 +0000 UTC" firstStartedPulling="2026-04-18 02:48:23.65253213 +0000 UTC m=+163.158994386" lastFinishedPulling="2026-04-18 02:48:25.973942886 +0000 UTC m=+165.480405142" observedRunningTime="2026-04-18 02:48:26.600734271 +0000 UTC m=+166.107196543" watchObservedRunningTime="2026-04-18 02:48:26.601251616 +0000 UTC m=+166.107713891" Apr 18 02:48:27.481695 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:27.481661 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-9bz6g"] Apr 18 02:48:27.484948 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:27.484928 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9bz6g" Apr 18 02:48:27.487265 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:27.487244 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-drdnr\"" Apr 18 02:48:27.487366 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:27.487317 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 18 02:48:27.488109 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:27.488091 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 18 02:48:27.491256 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:27.491237 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-9bz6g"] Apr 18 02:48:27.582127 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:27.582096 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjxbl\" (UniqueName: \"kubernetes.io/projected/a240b4fc-e94f-406b-a350-71efd88049dc-kube-api-access-fjxbl\") pod \"migrator-74bb7799d9-9bz6g\" (UID: \"a240b4fc-e94f-406b-a350-71efd88049dc\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9bz6g" Apr 18 02:48:27.682802 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:27.682768 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjxbl\" (UniqueName: \"kubernetes.io/projected/a240b4fc-e94f-406b-a350-71efd88049dc-kube-api-access-fjxbl\") pod \"migrator-74bb7799d9-9bz6g\" (UID: \"a240b4fc-e94f-406b-a350-71efd88049dc\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9bz6g" Apr 18 02:48:27.690354 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:48:27.690330 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjxbl\" (UniqueName: \"kubernetes.io/projected/a240b4fc-e94f-406b-a350-71efd88049dc-kube-api-access-fjxbl\") pod \"migrator-74bb7799d9-9bz6g\" (UID: \"a240b4fc-e94f-406b-a350-71efd88049dc\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9bz6g" Apr 18 02:48:27.793926 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:27.793890 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9bz6g" Apr 18 02:48:27.907473 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:27.907437 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-9bz6g"] Apr 18 02:48:27.910647 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:48:27.910608 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda240b4fc_e94f_406b_a350_71efd88049dc.slice/crio-31175bfa92e05929238439e00413aa9abb3f0b16ff84a8eecc14407c183f35b4 WatchSource:0}: Error finding container 31175bfa92e05929238439e00413aa9abb3f0b16ff84a8eecc14407c183f35b4: Status 404 returned error can't find the container with id 31175bfa92e05929238439e00413aa9abb3f0b16ff84a8eecc14407c183f35b4 Apr 18 02:48:28.582697 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:28.582658 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9bz6g" event={"ID":"a240b4fc-e94f-406b-a350-71efd88049dc","Type":"ContainerStarted","Data":"31175bfa92e05929238439e00413aa9abb3f0b16ff84a8eecc14407c183f35b4"} Apr 18 02:48:29.588391 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:29.588355 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9bz6g" 
event={"ID":"a240b4fc-e94f-406b-a350-71efd88049dc","Type":"ContainerStarted","Data":"f35c2b8abef4527100f0cb828de37bcd947dcf8f81e3ce6f33ead9fb8bbc4af5"} Apr 18 02:48:29.588391 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:29.588392 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9bz6g" event={"ID":"a240b4fc-e94f-406b-a350-71efd88049dc","Type":"ContainerStarted","Data":"41547b00fb4d68a04de6cb0815411f5de9341e62b093735b1f11cdd1372c4fc1"} Apr 18 02:48:29.605702 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:29.605650 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9bz6g" podStartSLOduration=1.646279989 podStartE2EDuration="2.605610673s" podCreationTimestamp="2026-04-18 02:48:27 +0000 UTC" firstStartedPulling="2026-04-18 02:48:27.912458592 +0000 UTC m=+167.418920855" lastFinishedPulling="2026-04-18 02:48:28.871789284 +0000 UTC m=+168.378251539" observedRunningTime="2026-04-18 02:48:29.60489636 +0000 UTC m=+169.111358634" watchObservedRunningTime="2026-04-18 02:48:29.605610673 +0000 UTC m=+169.112072950" Apr 18 02:48:31.172626 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:31.172541 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:48:52.377757 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.377718 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-qct7q"] Apr 18 02:48:52.382800 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.382772 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qct7q" Apr 18 02:48:52.385968 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.385940 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 18 02:48:52.386902 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.386884 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-d27jw\"" Apr 18 02:48:52.387178 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.387165 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 18 02:48:52.387238 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.387221 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 18 02:48:52.387294 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.387279 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 18 02:48:52.392174 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.392154 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qct7q"] Apr 18 02:48:52.420395 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.420369 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6f46fc6d96-m2499"] Apr 18 02:48:52.420594 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:48:52.420575 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" podUID="6061e030-0942-41b4-b979-14994114dfae" Apr 18 02:48:52.468874 ip-10-0-140-103 kubenswrapper[2575]: I0418 
02:48:52.468838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qct7q\" (UID: \"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae\") " pod="openshift-insights/insights-runtime-extractor-qct7q" Apr 18 02:48:52.469012 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.468947 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae-crio-socket\") pod \"insights-runtime-extractor-qct7q\" (UID: \"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae\") " pod="openshift-insights/insights-runtime-extractor-qct7q" Apr 18 02:48:52.469012 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.468989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae-data-volume\") pod \"insights-runtime-extractor-qct7q\" (UID: \"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae\") " pod="openshift-insights/insights-runtime-extractor-qct7q" Apr 18 02:48:52.469012 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.469008 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qct7q\" (UID: \"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae\") " pod="openshift-insights/insights-runtime-extractor-qct7q" Apr 18 02:48:52.469112 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.469025 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68m5p\" (UniqueName: 
\"kubernetes.io/projected/e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae-kube-api-access-68m5p\") pod \"insights-runtime-extractor-qct7q\" (UID: \"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae\") " pod="openshift-insights/insights-runtime-extractor-qct7q" Apr 18 02:48:52.569475 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.569441 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5cb9d77ddb-7p2zw"] Apr 18 02:48:52.569680 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.569524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae-crio-socket\") pod \"insights-runtime-extractor-qct7q\" (UID: \"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae\") " pod="openshift-insights/insights-runtime-extractor-qct7q" Apr 18 02:48:52.569680 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.569578 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae-data-volume\") pod \"insights-runtime-extractor-qct7q\" (UID: \"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae\") " pod="openshift-insights/insights-runtime-extractor-qct7q" Apr 18 02:48:52.569680 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.569605 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qct7q\" (UID: \"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae\") " pod="openshift-insights/insights-runtime-extractor-qct7q" Apr 18 02:48:52.569680 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.569652 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68m5p\" (UniqueName: \"kubernetes.io/projected/e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae-kube-api-access-68m5p\") pod 
\"insights-runtime-extractor-qct7q\" (UID: \"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae\") " pod="openshift-insights/insights-runtime-extractor-qct7q" Apr 18 02:48:52.569680 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.569664 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae-crio-socket\") pod \"insights-runtime-extractor-qct7q\" (UID: \"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae\") " pod="openshift-insights/insights-runtime-extractor-qct7q" Apr 18 02:48:52.569935 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.569782 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qct7q\" (UID: \"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae\") " pod="openshift-insights/insights-runtime-extractor-qct7q" Apr 18 02:48:52.570060 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.570039 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae-data-volume\") pod \"insights-runtime-extractor-qct7q\" (UID: \"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae\") " pod="openshift-insights/insights-runtime-extractor-qct7q" Apr 18 02:48:52.570277 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.570258 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qct7q\" (UID: \"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae\") " pod="openshift-insights/insights-runtime-extractor-qct7q" Apr 18 02:48:52.572083 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.572049 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qct7q\" (UID: \"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae\") " pod="openshift-insights/insights-runtime-extractor-qct7q" Apr 18 02:48:52.572944 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.572930 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:48:52.582111 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.582080 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cb9d77ddb-7p2zw"] Apr 18 02:48:52.585450 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.585427 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68m5p\" (UniqueName: \"kubernetes.io/projected/e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae-kube-api-access-68m5p\") pod \"insights-runtime-extractor-qct7q\" (UID: \"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae\") " pod="openshift-insights/insights-runtime-extractor-qct7q" Apr 18 02:48:52.640153 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.640074 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:48:52.644442 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.644423 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6f46fc6d96-m2499" Apr 18 02:48:52.670615 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.670588 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6061e030-0942-41b4-b979-14994114dfae-ca-trust-extracted\") pod \"6061e030-0942-41b4-b979-14994114dfae\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " Apr 18 02:48:52.670737 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.670655 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6061e030-0942-41b4-b979-14994114dfae-registry-certificates\") pod \"6061e030-0942-41b4-b979-14994114dfae\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " Apr 18 02:48:52.670737 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.670677 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxgdw\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-kube-api-access-nxgdw\") pod \"6061e030-0942-41b4-b979-14994114dfae\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " Apr 18 02:48:52.670737 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.670697 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-bound-sa-token\") pod \"6061e030-0942-41b4-b979-14994114dfae\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " Apr 18 02:48:52.670737 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.670719 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6061e030-0942-41b4-b979-14994114dfae-installation-pull-secrets\") pod \"6061e030-0942-41b4-b979-14994114dfae\" (UID: 
\"6061e030-0942-41b4-b979-14994114dfae\") " Apr 18 02:48:52.670922 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.670763 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6061e030-0942-41b4-b979-14994114dfae-trusted-ca\") pod \"6061e030-0942-41b4-b979-14994114dfae\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " Apr 18 02:48:52.670922 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.670795 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6061e030-0942-41b4-b979-14994114dfae-image-registry-private-configuration\") pod \"6061e030-0942-41b4-b979-14994114dfae\" (UID: \"6061e030-0942-41b4-b979-14994114dfae\") " Apr 18 02:48:52.670922 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.670875 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6061e030-0942-41b4-b979-14994114dfae-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6061e030-0942-41b4-b979-14994114dfae" (UID: "6061e030-0942-41b4-b979-14994114dfae"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 18 02:48:52.670922 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.670897 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e60772a-9186-4b5d-a11e-dd851e220d6f-trusted-ca\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:48:52.671110 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.670922 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e60772a-9186-4b5d-a11e-dd851e220d6f-installation-pull-secrets\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:48:52.671110 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.670956 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e60772a-9186-4b5d-a11e-dd851e220d6f-registry-tls\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:48:52.671110 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.671061 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e60772a-9186-4b5d-a11e-dd851e220d6f-registry-certificates\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:48:52.671110 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.671099 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4e60772a-9186-4b5d-a11e-dd851e220d6f-image-registry-private-configuration\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:48:52.671306 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.671129 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x49cs\" (UniqueName: \"kubernetes.io/projected/4e60772a-9186-4b5d-a11e-dd851e220d6f-kube-api-access-x49cs\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:48:52.671306 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.671161 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e60772a-9186-4b5d-a11e-dd851e220d6f-ca-trust-extracted\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:48:52.671306 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.671193 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e60772a-9186-4b5d-a11e-dd851e220d6f-bound-sa-token\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:48:52.671306 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.671219 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6061e030-0942-41b4-b979-14994114dfae-registry-certificates" 
(OuterVolumeSpecName: "registry-certificates") pod "6061e030-0942-41b4-b979-14994114dfae" (UID: "6061e030-0942-41b4-b979-14994114dfae"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:48:52.671306 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.671252 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6061e030-0942-41b4-b979-14994114dfae-ca-trust-extracted\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\"" Apr 18 02:48:52.671306 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.671280 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6061e030-0942-41b4-b979-14994114dfae-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6061e030-0942-41b4-b979-14994114dfae" (UID: "6061e030-0942-41b4-b979-14994114dfae"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:48:52.673088 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.673058 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6061e030-0942-41b4-b979-14994114dfae" (UID: "6061e030-0942-41b4-b979-14994114dfae"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:48:52.673088 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.673070 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6061e030-0942-41b4-b979-14994114dfae-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6061e030-0942-41b4-b979-14994114dfae" (UID: "6061e030-0942-41b4-b979-14994114dfae"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 18 02:48:52.673207 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.673114 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-kube-api-access-nxgdw" (OuterVolumeSpecName: "kube-api-access-nxgdw") pod "6061e030-0942-41b4-b979-14994114dfae" (UID: "6061e030-0942-41b4-b979-14994114dfae"). InnerVolumeSpecName "kube-api-access-nxgdw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:48:52.673207 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.673122 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6061e030-0942-41b4-b979-14994114dfae-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "6061e030-0942-41b4-b979-14994114dfae" (UID: "6061e030-0942-41b4-b979-14994114dfae"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 18 02:48:52.691492 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.691473 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qct7q" Apr 18 02:48:52.772147 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.772109 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e60772a-9186-4b5d-a11e-dd851e220d6f-registry-certificates\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:48:52.772289 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.772177 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4e60772a-9186-4b5d-a11e-dd851e220d6f-image-registry-private-configuration\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:48:52.772289 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.772209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x49cs\" (UniqueName: \"kubernetes.io/projected/4e60772a-9186-4b5d-a11e-dd851e220d6f-kube-api-access-x49cs\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:48:52.772289 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.772258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e60772a-9186-4b5d-a11e-dd851e220d6f-ca-trust-extracted\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:48:52.772440 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.772290 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e60772a-9186-4b5d-a11e-dd851e220d6f-bound-sa-token\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:48:52.772440 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.772354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e60772a-9186-4b5d-a11e-dd851e220d6f-trusted-ca\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:48:52.772440 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.772381 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e60772a-9186-4b5d-a11e-dd851e220d6f-installation-pull-secrets\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:48:52.772440 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.772412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e60772a-9186-4b5d-a11e-dd851e220d6f-registry-tls\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:48:52.772668 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.772529 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6061e030-0942-41b4-b979-14994114dfae-registry-certificates\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\"" Apr 18 02:48:52.772668 ip-10-0-140-103 kubenswrapper[2575]: 
I0418 02:48:52.772557 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nxgdw\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-kube-api-access-nxgdw\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\"" Apr 18 02:48:52.772668 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.772571 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-bound-sa-token\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\"" Apr 18 02:48:52.772668 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.772585 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6061e030-0942-41b4-b979-14994114dfae-installation-pull-secrets\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\"" Apr 18 02:48:52.772668 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.772599 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6061e030-0942-41b4-b979-14994114dfae-trusted-ca\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\"" Apr 18 02:48:52.772668 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.772614 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6061e030-0942-41b4-b979-14994114dfae-image-registry-private-configuration\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\"" Apr 18 02:48:52.772997 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.772969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e60772a-9186-4b5d-a11e-dd851e220d6f-ca-trust-extracted\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 
18 02:48:52.773086 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.773014 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e60772a-9186-4b5d-a11e-dd851e220d6f-registry-certificates\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw"
Apr 18 02:48:52.773691 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.773664 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e60772a-9186-4b5d-a11e-dd851e220d6f-trusted-ca\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw"
Apr 18 02:48:52.774971 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.774935 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e60772a-9186-4b5d-a11e-dd851e220d6f-registry-tls\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw"
Apr 18 02:48:52.775428 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.775405 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e60772a-9186-4b5d-a11e-dd851e220d6f-installation-pull-secrets\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw"
Apr 18 02:48:52.775473 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.775447 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4e60772a-9186-4b5d-a11e-dd851e220d6f-image-registry-private-configuration\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw"
Apr 18 02:48:52.780609 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.780590 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e60772a-9186-4b5d-a11e-dd851e220d6f-bound-sa-token\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw"
Apr 18 02:48:52.780860 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.780843 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x49cs\" (UniqueName: \"kubernetes.io/projected/4e60772a-9186-4b5d-a11e-dd851e220d6f-kube-api-access-x49cs\") pod \"image-registry-5cb9d77ddb-7p2zw\" (UID: \"4e60772a-9186-4b5d-a11e-dd851e220d6f\") " pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw"
Apr 18 02:48:52.806593 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.806570 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qct7q"]
Apr 18 02:48:52.809922 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:48:52.809899 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode31d5b95_e8ca_43c7_8ab9_bfe113cc6eae.slice/crio-f4cc0d588b2a01ed1d390b7fbe7cb3c86522e8168e60723990759487d69e0554 WatchSource:0}: Error finding container f4cc0d588b2a01ed1d390b7fbe7cb3c86522e8168e60723990759487d69e0554: Status 404 returned error can't find the container with id f4cc0d588b2a01ed1d390b7fbe7cb3c86522e8168e60723990759487d69e0554
Apr 18 02:48:52.887457 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.887424 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jxh9b\""
Apr 18 02:48:52.896359 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:52.896293 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw"
Apr 18 02:48:53.007297 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:53.007261 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cb9d77ddb-7p2zw"]
Apr 18 02:48:53.009750 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:48:53.009728 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e60772a_9186_4b5d_a11e_dd851e220d6f.slice/crio-ec9680843fc96a0225412129b8267681d9413c68a0bd66f537625f305be3f720 WatchSource:0}: Error finding container ec9680843fc96a0225412129b8267681d9413c68a0bd66f537625f305be3f720: Status 404 returned error can't find the container with id ec9680843fc96a0225412129b8267681d9413c68a0bd66f537625f305be3f720
Apr 18 02:48:53.643622 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:53.643585 2575 generic.go:358] "Generic (PLEG): container finished" podID="50cd1245-4ac3-410e-ad5c-38df2947b265" containerID="50db56e6c9b88606ee1125e9be30d6c2060a549fbcd4a932ff046cec20831a83" exitCode=1
Apr 18 02:48:53.644049 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:53.643675 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" event={"ID":"50cd1245-4ac3-410e-ad5c-38df2947b265","Type":"ContainerDied","Data":"50db56e6c9b88606ee1125e9be30d6c2060a549fbcd4a932ff046cec20831a83"}
Apr 18 02:48:53.644115 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:53.644045 2575 scope.go:117] "RemoveContainer" containerID="50db56e6c9b88606ee1125e9be30d6c2060a549fbcd4a932ff046cec20831a83"
Apr 18 02:48:53.645211 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:53.645177 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" event={"ID":"4e60772a-9186-4b5d-a11e-dd851e220d6f","Type":"ContainerStarted","Data":"219af333f773c167e1957fae5c11c01f2de0e3287d00f34b93d2d68b0b40ed52"}
Apr 18 02:48:53.645340 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:53.645221 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" event={"ID":"4e60772a-9186-4b5d-a11e-dd851e220d6f","Type":"ContainerStarted","Data":"ec9680843fc96a0225412129b8267681d9413c68a0bd66f537625f305be3f720"}
Apr 18 02:48:53.645340 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:53.645298 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw"
Apr 18 02:48:53.646827 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:53.646802 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qct7q" event={"ID":"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae","Type":"ContainerStarted","Data":"c2f4e8c1f3775c2a67a8bae6a38effdf95ac29281ad7c8fa91cd4bd25994ac62"}
Apr 18 02:48:53.646827 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:53.646830 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qct7q" event={"ID":"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae","Type":"ContainerStarted","Data":"5e9eaa38317a77bb533e64ab33660ec4dd1af3d77829acd59e04dc6465cb8be5"}
Apr 18 02:48:53.646993 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:53.646840 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qct7q" event={"ID":"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae","Type":"ContainerStarted","Data":"f4cc0d588b2a01ed1d390b7fbe7cb3c86522e8168e60723990759487d69e0554"}
Apr 18 02:48:53.646993 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:53.646850 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f46fc6d96-m2499"
Apr 18 02:48:53.684452 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:53.684432 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6f46fc6d96-m2499"]
Apr 18 02:48:53.689509 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:53.689490 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6f46fc6d96-m2499"]
Apr 18 02:48:53.707246 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:53.707163 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" podStartSLOduration=1.707148139 podStartE2EDuration="1.707148139s" podCreationTimestamp="2026-04-18 02:48:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:48:53.706756339 +0000 UTC m=+193.213218620" watchObservedRunningTime="2026-04-18 02:48:53.707148139 +0000 UTC m=+193.213610412"
Apr 18 02:48:53.782259 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:53.782228 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6061e030-0942-41b4-b979-14994114dfae-registry-tls\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:48:54.399051 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:54.399017 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb"
Apr 18 02:48:54.651310 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:54.651225 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb" event={"ID":"50cd1245-4ac3-410e-ad5c-38df2947b265","Type":"ContainerStarted","Data":"560a5e11fe7c7346177d83c452a55c9cc28111053c862f03d4dab672a323ef47"}
Apr 18 02:48:54.651772 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:54.651476 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb"
Apr 18 02:48:54.652100 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:54.652082 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d6779f76-mw5sb"
Apr 18 02:48:55.174350 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:55.174321 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6061e030-0942-41b4-b979-14994114dfae" path="/var/lib/kubelet/pods/6061e030-0942-41b4-b979-14994114dfae/volumes"
Apr 18 02:48:55.655620 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:55.655581 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qct7q" event={"ID":"e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae","Type":"ContainerStarted","Data":"deb80e5c221534a20bfeef66f7227864794cfb7bf78877e9aa13886325f95e6b"}
Apr 18 02:48:55.672281 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:55.672242 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-qct7q" podStartSLOduration=1.8263581279999999 podStartE2EDuration="3.672227595s" podCreationTimestamp="2026-04-18 02:48:52 +0000 UTC" firstStartedPulling="2026-04-18 02:48:52.864296825 +0000 UTC m=+192.370759093" lastFinishedPulling="2026-04-18 02:48:54.710166293 +0000 UTC m=+194.216628560" observedRunningTime="2026-04-18 02:48:55.671452043 +0000 UTC m=+195.177914317" watchObservedRunningTime="2026-04-18 02:48:55.672227595 +0000 UTC m=+195.178689851"
Apr 18 02:48:58.469769 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.469733 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"]
Apr 18 02:48:58.474293 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.474270 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.477541 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.477514 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 18 02:48:58.477685 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.477545 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 18 02:48:58.477685 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.477560 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 18 02:48:58.477685 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.477607 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 18 02:48:58.477685 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.477654 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 18 02:48:58.477685 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.477662 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 18 02:48:58.477685 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.477514 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-7vjrd\""
Apr 18 02:48:58.483622 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.483603 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nqk4b"]
Apr 18 02:48:58.486788 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.486770 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"]
Apr 18 02:48:58.486893 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.486882 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.488918 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.488898 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 18 02:48:58.488994 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.488938 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 18 02:48:58.489605 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.489588 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 18 02:48:58.489708 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.489671 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pzdgw\""
Apr 18 02:48:58.520313 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.520280 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-wtmp\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.520313 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.520317 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.520568 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.520384 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.520568 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.520444 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6w5k\" (UniqueName: \"kubernetes.io/projected/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-kube-api-access-c6w5k\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.520568 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.520485 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e821f22e-c172-40b8-95f5-11d1e334faae-root\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.520568 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.520506 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k5j6\" (UniqueName: \"kubernetes.io/projected/e821f22e-c172-40b8-95f5-11d1e334faae-kube-api-access-9k5j6\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.520568 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.520537 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-tls\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.520849 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.520602 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.520849 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.520654 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.520849 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.520688 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e821f22e-c172-40b8-95f5-11d1e334faae-metrics-client-ca\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.520849 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.520715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e821f22e-c172-40b8-95f5-11d1e334faae-sys\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.520849 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.520744 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.520849 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.520772 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-textfile\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.520849 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.520800 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.520849 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.520831 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-accelerators-collector-config\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.621440 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.621408 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e821f22e-c172-40b8-95f5-11d1e334faae-metrics-client-ca\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.621440 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.621444 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e821f22e-c172-40b8-95f5-11d1e334faae-sys\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.621659 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.621464 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.621659 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.621480 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-textfile\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.621659 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.621537 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e821f22e-c172-40b8-95f5-11d1e334faae-sys\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.621659 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.621597 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.621659 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.621643 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-accelerators-collector-config\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.621922 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.621709 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-wtmp\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.621922 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.621740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.621922 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.621807 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.621922 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.621813 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-textfile\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.621922 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.621853 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6w5k\" (UniqueName: \"kubernetes.io/projected/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-kube-api-access-c6w5k\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.621922 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.621899 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e821f22e-c172-40b8-95f5-11d1e334faae-root\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.621922 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.621898 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.622223 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.621961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e821f22e-c172-40b8-95f5-11d1e334faae-root\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.622223 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.622083 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9k5j6\" (UniqueName: \"kubernetes.io/projected/e821f22e-c172-40b8-95f5-11d1e334faae-kube-api-access-9k5j6\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.622223 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.622115 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e821f22e-c172-40b8-95f5-11d1e334faae-metrics-client-ca\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.622223 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.622119 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-tls\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.622223 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.622173 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.622223 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:48:58.622183 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 18 02:48:58.622223 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.622220 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.622544 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.622227 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-accelerators-collector-config\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.622544 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:48:58.622249 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-tls podName:e821f22e-c172-40b8-95f5-11d1e334faae nodeName:}" failed. No retries permitted until 2026-04-18 02:48:59.122229855 +0000 UTC m=+198.628692126 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-tls") pod "node-exporter-nqk4b" (UID: "e821f22e-c172-40b8-95f5-11d1e334faae") : secret "node-exporter-tls" not found
Apr 18 02:48:58.622544 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.622395 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-wtmp\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.622911 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.622891 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.622911 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.622896 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.624138 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.624108 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.624442 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.624423 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.624482 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.624459 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.632805 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.632782 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6w5k\" (UniqueName: \"kubernetes.io/projected/0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26-kube-api-access-c6w5k\") pod \"kube-state-metrics-69db897b98-7ltnz\" (UID: \"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.632883 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.632788 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k5j6\" (UniqueName: \"kubernetes.io/projected/e821f22e-c172-40b8-95f5-11d1e334faae-kube-api-access-9k5j6\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:58.784496 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.784460 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"
Apr 18 02:48:58.897320 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:58.897288 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7ltnz"]
Apr 18 02:48:58.900450 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:48:58.900422 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0338ba2e_ceb9_4c4d_a133_5f3eb8a5ba26.slice/crio-91fc7fa9894f89bf6c3c57d9d8b1b03a01b82ecf17a8240d0a555a3233c3b790 WatchSource:0}: Error finding container 91fc7fa9894f89bf6c3c57d9d8b1b03a01b82ecf17a8240d0a555a3233c3b790: Status 404 returned error can't find the container with id 91fc7fa9894f89bf6c3c57d9d8b1b03a01b82ecf17a8240d0a555a3233c3b790
Apr 18 02:48:59.125761 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:59.125667 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-tls\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:59.127939 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:59.127916 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e821f22e-c172-40b8-95f5-11d1e334faae-node-exporter-tls\") pod \"node-exporter-nqk4b\" (UID: \"e821f22e-c172-40b8-95f5-11d1e334faae\") " pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:59.396074 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:59.395992 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nqk4b"
Apr 18 02:48:59.404043 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:48:59.404018 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode821f22e_c172_40b8_95f5_11d1e334faae.slice/crio-058be589da25e854383a9d05f94ac8718a3bbb2233b52cbea633274de4035cab WatchSource:0}: Error finding container 058be589da25e854383a9d05f94ac8718a3bbb2233b52cbea633274de4035cab: Status 404 returned error can't find the container with id 058be589da25e854383a9d05f94ac8718a3bbb2233b52cbea633274de4035cab
Apr 18 02:48:59.666386 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:59.666296 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nqk4b" event={"ID":"e821f22e-c172-40b8-95f5-11d1e334faae","Type":"ContainerStarted","Data":"058be589da25e854383a9d05f94ac8718a3bbb2233b52cbea633274de4035cab"}
Apr 18 02:48:59.667539 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:48:59.667502 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz" event={"ID":"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26","Type":"ContainerStarted","Data":"91fc7fa9894f89bf6c3c57d9d8b1b03a01b82ecf17a8240d0a555a3233c3b790"}
Apr 18 02:49:00.671237 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:00.671157 2575 generic.go:358] "Generic (PLEG): container finished" podID="e821f22e-c172-40b8-95f5-11d1e334faae" containerID="2d710b5aec62248d02b701909418fa6203b17ee3aee27e5b763ef749ada29814" exitCode=0
Apr 18 02:49:00.671653 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:00.671248 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nqk4b" event={"ID":"e821f22e-c172-40b8-95f5-11d1e334faae","Type":"ContainerDied","Data":"2d710b5aec62248d02b701909418fa6203b17ee3aee27e5b763ef749ada29814"}
Apr 18 02:49:00.677725 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:00.677699 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz" event={"ID":"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26","Type":"ContainerStarted","Data":"9699478f16fafed0cb71603a8d559fa765854e9852225614f26a22b84ea74926"}
Apr 18 02:49:00.677828 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:00.677729 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz" event={"ID":"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26","Type":"ContainerStarted","Data":"42982bd28403cd8104b3021118e01e8e53d5e50ec162cf79d3fe949afdf40e2d"}
Apr 18 02:49:00.677828 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:00.677740 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz" event={"ID":"0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26","Type":"ContainerStarted","Data":"7e1011ee60098f55887b9194543db15aa39fc9abb799598a25119d30ac45721f"}
Apr 18 02:49:00.702652 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:00.702588 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-7ltnz" podStartSLOduration=1.299108431 podStartE2EDuration="2.702575113s" podCreationTimestamp="2026-04-18 02:48:58 +0000 UTC" firstStartedPulling="2026-04-18 02:48:58.902271374 +0000 UTC m=+198.408733625" lastFinishedPulling="2026-04-18 02:49:00.305738042 +0000 UTC m=+199.812200307" observedRunningTime="2026-04-18 02:49:00.702111429 +0000 UTC m=+200.208573713" watchObservedRunningTime="2026-04-18 02:49:00.702575113 +0000 UTC m=+200.209037386"
Apr 18 02:49:01.682019 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:01.681978 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nqk4b" event={"ID":"e821f22e-c172-40b8-95f5-11d1e334faae","Type":"ContainerStarted","Data":"a119a31bc5e2ba1e4d4e4afd9fb6d9aed275af6586e9e51ecbeba966c48ebe57"}
Apr 18
02:49:01.682019 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:01.682023 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nqk4b" event={"ID":"e821f22e-c172-40b8-95f5-11d1e334faae","Type":"ContainerStarted","Data":"d0a4e16bfe94fdf783a08fe019e32408c6947b39d923f25aa0524367c8eb09bc"} Apr 18 02:49:01.699609 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:01.699562 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-nqk4b" podStartSLOduration=2.797918163 podStartE2EDuration="3.699547285s" podCreationTimestamp="2026-04-18 02:48:58 +0000 UTC" firstStartedPulling="2026-04-18 02:48:59.405534089 +0000 UTC m=+198.911996340" lastFinishedPulling="2026-04-18 02:49:00.307163196 +0000 UTC m=+199.813625462" observedRunningTime="2026-04-18 02:49:01.698285961 +0000 UTC m=+201.204748235" watchObservedRunningTime="2026-04-18 02:49:01.699547285 +0000 UTC m=+201.206009558" Apr 18 02:49:03.225002 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:03.224969 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-65gtb"] Apr 18 02:49:03.228144 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:03.228127 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65gtb" Apr 18 02:49:03.230184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:03.230161 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-njpm9\"" Apr 18 02:49:03.230306 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:03.230286 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 18 02:49:03.234077 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:03.234058 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-65gtb"] Apr 18 02:49:03.361059 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:03.361016 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/daf6871f-e9b0-4fbf-a098-7d7d62a2bf28-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-65gtb\" (UID: \"daf6871f-e9b0-4fbf-a098-7d7d62a2bf28\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65gtb" Apr 18 02:49:03.462553 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:03.462510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/daf6871f-e9b0-4fbf-a098-7d7d62a2bf28-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-65gtb\" (UID: \"daf6871f-e9b0-4fbf-a098-7d7d62a2bf28\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65gtb" Apr 18 02:49:03.462743 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:49:03.462724 2575 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 18 02:49:03.462856 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:49:03.462842 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/daf6871f-e9b0-4fbf-a098-7d7d62a2bf28-monitoring-plugin-cert podName:daf6871f-e9b0-4fbf-a098-7d7d62a2bf28 nodeName:}" failed. No retries permitted until 2026-04-18 02:49:03.962814187 +0000 UTC m=+203.469276453 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/daf6871f-e9b0-4fbf-a098-7d7d62a2bf28-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-65gtb" (UID: "daf6871f-e9b0-4fbf-a098-7d7d62a2bf28") : secret "monitoring-plugin-cert" not found Apr 18 02:49:03.966788 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:03.966753 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/daf6871f-e9b0-4fbf-a098-7d7d62a2bf28-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-65gtb\" (UID: \"daf6871f-e9b0-4fbf-a098-7d7d62a2bf28\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65gtb" Apr 18 02:49:03.969163 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:03.969127 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/daf6871f-e9b0-4fbf-a098-7d7d62a2bf28-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-65gtb\" (UID: \"daf6871f-e9b0-4fbf-a098-7d7d62a2bf28\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65gtb" Apr 18 02:49:04.137410 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.137382 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65gtb" Apr 18 02:49:04.246847 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.246777 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-65gtb"] Apr 18 02:49:04.249944 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:49:04.249915 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaf6871f_e9b0_4fbf_a098_7d7d62a2bf28.slice/crio-7dd9d25980bf23ed42227fbb17befa3fc91916d673fff302c5aefaa3dece3db5 WatchSource:0}: Error finding container 7dd9d25980bf23ed42227fbb17befa3fc91916d673fff302c5aefaa3dece3db5: Status 404 returned error can't find the container with id 7dd9d25980bf23ed42227fbb17befa3fc91916d673fff302c5aefaa3dece3db5 Apr 18 02:49:04.684709 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.684679 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 18 02:49:04.689374 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.689351 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.691827 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.691753 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-swkgn\"" Apr 18 02:49:04.691827 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.691786 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 18 02:49:04.691827 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.691796 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 18 02:49:04.691827 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.691786 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65gtb" event={"ID":"daf6871f-e9b0-4fbf-a098-7d7d62a2bf28","Type":"ContainerStarted","Data":"7dd9d25980bf23ed42227fbb17befa3fc91916d673fff302c5aefaa3dece3db5"} Apr 18 02:49:04.691827 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.691796 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 18 02:49:04.692124 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.691895 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 18 02:49:04.692162 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.692149 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 18 02:49:04.692617 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.692589 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 18 02:49:04.692617 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:49:04.692589 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 18 02:49:04.692771 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.692614 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 18 02:49:04.692771 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.692723 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 18 02:49:04.693001 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.692985 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 18 02:49:04.693034 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.693009 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bjapjtdg1ard9\"" Apr 18 02:49:04.693183 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.693168 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 18 02:49:04.693247 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.693183 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 18 02:49:04.694803 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.694784 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 18 02:49:04.706164 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.706140 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 18 02:49:04.774492 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.774458 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.774680 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.774515 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.774680 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.774580 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-web-config\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.774680 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.774623 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.774680 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.774666 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.774845 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.774690 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.774845 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.774713 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.774845 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.774762 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-config\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.774845 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.774797 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.774845 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.774825 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnfhs\" (UniqueName: 
\"kubernetes.io/projected/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-kube-api-access-vnfhs\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.775036 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.774862 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.775036 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.774896 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.775036 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.774920 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.775036 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.774973 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-config-out\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.775036 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:49:04.775032 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.775212 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.775065 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.775212 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.775100 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.775212 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.775129 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.875956 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.875913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.875956 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.875960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.876191 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.876117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-config-out\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.876191 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.876166 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.876298 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.876203 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.876298 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.876228 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.876298 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.876257 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.876452 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.876342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.876452 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.876398 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.876452 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.876432 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-web-config\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.876601 ip-10-0-140-103 kubenswrapper[2575]: I0418 
02:49:04.876466 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.876601 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.876494 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.876601 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.876531 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.876601 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.876568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.876601 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.876597 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-config\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 18 02:49:04.876954 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.876625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.876954 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.876676 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnfhs\" (UniqueName: \"kubernetes.io/projected/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-kube-api-access-vnfhs\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.876954 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.876723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.877105 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.877073 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.877711 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.877686 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.877820 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.877776 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.878079 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.878055 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.878288 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.878266 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.880144 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.880009 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.880690 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.880217 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-config-out\") pod 
\"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.880690 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.880407 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.880814 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.880747 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.881224 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.881194 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.881389 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.881357 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.881480 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.881403 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.881600 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.881581 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.881843 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.881820 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-config\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.881921 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.881854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-web-config\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.882211 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.882193 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.882565 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.882548 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.885884 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.885864 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnfhs\" (UniqueName: \"kubernetes.io/projected/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-kube-api-access-vnfhs\") pod \"prometheus-k8s-0\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:04.999540 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:04.999450 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:05.129895 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:05.129872 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 18 02:49:05.132687 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:49:05.132654 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ea22cbf_4a31_459d_9ed7_bd889e3f7832.slice/crio-784f065af235cfa5fae1f6db4ab65a1bc774ba81a16a70024d6a305f1dbe9b77 WatchSource:0}: Error finding container 784f065af235cfa5fae1f6db4ab65a1bc774ba81a16a70024d6a305f1dbe9b77: Status 404 returned error can't find the container with id 784f065af235cfa5fae1f6db4ab65a1bc774ba81a16a70024d6a305f1dbe9b77 Apr 18 02:49:05.703403 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:05.703366 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ea22cbf-4a31-459d-9ed7-bd889e3f7832","Type":"ContainerStarted","Data":"784f065af235cfa5fae1f6db4ab65a1bc774ba81a16a70024d6a305f1dbe9b77"} Apr 18 02:49:05.704998 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:05.704970 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65gtb" event={"ID":"daf6871f-e9b0-4fbf-a098-7d7d62a2bf28","Type":"ContainerStarted","Data":"4c666c5669212eabebf2d75df01083cfaf121a9e7c33d204b99a3a73cd198368"} Apr 18 02:49:05.705260 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:05.705227 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65gtb" Apr 18 02:49:05.710841 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:05.710822 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65gtb" Apr 18 02:49:05.720413 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:05.720368 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-65gtb" podStartSLOduration=1.350937956 podStartE2EDuration="2.720357061s" podCreationTimestamp="2026-04-18 02:49:03 +0000 UTC" firstStartedPulling="2026-04-18 02:49:04.251790825 +0000 UTC m=+203.758253090" lastFinishedPulling="2026-04-18 02:49:05.621209942 +0000 UTC m=+205.127672195" observedRunningTime="2026-04-18 02:49:05.719256034 +0000 UTC m=+205.225718309" watchObservedRunningTime="2026-04-18 02:49:05.720357061 +0000 UTC m=+205.226819335" Apr 18 02:49:06.709936 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:06.709900 2575 generic.go:358] "Generic (PLEG): container finished" podID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerID="c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7" exitCode=0 Apr 18 02:49:06.710377 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:06.709988 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ea22cbf-4a31-459d-9ed7-bd889e3f7832","Type":"ContainerDied","Data":"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7"} Apr 18 02:49:09.719577 ip-10-0-140-103 kubenswrapper[2575]: I0418 
02:49:09.719543 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ea22cbf-4a31-459d-9ed7-bd889e3f7832","Type":"ContainerStarted","Data":"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39"} Apr 18 02:49:09.719577 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:09.719578 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ea22cbf-4a31-459d-9ed7-bd889e3f7832","Type":"ContainerStarted","Data":"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf"} Apr 18 02:49:11.728242 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:11.728198 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ea22cbf-4a31-459d-9ed7-bd889e3f7832","Type":"ContainerStarted","Data":"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8"} Apr 18 02:49:11.728242 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:11.728240 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ea22cbf-4a31-459d-9ed7-bd889e3f7832","Type":"ContainerStarted","Data":"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02"} Apr 18 02:49:11.728714 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:11.728254 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ea22cbf-4a31-459d-9ed7-bd889e3f7832","Type":"ContainerStarted","Data":"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4"} Apr 18 02:49:11.728714 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:11.728265 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ea22cbf-4a31-459d-9ed7-bd889e3f7832","Type":"ContainerStarted","Data":"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a"} Apr 18 02:49:11.755341 ip-10-0-140-103 kubenswrapper[2575]: I0418 
02:49:11.755287 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.865828692 podStartE2EDuration="7.755272671s" podCreationTimestamp="2026-04-18 02:49:04 +0000 UTC" firstStartedPulling="2026-04-18 02:49:05.134973998 +0000 UTC m=+204.641436251" lastFinishedPulling="2026-04-18 02:49:11.024417978 +0000 UTC m=+210.530880230" observedRunningTime="2026-04-18 02:49:11.75372527 +0000 UTC m=+211.260187545" watchObservedRunningTime="2026-04-18 02:49:11.755272671 +0000 UTC m=+211.261734945" Apr 18 02:49:14.655463 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:14.655434 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5cb9d77ddb-7p2zw" Apr 18 02:49:14.999979 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:14.999894 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:49:36.793284 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:36.793249 2575 generic.go:358] "Generic (PLEG): container finished" podID="b798241c-d3d5-4425-8f62-3533926b33a3" containerID="0c6087d54cfd0d43f5357c2c4ce8dca07d3e52cea395eb4bf935c6705bacf821" exitCode=0 Apr 18 02:49:36.793705 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:36.793323 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj" event={"ID":"b798241c-d3d5-4425-8f62-3533926b33a3","Type":"ContainerDied","Data":"0c6087d54cfd0d43f5357c2c4ce8dca07d3e52cea395eb4bf935c6705bacf821"} Apr 18 02:49:36.793705 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:36.793650 2575 scope.go:117] "RemoveContainer" containerID="0c6087d54cfd0d43f5357c2c4ce8dca07d3e52cea395eb4bf935c6705bacf821" Apr 18 02:49:37.797287 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:37.797253 2575 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-chlsj" event={"ID":"b798241c-d3d5-4425-8f62-3533926b33a3","Type":"ContainerStarted","Data":"3be6ff6016d2f013f33120b6384e38451cb36b53b5acfe054b90d63b284e800e"} Apr 18 02:49:41.809281 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:41.809250 2575 generic.go:358] "Generic (PLEG): container finished" podID="8ec13592-85bc-4b5d-9c52-7666e8e83aad" containerID="d8f77c2fe8208808bb2018d24de84c3d668befbea4aeeafa42497a8858409eaa" exitCode=0 Apr 18 02:49:41.809744 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:41.809311 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6" event={"ID":"8ec13592-85bc-4b5d-9c52-7666e8e83aad","Type":"ContainerDied","Data":"d8f77c2fe8208808bb2018d24de84c3d668befbea4aeeafa42497a8858409eaa"} Apr 18 02:49:41.809744 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:41.809611 2575 scope.go:117] "RemoveContainer" containerID="d8f77c2fe8208808bb2018d24de84c3d668befbea4aeeafa42497a8858409eaa" Apr 18 02:49:42.813755 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:42.813721 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xk8d6" event={"ID":"8ec13592-85bc-4b5d-9c52-7666e8e83aad","Type":"ContainerStarted","Data":"af611514a0851bf6a7ec309a22b0c983a39b408aac17dc3bb6cfc4ba905cb135"} Apr 18 02:49:53.200574 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:53.200536 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs\") pod \"network-metrics-daemon-l8m94\" (UID: \"06d32427-f8ec-4151-bb49-8eaef8308f79\") " pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:49:53.202894 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:53.202870 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06d32427-f8ec-4151-bb49-8eaef8308f79-metrics-certs\") pod \"network-metrics-daemon-l8m94\" (UID: \"06d32427-f8ec-4151-bb49-8eaef8308f79\") " pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:49:53.375828 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:53.375796 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h2hff\"" Apr 18 02:49:53.384679 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:53.384661 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8m94" Apr 18 02:49:53.497414 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:53.497354 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l8m94"] Apr 18 02:49:53.499878 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:49:53.499850 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06d32427_f8ec_4151_bb49_8eaef8308f79.slice/crio-b554afdd268e9db031254643a1d54e598638e800f3d79c74781281a8b05f6ba0 WatchSource:0}: Error finding container b554afdd268e9db031254643a1d54e598638e800f3d79c74781281a8b05f6ba0: Status 404 returned error can't find the container with id b554afdd268e9db031254643a1d54e598638e800f3d79c74781281a8b05f6ba0 Apr 18 02:49:53.848248 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:53.848212 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l8m94" event={"ID":"06d32427-f8ec-4151-bb49-8eaef8308f79","Type":"ContainerStarted","Data":"b554afdd268e9db031254643a1d54e598638e800f3d79c74781281a8b05f6ba0"} Apr 18 02:49:54.852126 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:54.852099 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l8m94" 
event={"ID":"06d32427-f8ec-4151-bb49-8eaef8308f79","Type":"ContainerStarted","Data":"c34177f2ea4ffb000262a74a438be94e8e001434861095978de12bf931b086e1"} Apr 18 02:49:55.855934 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:55.855899 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l8m94" event={"ID":"06d32427-f8ec-4151-bb49-8eaef8308f79","Type":"ContainerStarted","Data":"dc36d16e11d4281aaf9230188940258940f600a8bf9f1dfde0d916bf981786be"} Apr 18 02:49:55.870453 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:49:55.870397 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-l8m94" podStartSLOduration=253.713323234 podStartE2EDuration="4m14.870380763s" podCreationTimestamp="2026-04-18 02:45:41 +0000 UTC" firstStartedPulling="2026-04-18 02:49:53.501679272 +0000 UTC m=+253.008141524" lastFinishedPulling="2026-04-18 02:49:54.658736798 +0000 UTC m=+254.165199053" observedRunningTime="2026-04-18 02:49:55.869492376 +0000 UTC m=+255.375954649" watchObservedRunningTime="2026-04-18 02:49:55.870380763 +0000 UTC m=+255.376843037" Apr 18 02:50:04.999728 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:04.999620 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:05.019006 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:05.018979 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:05.900130 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:05.900096 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:19.550457 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:50:19.550410 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-ingress-canary/ingress-canary-znc79" podUID="6e67776e-f785-425c-9c6b-56b4c10fccaa" Apr 18 02:50:19.550457 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:50:19.550410 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs" podUID="3c20ea1d-8a16-44d1-8cb9-0310ba00246c" Apr 18 02:50:19.550927 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:50:19.550540 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-svnzk" podUID="70a58891-387a-4f51-a0bf-e2abf38cf891" Apr 18 02:50:19.923397 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:19.923306 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-znc79" Apr 18 02:50:19.923563 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:19.923446 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs" Apr 18 02:50:19.923650 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:19.923567 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-svnzk" Apr 18 02:50:23.024691 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.024650 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 18 02:50:23.025267 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.025087 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="prometheus" containerID="cri-o://0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf" gracePeriod=600 Apr 18 02:50:23.025267 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.025132 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="thanos-sidecar" containerID="cri-o://715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a" gracePeriod=600 Apr 18 02:50:23.025267 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.025141 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="config-reloader" containerID="cri-o://260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39" gracePeriod=600 Apr 18 02:50:23.025267 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.025168 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="kube-rbac-proxy-thanos" containerID="cri-o://6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8" gracePeriod=600 Apr 18 02:50:23.025267 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.025247 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="kube-rbac-proxy" containerID="cri-o://872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02" gracePeriod=600 Apr 18 02:50:23.025555 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.025135 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="kube-rbac-proxy-web" containerID="cri-o://75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4" gracePeriod=600 Apr 18 02:50:23.262167 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.262144 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:23.353157 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.353075 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-kubelet-serving-ca-bundle\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 02:50:23.353157 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.353107 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-web-config\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 02:50:23.353157 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.353133 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-k8s-rulefiles-0\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 02:50:23.353157 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:50:23.353158 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-k8s-db\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 02:50:23.353454 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.353191 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-thanos-prometheus-http-client-file\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 02:50:23.353454 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.353305 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 02:50:23.353454 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.353352 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-serving-certs-ca-bundle\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 02:50:23.353454 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.353382 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-trusted-ca-bundle\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 
02:50:23.353454 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.353422 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-kube-rbac-proxy\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 02:50:23.353724 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.353472 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 02:50:23.354069 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.354035 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:50:23.354259 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.354229 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:50:23.354481 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.354461 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-metrics-client-ca\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 02:50:23.354656 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.354620 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnfhs\" (UniqueName: \"kubernetes.io/projected/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-kube-api-access-vnfhs\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 02:50:23.354794 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.354777 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-config-out\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 02:50:23.354918 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.354904 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-config\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 02:50:23.355109 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.355095 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-metrics-client-certs\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 02:50:23.355328 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:50:23.355195 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-tls-assets\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 02:50:23.355427 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.355416 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-grpc-tls\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 02:50:23.355522 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.355509 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-tls\") pod \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\" (UID: \"2ea22cbf-4a31-459d-9ed7-bd889e3f7832\") " Apr 18 02:50:23.356357 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.354902 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 18 02:50:23.356452 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.355292 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). 
InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 18 02:50:23.356452 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.356322 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 18 02:50:23.356821 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.356799 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk"
Apr 18 02:50:23.357792 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.357549 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 18 02:50:23.357792 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.357590 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 18 02:50:23.358414 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.358386 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-kube-api-access-vnfhs" (OuterVolumeSpecName: "kube-api-access-vnfhs") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). InnerVolumeSpecName "kube-api-access-vnfhs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 18 02:50:23.358498 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.358453 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 18 02:50:23.359473 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.359449 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.359565 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.359480 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.359565 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.359495 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-k8s-db\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.359565 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.359509 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.359565 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.359529 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-configmap-metrics-client-ca\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.361815 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.361789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70a58891-387a-4f51-a0bf-e2abf38cf891-metrics-tls\") pod \"dns-default-svnzk\" (UID: \"70a58891-387a-4f51-a0bf-e2abf38cf891\") " pod="openshift-dns/dns-default-svnzk"
Apr 18 02:50:23.362614 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.362591 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 18 02:50:23.362926 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.362901 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 18 02:50:23.363011 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.362906 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 18 02:50:23.363100 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.363003 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 18 02:50:23.363538 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.363507 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-config" (OuterVolumeSpecName: "config") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 18 02:50:23.363538 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.363515 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-config-out" (OuterVolumeSpecName: "config-out") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:50:23.363854 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.363831 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 18 02:50:23.364294 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.364278 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 18 02:50:23.369961 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.369942 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-web-config" (OuterVolumeSpecName: "web-config") pod "2ea22cbf-4a31-459d-9ed7-bd889e3f7832" (UID: "2ea22cbf-4a31-459d-9ed7-bd889e3f7832"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 18 02:50:23.460384 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.460349 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert\") pod \"ingress-canary-znc79\" (UID: \"6e67776e-f785-425c-9c6b-56b4c10fccaa\") " pod="openshift-ingress-canary/ingress-canary-znc79"
Apr 18 02:50:23.460547 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.460417 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bfhrs\" (UID: \"3c20ea1d-8a16-44d1-8cb9-0310ba00246c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs"
Apr 18 02:50:23.460547 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.460450 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.460547 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.460462 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vnfhs\" (UniqueName: \"kubernetes.io/projected/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-kube-api-access-vnfhs\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.460547 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.460472 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-config-out\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.460547 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.460482 2575 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-config\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.460547 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.460490 2575 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-metrics-client-certs\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.460547 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.460498 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-tls-assets\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.460547 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.460507 2575 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-grpc-tls\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.460547 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.460515 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-tls\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.460547 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.460523 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-web-config\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.460547 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.460532 2575 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-thanos-prometheus-http-client-file\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.460547 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.460540 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.460547 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.460549 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-prometheus-trusted-ca-bundle\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.461118 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.460558 2575 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2ea22cbf-4a31-459d-9ed7-bd889e3f7832-secret-kube-rbac-proxy\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:50:23.462787 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.462761 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3c20ea1d-8a16-44d1-8cb9-0310ba00246c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bfhrs\" (UID: \"3c20ea1d-8a16-44d1-8cb9-0310ba00246c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs"
Apr 18 02:50:23.462877 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.462796 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e67776e-f785-425c-9c6b-56b4c10fccaa-cert\") pod \"ingress-canary-znc79\" (UID: \"6e67776e-f785-425c-9c6b-56b4c10fccaa\") " pod="openshift-ingress-canary/ingress-canary-znc79"
Apr 18 02:50:23.527367 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.527335 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dwvkt\""
Apr 18 02:50:23.527508 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.527371 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-djfrv\""
Apr 18 02:50:23.527508 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.527476 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-xds92\""
Apr 18 02:50:23.535589 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.535562 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-svnzk"
Apr 18 02:50:23.535589 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.535585 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs"
Apr 18 02:50:23.535709 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.535577 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-znc79"
Apr 18 02:50:23.668173 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.668142 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-svnzk"]
Apr 18 02:50:23.671796 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:50:23.671769 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70a58891_387a_4f51_a0bf_e2abf38cf891.slice/crio-d2c62cf84cda3841cfd8a8f2e22e6e7c28ead1b3ae9117ae874d3093a72e97dd WatchSource:0}: Error finding container d2c62cf84cda3841cfd8a8f2e22e6e7c28ead1b3ae9117ae874d3093a72e97dd: Status 404 returned error can't find the container with id d2c62cf84cda3841cfd8a8f2e22e6e7c28ead1b3ae9117ae874d3093a72e97dd
Apr 18 02:50:23.895418 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.895339 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs"]
Apr 18 02:50:23.896506 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.896479 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-znc79"]
Apr 18 02:50:23.897928 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:50:23.897901 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c20ea1d_8a16_44d1_8cb9_0310ba00246c.slice/crio-19715386f039e93854236083d1215bec0fc837262ccbdbf0b16da4b619e9c019 WatchSource:0}: Error finding container 19715386f039e93854236083d1215bec0fc837262ccbdbf0b16da4b619e9c019: Status 404 returned error can't find the container with id 19715386f039e93854236083d1215bec0fc837262ccbdbf0b16da4b619e9c019
Apr 18 02:50:23.898665 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:50:23.898650 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e67776e_f785_425c_9c6b_56b4c10fccaa.slice/crio-062eb67745fe800a197f75f46b6e1259eba22838a16d28149230eca4087f5084 WatchSource:0}: Error finding container 062eb67745fe800a197f75f46b6e1259eba22838a16d28149230eca4087f5084: Status 404 returned error can't find the container with id 062eb67745fe800a197f75f46b6e1259eba22838a16d28149230eca4087f5084
Apr 18 02:50:23.936100 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.936069 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-znc79" event={"ID":"6e67776e-f785-425c-9c6b-56b4c10fccaa","Type":"ContainerStarted","Data":"062eb67745fe800a197f75f46b6e1259eba22838a16d28149230eca4087f5084"}
Apr 18 02:50:23.937052 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.937031 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-svnzk" event={"ID":"70a58891-387a-4f51-a0bf-e2abf38cf891","Type":"ContainerStarted","Data":"d2c62cf84cda3841cfd8a8f2e22e6e7c28ead1b3ae9117ae874d3093a72e97dd"}
Apr 18 02:50:23.937897 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.937875 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs" event={"ID":"3c20ea1d-8a16-44d1-8cb9-0310ba00246c","Type":"ContainerStarted","Data":"19715386f039e93854236083d1215bec0fc837262ccbdbf0b16da4b619e9c019"}
Apr 18 02:50:23.940266 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.940245 2575 generic.go:358] "Generic (PLEG): container finished" podID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerID="6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8" exitCode=0
Apr 18 02:50:23.940266 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.940266 2575 generic.go:358] "Generic (PLEG): container finished" podID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerID="872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02" exitCode=0
Apr 18 02:50:23.940369 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.940273 2575 generic.go:358] "Generic (PLEG): container finished" podID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerID="75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4" exitCode=0
Apr 18 02:50:23.940369 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.940280 2575 generic.go:358] "Generic (PLEG): container finished" podID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerID="715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a" exitCode=0
Apr 18 02:50:23.940369 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.940285 2575 generic.go:358] "Generic (PLEG): container finished" podID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerID="260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39" exitCode=0
Apr 18 02:50:23.940369 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.940290 2575 generic.go:358] "Generic (PLEG): container finished" podID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerID="0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf" exitCode=0
Apr 18 02:50:23.940369 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.940316 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ea22cbf-4a31-459d-9ed7-bd889e3f7832","Type":"ContainerDied","Data":"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8"}
Apr 18 02:50:23.940369 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.940347 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ea22cbf-4a31-459d-9ed7-bd889e3f7832","Type":"ContainerDied","Data":"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02"}
Apr 18 02:50:23.940369 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.940358 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ea22cbf-4a31-459d-9ed7-bd889e3f7832","Type":"ContainerDied","Data":"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4"}
Apr 18 02:50:23.940369 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.940368 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ea22cbf-4a31-459d-9ed7-bd889e3f7832","Type":"ContainerDied","Data":"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a"}
Apr 18 02:50:23.940623 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.940371 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 18 02:50:23.940623 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.940378 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ea22cbf-4a31-459d-9ed7-bd889e3f7832","Type":"ContainerDied","Data":"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39"}
Apr 18 02:50:23.940623 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.940388 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ea22cbf-4a31-459d-9ed7-bd889e3f7832","Type":"ContainerDied","Data":"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf"}
Apr 18 02:50:23.940623 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.940398 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ea22cbf-4a31-459d-9ed7-bd889e3f7832","Type":"ContainerDied","Data":"784f065af235cfa5fae1f6db4ab65a1bc774ba81a16a70024d6a305f1dbe9b77"}
Apr 18 02:50:23.940623 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.940413 2575 scope.go:117] "RemoveContainer" containerID="6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8"
Apr 18 02:50:23.948431 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.948408 2575 scope.go:117] "RemoveContainer" containerID="872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02"
Apr 18 02:50:23.954713 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.954696 2575 scope.go:117] "RemoveContainer" containerID="75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4"
Apr 18 02:50:23.961737 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.961670 2575 scope.go:117] "RemoveContainer" containerID="715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a"
Apr 18 02:50:23.961841 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.961825 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 18 02:50:23.965576 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.965557 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 18 02:50:23.968241 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.968225 2575 scope.go:117] "RemoveContainer" containerID="260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39"
Apr 18 02:50:23.974611 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.974589 2575 scope.go:117] "RemoveContainer" containerID="0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf"
Apr 18 02:50:23.981758 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.981740 2575 scope.go:117] "RemoveContainer" containerID="c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7"
Apr 18 02:50:23.987856 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.987841 2575 scope.go:117] "RemoveContainer" containerID="6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8"
Apr 18 02:50:23.988104 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:50:23.988087 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8\": container with ID starting with 6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8 not found: ID does not exist" containerID="6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8"
Apr 18 02:50:23.988146 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.988112 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8"} err="failed to get container status \"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8\": rpc error: code = NotFound desc = could not find container \"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8\": container with ID starting with 6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8 not found: ID does not exist"
Apr 18 02:50:23.988146 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.988140 2575 scope.go:117] "RemoveContainer" containerID="872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02"
Apr 18 02:50:23.988351 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:50:23.988336 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02\": container with ID starting with 872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02 not found: ID does not exist" containerID="872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02"
Apr 18 02:50:23.988389 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.988355 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02"} err="failed to get container status \"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02\": rpc error: code = NotFound desc = could not find container \"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02\": container with ID starting with 872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02 not found: ID does not exist"
Apr 18 02:50:23.988389 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.988369 2575 scope.go:117] "RemoveContainer" containerID="75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4"
Apr 18 02:50:23.988548 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:50:23.988533 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4\": container with ID starting with 75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4 not found: ID does not exist" containerID="75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4"
Apr 18 02:50:23.988594 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.988551 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4"} err="failed to get container status \"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4\": rpc error: code = NotFound desc = could not find container \"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4\": container with ID starting with 75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4 not found: ID does not exist"
Apr 18 02:50:23.988594 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.988563 2575 scope.go:117] "RemoveContainer" containerID="715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a"
Apr 18 02:50:23.988815 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:50:23.988744 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a\": container with ID starting with 715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a not found: ID does not exist" containerID="715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a"
Apr 18 02:50:23.988815 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.988768 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a"} err="failed to get container status \"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a\": rpc error: code = NotFound desc = could not find container \"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a\": container with ID starting with 715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a not found: ID does not exist"
Apr 18 02:50:23.988815 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.988788 2575 scope.go:117] "RemoveContainer" containerID="260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39"
Apr 18 02:50:23.989099 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:50:23.989077 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39\": container with ID starting with 260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39 not found: ID does not exist" containerID="260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39"
Apr 18 02:50:23.989198 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.989104 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39"} err="failed to get container status \"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39\": rpc error: code = NotFound desc = could not find container \"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39\": container with ID starting with 260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39 not found: ID does not exist"
Apr 18 02:50:23.989198 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.989124 2575 scope.go:117] "RemoveContainer" containerID="0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf"
Apr 18 02:50:23.989620 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:50:23.989594 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf\": container with ID starting with 0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf not found: ID does not exist" containerID="0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf"
Apr 18 02:50:23.989721 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.989626 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf"} err="failed to get container status \"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf\": rpc error: code = NotFound desc = could not find container \"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf\": container with ID starting with 0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf not found: ID does not exist"
Apr 18 02:50:23.989721 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.989694 2575 scope.go:117] "RemoveContainer" containerID="c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7"
Apr 18 02:50:23.989994 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:50:23.989976 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7\": container with ID starting with c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7 not found: ID does not exist" containerID="c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7"
Apr 18 02:50:23.990067 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.989997 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7"} err="failed to get container status \"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7\": rpc error: code = NotFound desc = could not find container \"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7\": container with ID starting with c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7 not found: ID does not exist"
Apr 18 02:50:23.990067 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.990010 2575 scope.go:117] "RemoveContainer" containerID="6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8"
Apr 18 02:50:23.990256 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.990240 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8"} err="failed to get container status \"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8\": rpc error: code = NotFound desc = could not find container \"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8\": container with ID starting with 6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8 not found: ID does not exist"
Apr 18 02:50:23.990297 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.990257 2575 scope.go:117] "RemoveContainer" containerID="872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02"
Apr 18 02:50:23.990486 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.990463 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02"} err="failed to get container status \"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02\": rpc error: code = NotFound desc = could not find container \"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02\": container with ID starting with 872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02 not found: ID does not exist"
Apr 18 02:50:23.990486 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.990487 2575 scope.go:117] "RemoveContainer" containerID="75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4"
Apr 18 02:50:23.990794 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.990773 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4"} err="failed to get container status \"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4\": rpc error: code = NotFound desc = could not find container \"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4\": container with ID starting with 75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4 not found: ID does not exist"
Apr 18 02:50:23.990794 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.990794 2575 scope.go:117] "RemoveContainer" containerID="715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a"
Apr 18 02:50:23.991026 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991009 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a"} err="failed to get container status \"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a\": rpc error: code = NotFound desc = could not find container \"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a\": container with ID starting with 715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a not found: ID does not exist"
Apr 18 02:50:23.991081 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991028 2575 scope.go:117] "RemoveContainer" containerID="260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39"
Apr 18 02:50:23.991174 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991148 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 18 02:50:23.991276 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991255 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39"} err="failed to get container status \"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39\": rpc error: code = NotFound desc = could not find container \"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39\": container with ID starting with 260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39 not found: ID does not exist"
Apr 18 02:50:23.991326 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991277 2575 scope.go:117] "RemoveContainer" containerID="0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf"
Apr 18 02:50:23.991493 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991475 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf"} err="failed to get container status \"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf\": rpc error: code = NotFound desc = could not find container \"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf\": container with ID starting with 0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf not found: ID does not exist"
Apr 18 02:50:23.991552 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991495 2575 scope.go:117] "RemoveContainer" containerID="c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7"
Apr 18 02:50:23.991552 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991534 2575 cpu_manager.go:401]
"RemoveStaleState: containerMap: removing container" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="init-config-reloader" Apr 18 02:50:23.991552 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991549 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="init-config-reloader" Apr 18 02:50:23.991721 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991562 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="kube-rbac-proxy" Apr 18 02:50:23.991721 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991571 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="kube-rbac-proxy" Apr 18 02:50:23.991721 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991581 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="kube-rbac-proxy-thanos" Apr 18 02:50:23.991721 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991590 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="kube-rbac-proxy-thanos" Apr 18 02:50:23.991721 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991603 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="thanos-sidecar" Apr 18 02:50:23.991721 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991611 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="thanos-sidecar" Apr 18 02:50:23.991721 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991624 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="kube-rbac-proxy-web" Apr 18 02:50:23.991721 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:50:23.991652 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="kube-rbac-proxy-web" Apr 18 02:50:23.991721 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991664 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="config-reloader" Apr 18 02:50:23.991721 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991672 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="config-reloader" Apr 18 02:50:23.991721 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991687 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="prometheus" Apr 18 02:50:23.991721 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991695 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="prometheus" Apr 18 02:50:23.991721 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991704 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7"} err="failed to get container status \"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7\": rpc error: code = NotFound desc = could not find container \"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7\": container with ID starting with c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7 not found: ID does not exist" Apr 18 02:50:23.991721 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991717 2575 scope.go:117] "RemoveContainer" containerID="6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8" Apr 18 02:50:23.992203 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991752 2575 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="kube-rbac-proxy" Apr 18 02:50:23.992203 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991764 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="kube-rbac-proxy-thanos" Apr 18 02:50:23.992203 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991774 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="prometheus" Apr 18 02:50:23.992203 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991783 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="config-reloader" Apr 18 02:50:23.992203 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991792 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="kube-rbac-proxy-web" Apr 18 02:50:23.992203 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991802 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" containerName="thanos-sidecar" Apr 18 02:50:23.992203 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991903 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8"} err="failed to get container status \"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8\": rpc error: code = NotFound desc = could not find container \"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8\": container with ID starting with 6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8 not found: ID does not exist" Apr 18 02:50:23.992203 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.991920 2575 scope.go:117] "RemoveContainer" 
containerID="872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02" Apr 18 02:50:23.992203 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.992152 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02"} err="failed to get container status \"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02\": rpc error: code = NotFound desc = could not find container \"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02\": container with ID starting with 872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02 not found: ID does not exist" Apr 18 02:50:23.992203 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.992167 2575 scope.go:117] "RemoveContainer" containerID="75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4" Apr 18 02:50:23.992539 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.992345 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4"} err="failed to get container status \"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4\": rpc error: code = NotFound desc = could not find container \"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4\": container with ID starting with 75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4 not found: ID does not exist" Apr 18 02:50:23.992539 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.992362 2575 scope.go:117] "RemoveContainer" containerID="715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a" Apr 18 02:50:23.992626 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.992537 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a"} err="failed to get container status 
\"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a\": rpc error: code = NotFound desc = could not find container \"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a\": container with ID starting with 715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a not found: ID does not exist" Apr 18 02:50:23.992626 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.992549 2575 scope.go:117] "RemoveContainer" containerID="260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39" Apr 18 02:50:23.992798 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.992772 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39"} err="failed to get container status \"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39\": rpc error: code = NotFound desc = could not find container \"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39\": container with ID starting with 260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39 not found: ID does not exist" Apr 18 02:50:23.992852 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.992801 2575 scope.go:117] "RemoveContainer" containerID="0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf" Apr 18 02:50:23.993004 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.992987 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf"} err="failed to get container status \"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf\": rpc error: code = NotFound desc = could not find container \"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf\": container with ID starting with 0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf not found: ID does not exist" Apr 18 02:50:23.993068 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:50:23.993006 2575 scope.go:117] "RemoveContainer" containerID="c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7" Apr 18 02:50:23.993216 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.993200 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7"} err="failed to get container status \"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7\": rpc error: code = NotFound desc = could not find container \"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7\": container with ID starting with c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7 not found: ID does not exist" Apr 18 02:50:23.993279 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.993217 2575 scope.go:117] "RemoveContainer" containerID="6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8" Apr 18 02:50:23.993444 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.993427 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8"} err="failed to get container status \"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8\": rpc error: code = NotFound desc = could not find container \"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8\": container with ID starting with 6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8 not found: ID does not exist" Apr 18 02:50:23.993496 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.993446 2575 scope.go:117] "RemoveContainer" containerID="872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02" Apr 18 02:50:23.993620 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.993601 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02"} err="failed to get container status \"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02\": rpc error: code = NotFound desc = could not find container \"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02\": container with ID starting with 872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02 not found: ID does not exist" Apr 18 02:50:23.993678 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.993621 2575 scope.go:117] "RemoveContainer" containerID="75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4" Apr 18 02:50:23.993845 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.993827 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4"} err="failed to get container status \"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4\": rpc error: code = NotFound desc = could not find container \"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4\": container with ID starting with 75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4 not found: ID does not exist" Apr 18 02:50:23.993881 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.993845 2575 scope.go:117] "RemoveContainer" containerID="715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a" Apr 18 02:50:23.994042 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.994025 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a"} err="failed to get container status \"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a\": rpc error: code = NotFound desc = could not find container \"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a\": container with ID starting with 
715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a not found: ID does not exist" Apr 18 02:50:23.994085 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.994042 2575 scope.go:117] "RemoveContainer" containerID="260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39" Apr 18 02:50:23.994272 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.994253 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39"} err="failed to get container status \"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39\": rpc error: code = NotFound desc = could not find container \"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39\": container with ID starting with 260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39 not found: ID does not exist" Apr 18 02:50:23.994314 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.994275 2575 scope.go:117] "RemoveContainer" containerID="0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf" Apr 18 02:50:23.994494 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.994475 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf"} err="failed to get container status \"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf\": rpc error: code = NotFound desc = could not find container \"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf\": container with ID starting with 0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf not found: ID does not exist" Apr 18 02:50:23.994544 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.994496 2575 scope.go:117] "RemoveContainer" containerID="c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7" Apr 18 02:50:23.994747 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.994729 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7"} err="failed to get container status \"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7\": rpc error: code = NotFound desc = could not find container \"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7\": container with ID starting with c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7 not found: ID does not exist" Apr 18 02:50:23.994808 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.994748 2575 scope.go:117] "RemoveContainer" containerID="6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8" Apr 18 02:50:23.994962 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.994947 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8"} err="failed to get container status \"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8\": rpc error: code = NotFound desc = could not find container \"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8\": container with ID starting with 6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8 not found: ID does not exist" Apr 18 02:50:23.995012 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.994963 2575 scope.go:117] "RemoveContainer" containerID="872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02" Apr 18 02:50:23.995149 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.995135 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02"} err="failed to get container status \"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02\": rpc error: code = NotFound desc = could not find container 
\"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02\": container with ID starting with 872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02 not found: ID does not exist" Apr 18 02:50:23.995149 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.995147 2575 scope.go:117] "RemoveContainer" containerID="75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4" Apr 18 02:50:23.995312 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.995299 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4"} err="failed to get container status \"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4\": rpc error: code = NotFound desc = could not find container \"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4\": container with ID starting with 75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4 not found: ID does not exist" Apr 18 02:50:23.995356 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.995312 2575 scope.go:117] "RemoveContainer" containerID="715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a" Apr 18 02:50:23.995484 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.995467 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a"} err="failed to get container status \"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a\": rpc error: code = NotFound desc = could not find container \"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a\": container with ID starting with 715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a not found: ID does not exist" Apr 18 02:50:23.995484 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.995483 2575 scope.go:117] "RemoveContainer" 
containerID="260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39" Apr 18 02:50:23.995669 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.995655 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39"} err="failed to get container status \"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39\": rpc error: code = NotFound desc = could not find container \"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39\": container with ID starting with 260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39 not found: ID does not exist" Apr 18 02:50:23.995669 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.995668 2575 scope.go:117] "RemoveContainer" containerID="0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf" Apr 18 02:50:23.995848 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.995829 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf"} err="failed to get container status \"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf\": rpc error: code = NotFound desc = could not find container \"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf\": container with ID starting with 0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf not found: ID does not exist" Apr 18 02:50:23.995894 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.995849 2575 scope.go:117] "RemoveContainer" containerID="c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7" Apr 18 02:50:23.996070 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.996039 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7"} err="failed to get container status 
\"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7\": rpc error: code = NotFound desc = could not find container \"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7\": container with ID starting with c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7 not found: ID does not exist" Apr 18 02:50:23.996070 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.996062 2575 scope.go:117] "RemoveContainer" containerID="6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8" Apr 18 02:50:23.996278 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.996263 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8"} err="failed to get container status \"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8\": rpc error: code = NotFound desc = could not find container \"6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8\": container with ID starting with 6a7ec3fe53bace2f77afb6469823073d5518ea84c5a102263837ce6d6ae14eb8 not found: ID does not exist" Apr 18 02:50:23.996328 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.996279 2575 scope.go:117] "RemoveContainer" containerID="872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02" Apr 18 02:50:23.996478 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.996460 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02"} err="failed to get container status \"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02\": rpc error: code = NotFound desc = could not find container \"872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02\": container with ID starting with 872db6114b0d6c984f50e251167012dd3ef5fc74e38c330a830993c79ddaef02 not found: ID does not exist" Apr 18 02:50:23.996521 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:50:23.996479 2575 scope.go:117] "RemoveContainer" containerID="75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4" Apr 18 02:50:23.996695 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.996677 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4"} err="failed to get container status \"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4\": rpc error: code = NotFound desc = could not find container \"75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4\": container with ID starting with 75dff6f6d456fb4aff9f23b2a098e1626bffd77097fc3ed4d711a73757a95bf4 not found: ID does not exist" Apr 18 02:50:23.996736 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.996695 2575 scope.go:117] "RemoveContainer" containerID="715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a" Apr 18 02:50:23.996895 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.996876 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a"} err="failed to get container status \"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a\": rpc error: code = NotFound desc = could not find container \"715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a\": container with ID starting with 715085a1701ba5c630f7a9a3610eb31a9df673a71d2bdb353bb9a00c25ac542a not found: ID does not exist" Apr 18 02:50:23.996969 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.996898 2575 scope.go:117] "RemoveContainer" containerID="260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39" Apr 18 02:50:23.997102 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.997088 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39"} err="failed to get container status \"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39\": rpc error: code = NotFound desc = could not find container \"260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39\": container with ID starting with 260562ed1527b11288e46a4478f92fc274a4f613cf968500829ccdc2ae80ae39 not found: ID does not exist" Apr 18 02:50:23.997151 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.997103 2575 scope.go:117] "RemoveContainer" containerID="0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf" Apr 18 02:50:23.997288 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.997273 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf"} err="failed to get container status \"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf\": rpc error: code = NotFound desc = could not find container \"0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf\": container with ID starting with 0e2e1443acfeb637934821187797ce8cba16c10c04f25d6da8983252173729bf not found: ID does not exist" Apr 18 02:50:23.997333 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.997288 2575 scope.go:117] "RemoveContainer" containerID="c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7" Apr 18 02:50:23.997442 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.997427 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7"} err="failed to get container status \"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7\": rpc error: code = NotFound desc = could not find container \"c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7\": container with ID starting with 
c50d6be04a7d6dd48d0f09f760e1248db38f625472417cfb5079e0d270cf7bd7 not found: ID does not exist" Apr 18 02:50:23.997537 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.997523 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:23.999668 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.999612 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 18 02:50:23.999863 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.999846 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bjapjtdg1ard9\"" Apr 18 02:50:23.999944 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.999903 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 18 02:50:24.000000 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:23.999963 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 18 02:50:24.000195 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.000177 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 18 02:50:24.000252 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.000220 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 18 02:50:24.000252 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.000225 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-swkgn\"" Apr 18 02:50:24.000454 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.000411 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 18 02:50:24.000510 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.000477 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 18 02:50:24.000996 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.000968 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 18 02:50:24.001069 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.000975 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 18 02:50:24.001069 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.001050 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 18 02:50:24.001214 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.001201 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 18 02:50:24.002558 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.002541 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 18 02:50:24.006252 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.006230 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 18 02:50:24.006723 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.006705 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 18 02:50:24.066212 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066186 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/c4340655-e704-4963-8cbc-e0834cc2dc62-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.066577 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066221 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4340655-e704-4963-8cbc-e0834cc2dc62-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.066577 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066272 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.066577 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066299 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4340655-e704-4963-8cbc-e0834cc2dc62-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.066577 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066317 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c4340655-e704-4963-8cbc-e0834cc2dc62-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 
02:50:24.066577 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066331 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-web-config\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.066577 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066355 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4340655-e704-4963-8cbc-e0834cc2dc62-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.066577 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066418 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.066577 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066450 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4340655-e704-4963-8cbc-e0834cc2dc62-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.066577 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066472 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/c4340655-e704-4963-8cbc-e0834cc2dc62-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.066577 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066494 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h5lw\" (UniqueName: \"kubernetes.io/projected/c4340655-e704-4963-8cbc-e0834cc2dc62-kube-api-access-6h5lw\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.066577 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-config\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.066577 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066565 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.067215 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066607 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.067215 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066660 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c4340655-e704-4963-8cbc-e0834cc2dc62-config-out\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.067215 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066685 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.067215 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066703 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.067215 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.066720 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.167489 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.167403 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c4340655-e704-4963-8cbc-e0834cc2dc62-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.167489 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.167452 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4340655-e704-4963-8cbc-e0834cc2dc62-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.167710 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.167495 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.167710 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.167523 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4340655-e704-4963-8cbc-e0834cc2dc62-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.167710 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.167698 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c4340655-e704-4963-8cbc-e0834cc2dc62-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.167868 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.167730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-web-config\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.167868 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.167758 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4340655-e704-4963-8cbc-e0834cc2dc62-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.167868 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.167792 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.167868 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.167822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4340655-e704-4963-8cbc-e0834cc2dc62-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.167868 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.167857 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c4340655-e704-4963-8cbc-e0834cc2dc62-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.167868 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.167858 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c4340655-e704-4963-8cbc-e0834cc2dc62-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.168140 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.167882 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6h5lw\" (UniqueName: \"kubernetes.io/projected/c4340655-e704-4963-8cbc-e0834cc2dc62-kube-api-access-6h5lw\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.168140 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.167945 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-config\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.168140 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.167969 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.168140 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.167993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.168140 ip-10-0-140-103 kubenswrapper[2575]: I0418 
02:50:24.168023 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c4340655-e704-4963-8cbc-e0834cc2dc62-config-out\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.168140 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.168051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.168140 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.168080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.168140 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.168125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.168515 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.168244 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4340655-e704-4963-8cbc-e0834cc2dc62-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 
02:50:24.168515 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.168301 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4340655-e704-4963-8cbc-e0834cc2dc62-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.169324 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.168733 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4340655-e704-4963-8cbc-e0834cc2dc62-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.169324 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.169159 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4340655-e704-4963-8cbc-e0834cc2dc62-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.172052 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.172004 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.172601 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.172257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c4340655-e704-4963-8cbc-e0834cc2dc62-prometheus-k8s-rulefiles-0\") pod 
\"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.172601 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.172540 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c4340655-e704-4963-8cbc-e0834cc2dc62-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.172601 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.172553 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-config\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.173246 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.173222 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c4340655-e704-4963-8cbc-e0834cc2dc62-config-out\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.173738 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.173691 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.174248 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.174210 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: 
\"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.174403 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.174382 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-web-config\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.174857 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.174836 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.174955 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.174838 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.175319 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.175282 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.175873 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.175833 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c4340655-e704-4963-8cbc-e0834cc2dc62-secret-metrics-client-certs\") pod 
\"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.176783 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.176765 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h5lw\" (UniqueName: \"kubernetes.io/projected/c4340655-e704-4963-8cbc-e0834cc2dc62-kube-api-access-6h5lw\") pod \"prometheus-k8s-0\" (UID: \"c4340655-e704-4963-8cbc-e0834cc2dc62\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.308040 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.308002 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:24.468397 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.468294 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 18 02:50:24.473279 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:50:24.473247 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4340655_e704_4963_8cbc_e0834cc2dc62.slice/crio-85a9be544361aaee6772fb32a5cb64ba2336c8f37fcaa25961d21a98cf97d16a WatchSource:0}: Error finding container 85a9be544361aaee6772fb32a5cb64ba2336c8f37fcaa25961d21a98cf97d16a: Status 404 returned error can't find the container with id 85a9be544361aaee6772fb32a5cb64ba2336c8f37fcaa25961d21a98cf97d16a Apr 18 02:50:24.946078 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.946040 2575 generic.go:358] "Generic (PLEG): container finished" podID="c4340655-e704-4963-8cbc-e0834cc2dc62" containerID="e7283a59b5615668c9f260481312ade848afa43faf6936b31c307bdc6eb0acd0" exitCode=0 Apr 18 02:50:24.946249 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.946085 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"c4340655-e704-4963-8cbc-e0834cc2dc62","Type":"ContainerDied","Data":"e7283a59b5615668c9f260481312ade848afa43faf6936b31c307bdc6eb0acd0"} Apr 18 02:50:24.946249 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:24.946112 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c4340655-e704-4963-8cbc-e0834cc2dc62","Type":"ContainerStarted","Data":"85a9be544361aaee6772fb32a5cb64ba2336c8f37fcaa25961d21a98cf97d16a"} Apr 18 02:50:25.179660 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:25.177933 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ea22cbf-4a31-459d-9ed7-bd889e3f7832" path="/var/lib/kubelet/pods/2ea22cbf-4a31-459d-9ed7-bd889e3f7832/volumes" Apr 18 02:50:25.953916 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:25.953142 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-znc79" event={"ID":"6e67776e-f785-425c-9c6b-56b4c10fccaa","Type":"ContainerStarted","Data":"c5702b91a5563a12017ceafebf74aeb0d3ca6cdc008b28f078282efd033fe161"} Apr 18 02:50:25.955828 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:25.955800 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs" event={"ID":"3c20ea1d-8a16-44d1-8cb9-0310ba00246c","Type":"ContainerStarted","Data":"93d60f73816a62d9705b157acc304d3309461be301d6ae35fc8852e8b30d2460"} Apr 18 02:50:25.959533 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:25.959506 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c4340655-e704-4963-8cbc-e0834cc2dc62","Type":"ContainerStarted","Data":"db465f45f5b88f8489a65eda0d6851de56eed8b16d7c9ebd7714bd0097ce05f8"} Apr 18 02:50:25.959659 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:25.959539 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"c4340655-e704-4963-8cbc-e0834cc2dc62","Type":"ContainerStarted","Data":"8722394dcc29848160f94fc2068285ef03022173a0995e2baf85645d3601b9d5"} Apr 18 02:50:25.962000 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:25.961971 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-svnzk" event={"ID":"70a58891-387a-4f51-a0bf-e2abf38cf891","Type":"ContainerStarted","Data":"fc7ed9ec8af98735dff817727428fe9645f033999444d815bd305c098a4a55a2"} Apr 18 02:50:25.968827 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:25.968387 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-znc79" podStartSLOduration=251.133704017 podStartE2EDuration="4m12.968371514s" podCreationTimestamp="2026-04-18 02:46:13 +0000 UTC" firstStartedPulling="2026-04-18 02:50:23.900653818 +0000 UTC m=+283.407116070" lastFinishedPulling="2026-04-18 02:50:25.7353213 +0000 UTC m=+285.241783567" observedRunningTime="2026-04-18 02:50:25.96749095 +0000 UTC m=+285.473953229" watchObservedRunningTime="2026-04-18 02:50:25.968371514 +0000 UTC m=+285.474833789" Apr 18 02:50:26.966581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:26.966547 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-svnzk" event={"ID":"70a58891-387a-4f51-a0bf-e2abf38cf891","Type":"ContainerStarted","Data":"cf415209391106f71211fcd7ff4a38792bc7f5552147798bf7422bc803ccedb6"} Apr 18 02:50:26.967226 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:26.966621 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-svnzk" Apr 18 02:50:26.969560 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:26.969531 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c4340655-e704-4963-8cbc-e0834cc2dc62","Type":"ContainerStarted","Data":"202fbff089c3636298310f998a32c1993c55a243ccadea0955f7990c807c4735"} Apr 18 
02:50:26.969560 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:26.969562 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c4340655-e704-4963-8cbc-e0834cc2dc62","Type":"ContainerStarted","Data":"2523906bb4d4eaf032e911b7ac234b2580f185baeac22bee0c66cb567006a3d6"} Apr 18 02:50:26.969757 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:26.969578 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c4340655-e704-4963-8cbc-e0834cc2dc62","Type":"ContainerStarted","Data":"f53db67bb5c6333301b7159424dfa952fd2756d85031c395f83c03fb8bf1584a"} Apr 18 02:50:26.969757 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:26.969593 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c4340655-e704-4963-8cbc-e0834cc2dc62","Type":"ContainerStarted","Data":"555546faf95a9a22d284f811a6be930c9c2d7aa9a607a54d982808eca575f7a4"} Apr 18 02:50:26.985031 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:26.984987 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-svnzk" podStartSLOduration=251.925193523 podStartE2EDuration="4m13.984973845s" podCreationTimestamp="2026-04-18 02:46:13 +0000 UTC" firstStartedPulling="2026-04-18 02:50:23.673661201 +0000 UTC m=+283.180123452" lastFinishedPulling="2026-04-18 02:50:25.733441519 +0000 UTC m=+285.239903774" observedRunningTime="2026-04-18 02:50:26.983786073 +0000 UTC m=+286.490248349" watchObservedRunningTime="2026-04-18 02:50:26.984973845 +0000 UTC m=+286.491436118" Apr 18 02:50:26.985596 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:26.985571 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bfhrs" podStartSLOduration=256.153598363 podStartE2EDuration="4m17.985563894s" podCreationTimestamp="2026-04-18 02:46:09 +0000 UTC" 
firstStartedPulling="2026-04-18 02:50:23.900123523 +0000 UTC m=+283.406585789" lastFinishedPulling="2026-04-18 02:50:25.732089068 +0000 UTC m=+285.238551320" observedRunningTime="2026-04-18 02:50:25.984597828 +0000 UTC m=+285.491060083" watchObservedRunningTime="2026-04-18 02:50:26.985563894 +0000 UTC m=+286.492026244" Apr 18 02:50:27.007555 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:27.007518 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.007504682 podStartE2EDuration="4.007504682s" podCreationTimestamp="2026-04-18 02:50:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:50:27.005288167 +0000 UTC m=+286.511750442" watchObservedRunningTime="2026-04-18 02:50:27.007504682 +0000 UTC m=+286.513966956" Apr 18 02:50:29.308415 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:29.308376 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:50:36.975673 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:36.975618 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-svnzk" Apr 18 02:50:41.128423 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:50:41.128398 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 18 02:51:24.308693 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:51:24.308654 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:51:24.323862 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:51:24.323834 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:51:25.150740 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:51:25.150709 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 18 02:53:34.661622 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.661591 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc"] Apr 18 02:53:34.664515 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.664499 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc" Apr 18 02:53:34.667620 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.667587 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 18 02:53:34.667771 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.667622 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 18 02:53:34.667898 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.667879 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-9x27k\"" Apr 18 02:53:34.667965 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.667942 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 18 02:53:34.668202 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.668181 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 18 02:53:34.675112 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.675088 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc"] Apr 18 02:53:34.701403 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.701376 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1916a437-f847-43b2-8bce-a5201c06792c-webhook-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-slrnc\" (UID: \"1916a437-f847-43b2-8bce-a5201c06792c\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc" Apr 18 02:53:34.701510 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.701440 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1916a437-f847-43b2-8bce-a5201c06792c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-slrnc\" (UID: \"1916a437-f847-43b2-8bce-a5201c06792c\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc" Apr 18 02:53:34.701559 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.701506 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vbmr\" (UniqueName: \"kubernetes.io/projected/1916a437-f847-43b2-8bce-a5201c06792c-kube-api-access-4vbmr\") pod \"opendatahub-operator-controller-manager-b6bf46549-slrnc\" (UID: \"1916a437-f847-43b2-8bce-a5201c06792c\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc" Apr 18 02:53:34.802780 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.802748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vbmr\" (UniqueName: \"kubernetes.io/projected/1916a437-f847-43b2-8bce-a5201c06792c-kube-api-access-4vbmr\") pod \"opendatahub-operator-controller-manager-b6bf46549-slrnc\" (UID: \"1916a437-f847-43b2-8bce-a5201c06792c\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc" Apr 18 02:53:34.802938 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.802793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/1916a437-f847-43b2-8bce-a5201c06792c-webhook-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-slrnc\" (UID: \"1916a437-f847-43b2-8bce-a5201c06792c\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc" Apr 18 02:53:34.802938 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.802854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1916a437-f847-43b2-8bce-a5201c06792c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-slrnc\" (UID: \"1916a437-f847-43b2-8bce-a5201c06792c\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc" Apr 18 02:53:34.805358 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.805336 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1916a437-f847-43b2-8bce-a5201c06792c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-slrnc\" (UID: \"1916a437-f847-43b2-8bce-a5201c06792c\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc" Apr 18 02:53:34.805463 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.805359 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1916a437-f847-43b2-8bce-a5201c06792c-webhook-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-slrnc\" (UID: \"1916a437-f847-43b2-8bce-a5201c06792c\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc" Apr 18 02:53:34.813445 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.813422 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vbmr\" (UniqueName: \"kubernetes.io/projected/1916a437-f847-43b2-8bce-a5201c06792c-kube-api-access-4vbmr\") pod \"opendatahub-operator-controller-manager-b6bf46549-slrnc\" (UID: 
\"1916a437-f847-43b2-8bce-a5201c06792c\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc" Apr 18 02:53:34.981262 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:34.981171 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc" Apr 18 02:53:35.102740 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:35.102701 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc"] Apr 18 02:53:35.106364 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:53:35.106337 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1916a437_f847_43b2_8bce_a5201c06792c.slice/crio-3705b10e644aa512c1b2fcf73ed5f0fc7fcd488734890adb57be46dc034a9586 WatchSource:0}: Error finding container 3705b10e644aa512c1b2fcf73ed5f0fc7fcd488734890adb57be46dc034a9586: Status 404 returned error can't find the container with id 3705b10e644aa512c1b2fcf73ed5f0fc7fcd488734890adb57be46dc034a9586 Apr 18 02:53:35.107925 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:35.107905 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 18 02:53:35.495705 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:35.495671 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc" event={"ID":"1916a437-f847-43b2-8bce-a5201c06792c","Type":"ContainerStarted","Data":"3705b10e644aa512c1b2fcf73ed5f0fc7fcd488734890adb57be46dc034a9586"} Apr 18 02:53:38.509187 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:38.509153 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc" 
event={"ID":"1916a437-f847-43b2-8bce-a5201c06792c","Type":"ContainerStarted","Data":"6bfaa98689091b70eb1881b51dbb95d520fac937d3982aa191851d5b73e85507"} Apr 18 02:53:38.509610 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:38.509257 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc" Apr 18 02:53:38.531086 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:38.531025 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc" podStartSLOduration=2.028737165 podStartE2EDuration="4.53100701s" podCreationTimestamp="2026-04-18 02:53:34 +0000 UTC" firstStartedPulling="2026-04-18 02:53:35.108059102 +0000 UTC m=+474.614521357" lastFinishedPulling="2026-04-18 02:53:37.610328946 +0000 UTC m=+477.116791202" observedRunningTime="2026-04-18 02:53:38.527599501 +0000 UTC m=+478.034061776" watchObservedRunningTime="2026-04-18 02:53:38.53100701 +0000 UTC m=+478.037469285" Apr 18 02:53:49.514250 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:49.514219 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-slrnc" Apr 18 02:53:52.926460 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:52.926422 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-79f769b49f-lfmmq"] Apr 18 02:53:52.937927 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:52.937900 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-79f769b49f-lfmmq" Apr 18 02:53:52.940778 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:52.940741 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-79f769b49f-lfmmq"] Apr 18 02:53:52.941914 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:52.941890 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 18 02:53:52.942033 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:52.941954 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 18 02:53:52.942033 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:52.941997 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-4ls7q\"" Apr 18 02:53:52.942142 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:52.941892 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 18 02:53:52.942362 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:52.942344 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 18 02:53:53.048478 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:53.048446 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6412bbcc-3769-49a0-9b29-2b79c0dfd386-tls-certs\") pod \"kube-auth-proxy-79f769b49f-lfmmq\" (UID: \"6412bbcc-3769-49a0-9b29-2b79c0dfd386\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-lfmmq" Apr 18 02:53:53.048604 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:53.048484 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/6412bbcc-3769-49a0-9b29-2b79c0dfd386-tmp\") pod \"kube-auth-proxy-79f769b49f-lfmmq\" (UID: \"6412bbcc-3769-49a0-9b29-2b79c0dfd386\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-lfmmq" Apr 18 02:53:53.048604 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:53.048518 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92x4p\" (UniqueName: \"kubernetes.io/projected/6412bbcc-3769-49a0-9b29-2b79c0dfd386-kube-api-access-92x4p\") pod \"kube-auth-proxy-79f769b49f-lfmmq\" (UID: \"6412bbcc-3769-49a0-9b29-2b79c0dfd386\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-lfmmq" Apr 18 02:53:53.149919 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:53.149890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92x4p\" (UniqueName: \"kubernetes.io/projected/6412bbcc-3769-49a0-9b29-2b79c0dfd386-kube-api-access-92x4p\") pod \"kube-auth-proxy-79f769b49f-lfmmq\" (UID: \"6412bbcc-3769-49a0-9b29-2b79c0dfd386\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-lfmmq" Apr 18 02:53:53.150053 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:53.149946 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6412bbcc-3769-49a0-9b29-2b79c0dfd386-tls-certs\") pod \"kube-auth-proxy-79f769b49f-lfmmq\" (UID: \"6412bbcc-3769-49a0-9b29-2b79c0dfd386\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-lfmmq" Apr 18 02:53:53.150053 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:53.149971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6412bbcc-3769-49a0-9b29-2b79c0dfd386-tmp\") pod \"kube-auth-proxy-79f769b49f-lfmmq\" (UID: \"6412bbcc-3769-49a0-9b29-2b79c0dfd386\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-lfmmq" Apr 18 02:53:53.152112 ip-10-0-140-103 kubenswrapper[2575]: I0418 
02:53:53.152084 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6412bbcc-3769-49a0-9b29-2b79c0dfd386-tmp\") pod \"kube-auth-proxy-79f769b49f-lfmmq\" (UID: \"6412bbcc-3769-49a0-9b29-2b79c0dfd386\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-lfmmq" Apr 18 02:53:53.152439 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:53.152417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6412bbcc-3769-49a0-9b29-2b79c0dfd386-tls-certs\") pod \"kube-auth-proxy-79f769b49f-lfmmq\" (UID: \"6412bbcc-3769-49a0-9b29-2b79c0dfd386\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-lfmmq" Apr 18 02:53:53.157015 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:53.156991 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92x4p\" (UniqueName: \"kubernetes.io/projected/6412bbcc-3769-49a0-9b29-2b79c0dfd386-kube-api-access-92x4p\") pod \"kube-auth-proxy-79f769b49f-lfmmq\" (UID: \"6412bbcc-3769-49a0-9b29-2b79c0dfd386\") " pod="openshift-ingress/kube-auth-proxy-79f769b49f-lfmmq" Apr 18 02:53:53.248841 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:53.248774 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-79f769b49f-lfmmq" Apr 18 02:53:53.365282 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:53.365255 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-79f769b49f-lfmmq"] Apr 18 02:53:53.367286 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:53:53.367255 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6412bbcc_3769_49a0_9b29_2b79c0dfd386.slice/crio-b64ce9d0bd1c904aae35af351e850af2702d38578e4b3b97b0dc9f2d59d8c026 WatchSource:0}: Error finding container b64ce9d0bd1c904aae35af351e850af2702d38578e4b3b97b0dc9f2d59d8c026: Status 404 returned error can't find the container with id b64ce9d0bd1c904aae35af351e850af2702d38578e4b3b97b0dc9f2d59d8c026 Apr 18 02:53:53.560148 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:53.560115 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-79f769b49f-lfmmq" event={"ID":"6412bbcc-3769-49a0-9b29-2b79c0dfd386","Type":"ContainerStarted","Data":"b64ce9d0bd1c904aae35af351e850af2702d38578e4b3b97b0dc9f2d59d8c026"} Apr 18 02:53:56.575107 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.575068 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-79f769b49f-lfmmq" event={"ID":"6412bbcc-3769-49a0-9b29-2b79c0dfd386","Type":"ContainerStarted","Data":"30642668494d9c94e19b29721542214e36f8f9f89b46fb748b9a89d4a5db28b2"} Apr 18 02:53:56.593840 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.593745 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-79f769b49f-lfmmq" podStartSLOduration=1.627602903 podStartE2EDuration="4.593732441s" podCreationTimestamp="2026-04-18 02:53:52 +0000 UTC" firstStartedPulling="2026-04-18 02:53:53.369071859 +0000 UTC m=+492.875534112" lastFinishedPulling="2026-04-18 02:53:56.335201398 +0000 UTC 
m=+495.841663650" observedRunningTime="2026-04-18 02:53:56.592050766 +0000 UTC m=+496.098513040" watchObservedRunningTime="2026-04-18 02:53:56.593732441 +0000 UTC m=+496.100194714" Apr 18 02:53:56.737865 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.737833 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7"] Apr 18 02:53:56.740798 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.740774 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" Apr 18 02:53:56.744933 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.744909 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 18 02:53:56.745754 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.745206 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-pnqpv\"" Apr 18 02:53:56.746004 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.745312 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 18 02:53:56.746121 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.745444 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 18 02:53:56.746240 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.745482 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 18 02:53:56.746240 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.745519 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 18 02:53:56.752152 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.752131 2575 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7"] Apr 18 02:53:56.883682 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.883580 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96wkz\" (UniqueName: \"kubernetes.io/projected/bde75da2-0078-424b-bc43-b20c14e7b952-kube-api-access-96wkz\") pod \"lws-controller-manager-5dd789dc9-vhqp7\" (UID: \"bde75da2-0078-424b-bc43-b20c14e7b952\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" Apr 18 02:53:56.883682 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.883616 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bde75da2-0078-424b-bc43-b20c14e7b952-cert\") pod \"lws-controller-manager-5dd789dc9-vhqp7\" (UID: \"bde75da2-0078-424b-bc43-b20c14e7b952\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" Apr 18 02:53:56.883878 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.883741 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bde75da2-0078-424b-bc43-b20c14e7b952-manager-config\") pod \"lws-controller-manager-5dd789dc9-vhqp7\" (UID: \"bde75da2-0078-424b-bc43-b20c14e7b952\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" Apr 18 02:53:56.883878 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.883798 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/bde75da2-0078-424b-bc43-b20c14e7b952-metrics-cert\") pod \"lws-controller-manager-5dd789dc9-vhqp7\" (UID: \"bde75da2-0078-424b-bc43-b20c14e7b952\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" Apr 18 02:53:56.984972 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:53:56.984930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/bde75da2-0078-424b-bc43-b20c14e7b952-metrics-cert\") pod \"lws-controller-manager-5dd789dc9-vhqp7\" (UID: \"bde75da2-0078-424b-bc43-b20c14e7b952\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" Apr 18 02:53:56.985160 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.985007 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96wkz\" (UniqueName: \"kubernetes.io/projected/bde75da2-0078-424b-bc43-b20c14e7b952-kube-api-access-96wkz\") pod \"lws-controller-manager-5dd789dc9-vhqp7\" (UID: \"bde75da2-0078-424b-bc43-b20c14e7b952\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" Apr 18 02:53:56.985160 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.985026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bde75da2-0078-424b-bc43-b20c14e7b952-cert\") pod \"lws-controller-manager-5dd789dc9-vhqp7\" (UID: \"bde75da2-0078-424b-bc43-b20c14e7b952\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" Apr 18 02:53:56.985160 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.985054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bde75da2-0078-424b-bc43-b20c14e7b952-manager-config\") pod \"lws-controller-manager-5dd789dc9-vhqp7\" (UID: \"bde75da2-0078-424b-bc43-b20c14e7b952\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" Apr 18 02:53:56.985784 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.985765 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bde75da2-0078-424b-bc43-b20c14e7b952-manager-config\") pod 
\"lws-controller-manager-5dd789dc9-vhqp7\" (UID: \"bde75da2-0078-424b-bc43-b20c14e7b952\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" Apr 18 02:53:56.987441 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.987421 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bde75da2-0078-424b-bc43-b20c14e7b952-cert\") pod \"lws-controller-manager-5dd789dc9-vhqp7\" (UID: \"bde75da2-0078-424b-bc43-b20c14e7b952\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" Apr 18 02:53:56.987572 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.987549 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/bde75da2-0078-424b-bc43-b20c14e7b952-metrics-cert\") pod \"lws-controller-manager-5dd789dc9-vhqp7\" (UID: \"bde75da2-0078-424b-bc43-b20c14e7b952\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" Apr 18 02:53:56.992178 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:56.992155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96wkz\" (UniqueName: \"kubernetes.io/projected/bde75da2-0078-424b-bc43-b20c14e7b952-kube-api-access-96wkz\") pod \"lws-controller-manager-5dd789dc9-vhqp7\" (UID: \"bde75da2-0078-424b-bc43-b20c14e7b952\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" Apr 18 02:53:57.054236 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:57.054202 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" Apr 18 02:53:57.170236 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:57.170166 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7"] Apr 18 02:53:57.172996 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:53:57.172953 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbde75da2_0078_424b_bc43_b20c14e7b952.slice/crio-ad186e0dd48af66878cbe4ccf74b3d895d855b600e1b7ca8860a9ec7c9b980be WatchSource:0}: Error finding container ad186e0dd48af66878cbe4ccf74b3d895d855b600e1b7ca8860a9ec7c9b980be: Status 404 returned error can't find the container with id ad186e0dd48af66878cbe4ccf74b3d895d855b600e1b7ca8860a9ec7c9b980be Apr 18 02:53:57.579969 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:57.579933 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" event={"ID":"bde75da2-0078-424b-bc43-b20c14e7b952","Type":"ContainerStarted","Data":"ad186e0dd48af66878cbe4ccf74b3d895d855b600e1b7ca8860a9ec7c9b980be"} Apr 18 02:53:59.588091 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:59.588043 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" event={"ID":"bde75da2-0078-424b-bc43-b20c14e7b952","Type":"ContainerStarted","Data":"537e61843eb2806be0aafdd0188c00bfdeae60ca82be6e9fb30cd285fd3ab0d2"} Apr 18 02:53:59.588508 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:59.588181 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" Apr 18 02:53:59.602622 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:53:59.602574 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" podStartSLOduration=1.307784348 podStartE2EDuration="3.602560638s" podCreationTimestamp="2026-04-18 02:53:56 +0000 UTC" firstStartedPulling="2026-04-18 02:53:57.174689293 +0000 UTC m=+496.681151545" lastFinishedPulling="2026-04-18 02:53:59.469465579 +0000 UTC m=+498.975927835" observedRunningTime="2026-04-18 02:53:59.601447746 +0000 UTC m=+499.107910029" watchObservedRunningTime="2026-04-18 02:53:59.602560638 +0000 UTC m=+499.109022912" Apr 18 02:54:10.594189 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:54:10.594157 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-vhqp7" Apr 18 02:55:34.031755 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.031720 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6zg7v"] Apr 18 02:55:34.033982 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.033966 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6zg7v" Apr 18 02:55:34.036461 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.036437 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 18 02:55:34.037358 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.037335 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 18 02:55:34.037478 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.037368 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 18 02:55:34.037478 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.037380 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 18 02:55:34.037478 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.037336 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-b57hm\"" Apr 18 02:55:34.042915 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.042892 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6zg7v"] Apr 18 02:55:34.109820 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.109791 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt42l\" (UniqueName: \"kubernetes.io/projected/d2fa858f-6f36-4082-8c99-5e5c7ae19770-kube-api-access-bt42l\") pod \"kuadrant-console-plugin-6cb54b5c86-6zg7v\" (UID: \"d2fa858f-6f36-4082-8c99-5e5c7ae19770\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6zg7v" Apr 18 02:55:34.109929 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.109829 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2fa858f-6f36-4082-8c99-5e5c7ae19770-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-6zg7v\" (UID: \"d2fa858f-6f36-4082-8c99-5e5c7ae19770\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6zg7v" Apr 18 02:55:34.109968 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.109950 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d2fa858f-6f36-4082-8c99-5e5c7ae19770-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-6zg7v\" (UID: \"d2fa858f-6f36-4082-8c99-5e5c7ae19770\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6zg7v" Apr 18 02:55:34.211355 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.211326 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bt42l\" (UniqueName: \"kubernetes.io/projected/d2fa858f-6f36-4082-8c99-5e5c7ae19770-kube-api-access-bt42l\") pod \"kuadrant-console-plugin-6cb54b5c86-6zg7v\" (UID: \"d2fa858f-6f36-4082-8c99-5e5c7ae19770\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6zg7v" Apr 18 02:55:34.211508 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.211361 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2fa858f-6f36-4082-8c99-5e5c7ae19770-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-6zg7v\" (UID: \"d2fa858f-6f36-4082-8c99-5e5c7ae19770\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6zg7v" Apr 18 02:55:34.211508 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.211425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d2fa858f-6f36-4082-8c99-5e5c7ae19770-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-6zg7v\" (UID: \"d2fa858f-6f36-4082-8c99-5e5c7ae19770\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6zg7v" Apr 18 02:55:34.212131 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.212108 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d2fa858f-6f36-4082-8c99-5e5c7ae19770-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-6zg7v\" (UID: \"d2fa858f-6f36-4082-8c99-5e5c7ae19770\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6zg7v" Apr 18 02:55:34.213854 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.213824 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2fa858f-6f36-4082-8c99-5e5c7ae19770-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-6zg7v\" (UID: \"d2fa858f-6f36-4082-8c99-5e5c7ae19770\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6zg7v" Apr 18 02:55:34.222390 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.222367 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt42l\" (UniqueName: \"kubernetes.io/projected/d2fa858f-6f36-4082-8c99-5e5c7ae19770-kube-api-access-bt42l\") pod \"kuadrant-console-plugin-6cb54b5c86-6zg7v\" (UID: \"d2fa858f-6f36-4082-8c99-5e5c7ae19770\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6zg7v" Apr 18 02:55:34.350222 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.350153 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6zg7v" Apr 18 02:55:34.479496 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.479466 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6zg7v"] Apr 18 02:55:34.483649 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:55:34.483600 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2fa858f_6f36_4082_8c99_5e5c7ae19770.slice/crio-e5a97cf197dc17ed9eddd324a250ba9e967207e03d1fc2afb6d9b5cf0b53caf0 WatchSource:0}: Error finding container e5a97cf197dc17ed9eddd324a250ba9e967207e03d1fc2afb6d9b5cf0b53caf0: Status 404 returned error can't find the container with id e5a97cf197dc17ed9eddd324a250ba9e967207e03d1fc2afb6d9b5cf0b53caf0 Apr 18 02:55:34.888238 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:34.888205 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6zg7v" event={"ID":"d2fa858f-6f36-4082-8c99-5e5c7ae19770","Type":"ContainerStarted","Data":"e5a97cf197dc17ed9eddd324a250ba9e967207e03d1fc2afb6d9b5cf0b53caf0"} Apr 18 02:55:58.979611 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:58.979570 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6zg7v" event={"ID":"d2fa858f-6f36-4082-8c99-5e5c7ae19770","Type":"ContainerStarted","Data":"465b6fccf510cee8a78fd3e35eda83db0e343b10638f1b1800c1c503443d632b"} Apr 18 02:55:58.994544 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:55:58.994499 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6zg7v" podStartSLOduration=1.587257997 podStartE2EDuration="24.994485732s" podCreationTimestamp="2026-04-18 02:55:34 +0000 UTC" firstStartedPulling="2026-04-18 02:55:34.484966179 +0000 UTC m=+593.991428447" 
lastFinishedPulling="2026-04-18 02:55:57.892193915 +0000 UTC m=+617.398656182" observedRunningTime="2026-04-18 02:55:58.993402187 +0000 UTC m=+618.499864462" watchObservedRunningTime="2026-04-18 02:55:58.994485732 +0000 UTC m=+618.500948007" Apr 18 02:56:18.129482 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:18.129445 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rth65"] Apr 18 02:56:18.132349 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:18.132329 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" Apr 18 02:56:18.134621 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:18.134598 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 18 02:56:18.139800 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:18.139779 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rth65"] Apr 18 02:56:18.198714 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:18.198686 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp9fq\" (UniqueName: \"kubernetes.io/projected/19ed5799-5217-49d3-aa5c-af1e2e50d3ef-kube-api-access-wp9fq\") pod \"limitador-limitador-7d549b5b-rth65\" (UID: \"19ed5799-5217-49d3-aa5c-af1e2e50d3ef\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" Apr 18 02:56:18.198833 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:18.198730 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/19ed5799-5217-49d3-aa5c-af1e2e50d3ef-config-file\") pod \"limitador-limitador-7d549b5b-rth65\" (UID: \"19ed5799-5217-49d3-aa5c-af1e2e50d3ef\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" Apr 18 02:56:18.230214 
ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:18.230188 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rth65"] Apr 18 02:56:18.299471 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:18.299446 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wp9fq\" (UniqueName: \"kubernetes.io/projected/19ed5799-5217-49d3-aa5c-af1e2e50d3ef-kube-api-access-wp9fq\") pod \"limitador-limitador-7d549b5b-rth65\" (UID: \"19ed5799-5217-49d3-aa5c-af1e2e50d3ef\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" Apr 18 02:56:18.299593 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:18.299484 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/19ed5799-5217-49d3-aa5c-af1e2e50d3ef-config-file\") pod \"limitador-limitador-7d549b5b-rth65\" (UID: \"19ed5799-5217-49d3-aa5c-af1e2e50d3ef\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" Apr 18 02:56:18.300041 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:18.300024 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/19ed5799-5217-49d3-aa5c-af1e2e50d3ef-config-file\") pod \"limitador-limitador-7d549b5b-rth65\" (UID: \"19ed5799-5217-49d3-aa5c-af1e2e50d3ef\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" Apr 18 02:56:18.310326 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:18.310296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp9fq\" (UniqueName: \"kubernetes.io/projected/19ed5799-5217-49d3-aa5c-af1e2e50d3ef-kube-api-access-wp9fq\") pod \"limitador-limitador-7d549b5b-rth65\" (UID: \"19ed5799-5217-49d3-aa5c-af1e2e50d3ef\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" Apr 18 02:56:18.443149 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:18.443074 2575 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" Apr 18 02:56:18.556430 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:18.556399 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rth65"] Apr 18 02:56:18.559589 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:56:18.559561 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19ed5799_5217_49d3_aa5c_af1e2e50d3ef.slice/crio-7602a8ebeeae2bb62651d9d2264227e343451498026281675d15906d4d1fb995 WatchSource:0}: Error finding container 7602a8ebeeae2bb62651d9d2264227e343451498026281675d15906d4d1fb995: Status 404 returned error can't find the container with id 7602a8ebeeae2bb62651d9d2264227e343451498026281675d15906d4d1fb995 Apr 18 02:56:18.936114 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:18.936080 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-x6c66"] Apr 18 02:56:18.940220 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:18.940196 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-x6c66" Apr 18 02:56:18.942719 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:18.942697 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-gkf5k\"" Apr 18 02:56:18.945295 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:18.945273 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-x6c66"] Apr 18 02:56:19.005468 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:19.005435 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh59c\" (UniqueName: \"kubernetes.io/projected/28b30921-7ee4-4e7d-af4b-9547f78c7f6c-kube-api-access-xh59c\") pod \"authorino-f99f4b5cd-x6c66\" (UID: \"28b30921-7ee4-4e7d-af4b-9547f78c7f6c\") " pod="kuadrant-system/authorino-f99f4b5cd-x6c66" Apr 18 02:56:19.042935 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:19.042908 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" event={"ID":"19ed5799-5217-49d3-aa5c-af1e2e50d3ef","Type":"ContainerStarted","Data":"7602a8ebeeae2bb62651d9d2264227e343451498026281675d15906d4d1fb995"} Apr 18 02:56:19.106106 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:19.106075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xh59c\" (UniqueName: \"kubernetes.io/projected/28b30921-7ee4-4e7d-af4b-9547f78c7f6c-kube-api-access-xh59c\") pod \"authorino-f99f4b5cd-x6c66\" (UID: \"28b30921-7ee4-4e7d-af4b-9547f78c7f6c\") " pod="kuadrant-system/authorino-f99f4b5cd-x6c66" Apr 18 02:56:19.114022 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:19.113992 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh59c\" (UniqueName: \"kubernetes.io/projected/28b30921-7ee4-4e7d-af4b-9547f78c7f6c-kube-api-access-xh59c\") pod 
\"authorino-f99f4b5cd-x6c66\" (UID: \"28b30921-7ee4-4e7d-af4b-9547f78c7f6c\") " pod="kuadrant-system/authorino-f99f4b5cd-x6c66" Apr 18 02:56:19.251561 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:19.251481 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-x6c66" Apr 18 02:56:19.411957 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:19.411931 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-x6c66"] Apr 18 02:56:20.047961 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:20.047910 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-x6c66" event={"ID":"28b30921-7ee4-4e7d-af4b-9547f78c7f6c","Type":"ContainerStarted","Data":"29c648b7251b153d1e2515b63125fdc77ca48a3e885518a0f32e055b24d7a6c0"} Apr 18 02:56:22.485289 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:22.485251 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-x6c66"] Apr 18 02:56:24.062378 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:24.062337 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" event={"ID":"19ed5799-5217-49d3-aa5c-af1e2e50d3ef","Type":"ContainerStarted","Data":"63f4c8d7547710563e8c46aa47960bb49634024e98dccb9e25b7a488bc25c64d"} Apr 18 02:56:24.062826 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:24.062462 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" Apr 18 02:56:24.063581 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:24.063556 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-x6c66" event={"ID":"28b30921-7ee4-4e7d-af4b-9547f78c7f6c","Type":"ContainerStarted","Data":"ff7944de86d2d0db0f96aa89918c63c43d31e7869542de37966f7e4438aef19d"} Apr 18 02:56:24.063672 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:56:24.063618 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-x6c66" podUID="28b30921-7ee4-4e7d-af4b-9547f78c7f6c" containerName="authorino" containerID="cri-o://ff7944de86d2d0db0f96aa89918c63c43d31e7869542de37966f7e4438aef19d" gracePeriod=30 Apr 18 02:56:24.079462 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:24.079420 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" podStartSLOduration=1.471574037 podStartE2EDuration="6.079409724s" podCreationTimestamp="2026-04-18 02:56:18 +0000 UTC" firstStartedPulling="2026-04-18 02:56:18.561451438 +0000 UTC m=+638.067913694" lastFinishedPulling="2026-04-18 02:56:23.169287115 +0000 UTC m=+642.675749381" observedRunningTime="2026-04-18 02:56:24.078345834 +0000 UTC m=+643.584808107" watchObservedRunningTime="2026-04-18 02:56:24.079409724 +0000 UTC m=+643.585871999" Apr 18 02:56:24.091113 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:24.091065 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-x6c66" podStartSLOduration=2.328983791 podStartE2EDuration="6.091054823s" podCreationTimestamp="2026-04-18 02:56:18 +0000 UTC" firstStartedPulling="2026-04-18 02:56:19.418150867 +0000 UTC m=+638.924613137" lastFinishedPulling="2026-04-18 02:56:23.180221917 +0000 UTC m=+642.686684169" observedRunningTime="2026-04-18 02:56:24.090480827 +0000 UTC m=+643.596943101" watchObservedRunningTime="2026-04-18 02:56:24.091054823 +0000 UTC m=+643.597517098" Apr 18 02:56:24.306873 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:24.306852 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-x6c66" Apr 18 02:56:24.455899 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:24.455802 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh59c\" (UniqueName: \"kubernetes.io/projected/28b30921-7ee4-4e7d-af4b-9547f78c7f6c-kube-api-access-xh59c\") pod \"28b30921-7ee4-4e7d-af4b-9547f78c7f6c\" (UID: \"28b30921-7ee4-4e7d-af4b-9547f78c7f6c\") " Apr 18 02:56:24.458040 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:24.458007 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28b30921-7ee4-4e7d-af4b-9547f78c7f6c-kube-api-access-xh59c" (OuterVolumeSpecName: "kube-api-access-xh59c") pod "28b30921-7ee4-4e7d-af4b-9547f78c7f6c" (UID: "28b30921-7ee4-4e7d-af4b-9547f78c7f6c"). InnerVolumeSpecName "kube-api-access-xh59c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:56:24.563095 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:24.563057 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xh59c\" (UniqueName: \"kubernetes.io/projected/28b30921-7ee4-4e7d-af4b-9547f78c7f6c-kube-api-access-xh59c\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\"" Apr 18 02:56:25.067788 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:25.067755 2575 generic.go:358] "Generic (PLEG): container finished" podID="28b30921-7ee4-4e7d-af4b-9547f78c7f6c" containerID="ff7944de86d2d0db0f96aa89918c63c43d31e7869542de37966f7e4438aef19d" exitCode=0 Apr 18 02:56:25.068212 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:25.067804 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-x6c66" Apr 18 02:56:25.068212 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:25.067837 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-x6c66" event={"ID":"28b30921-7ee4-4e7d-af4b-9547f78c7f6c","Type":"ContainerDied","Data":"ff7944de86d2d0db0f96aa89918c63c43d31e7869542de37966f7e4438aef19d"} Apr 18 02:56:25.068212 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:25.067874 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-x6c66" event={"ID":"28b30921-7ee4-4e7d-af4b-9547f78c7f6c","Type":"ContainerDied","Data":"29c648b7251b153d1e2515b63125fdc77ca48a3e885518a0f32e055b24d7a6c0"} Apr 18 02:56:25.068212 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:25.067892 2575 scope.go:117] "RemoveContainer" containerID="ff7944de86d2d0db0f96aa89918c63c43d31e7869542de37966f7e4438aef19d" Apr 18 02:56:25.076663 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:25.076621 2575 scope.go:117] "RemoveContainer" containerID="ff7944de86d2d0db0f96aa89918c63c43d31e7869542de37966f7e4438aef19d" Apr 18 02:56:25.076942 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:56:25.076916 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff7944de86d2d0db0f96aa89918c63c43d31e7869542de37966f7e4438aef19d\": container with ID starting with ff7944de86d2d0db0f96aa89918c63c43d31e7869542de37966f7e4438aef19d not found: ID does not exist" containerID="ff7944de86d2d0db0f96aa89918c63c43d31e7869542de37966f7e4438aef19d" Apr 18 02:56:25.077001 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:25.076952 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff7944de86d2d0db0f96aa89918c63c43d31e7869542de37966f7e4438aef19d"} err="failed to get container status \"ff7944de86d2d0db0f96aa89918c63c43d31e7869542de37966f7e4438aef19d\": rpc error: code = 
NotFound desc = could not find container \"ff7944de86d2d0db0f96aa89918c63c43d31e7869542de37966f7e4438aef19d\": container with ID starting with ff7944de86d2d0db0f96aa89918c63c43d31e7869542de37966f7e4438aef19d not found: ID does not exist" Apr 18 02:56:25.088426 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:25.088381 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-x6c66"] Apr 18 02:56:25.090120 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:25.090098 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-x6c66"] Apr 18 02:56:25.175469 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:25.175437 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28b30921-7ee4-4e7d-af4b-9547f78c7f6c" path="/var/lib/kubelet/pods/28b30921-7ee4-4e7d-af4b-9547f78c7f6c/volumes" Apr 18 02:56:32.883212 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:32.883174 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rth65"] Apr 18 02:56:32.883768 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:32.883410 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" podUID="19ed5799-5217-49d3-aa5c-af1e2e50d3ef" containerName="limitador" containerID="cri-o://63f4c8d7547710563e8c46aa47960bb49634024e98dccb9e25b7a488bc25c64d" gracePeriod=30 Apr 18 02:56:32.884160 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:32.884033 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" Apr 18 02:56:33.422292 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:33.422267 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" Apr 18 02:56:33.538222 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:33.538189 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp9fq\" (UniqueName: \"kubernetes.io/projected/19ed5799-5217-49d3-aa5c-af1e2e50d3ef-kube-api-access-wp9fq\") pod \"19ed5799-5217-49d3-aa5c-af1e2e50d3ef\" (UID: \"19ed5799-5217-49d3-aa5c-af1e2e50d3ef\") " Apr 18 02:56:33.538388 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:33.538328 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/19ed5799-5217-49d3-aa5c-af1e2e50d3ef-config-file\") pod \"19ed5799-5217-49d3-aa5c-af1e2e50d3ef\" (UID: \"19ed5799-5217-49d3-aa5c-af1e2e50d3ef\") " Apr 18 02:56:33.538711 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:33.538686 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19ed5799-5217-49d3-aa5c-af1e2e50d3ef-config-file" (OuterVolumeSpecName: "config-file") pod "19ed5799-5217-49d3-aa5c-af1e2e50d3ef" (UID: "19ed5799-5217-49d3-aa5c-af1e2e50d3ef"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 18 02:56:33.540269 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:33.540241 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19ed5799-5217-49d3-aa5c-af1e2e50d3ef-kube-api-access-wp9fq" (OuterVolumeSpecName: "kube-api-access-wp9fq") pod "19ed5799-5217-49d3-aa5c-af1e2e50d3ef" (UID: "19ed5799-5217-49d3-aa5c-af1e2e50d3ef"). InnerVolumeSpecName "kube-api-access-wp9fq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:56:33.639894 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:33.639857 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wp9fq\" (UniqueName: \"kubernetes.io/projected/19ed5799-5217-49d3-aa5c-af1e2e50d3ef-kube-api-access-wp9fq\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\"" Apr 18 02:56:33.639894 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:33.639887 2575 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/19ed5799-5217-49d3-aa5c-af1e2e50d3ef-config-file\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\"" Apr 18 02:56:34.087589 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.087557 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-d8gqj"] Apr 18 02:56:34.087957 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.087884 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19ed5799-5217-49d3-aa5c-af1e2e50d3ef" containerName="limitador" Apr 18 02:56:34.087957 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.087897 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ed5799-5217-49d3-aa5c-af1e2e50d3ef" containerName="limitador" Apr 18 02:56:34.087957 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.087906 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28b30921-7ee4-4e7d-af4b-9547f78c7f6c" containerName="authorino" Apr 18 02:56:34.087957 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.087911 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b30921-7ee4-4e7d-af4b-9547f78c7f6c" containerName="authorino" Apr 18 02:56:34.088093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.087975 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="28b30921-7ee4-4e7d-af4b-9547f78c7f6c" containerName="authorino" Apr 18 02:56:34.088093 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:56:34.087983 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="19ed5799-5217-49d3-aa5c-af1e2e50d3ef" containerName="limitador" Apr 18 02:56:34.090024 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.090005 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-d8gqj" Apr 18 02:56:34.092314 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.092287 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-ln5cp\"" Apr 18 02:56:34.092445 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.092294 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 18 02:56:34.097499 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.097467 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-d8gqj"] Apr 18 02:56:34.102311 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.102276 2575 generic.go:358] "Generic (PLEG): container finished" podID="19ed5799-5217-49d3-aa5c-af1e2e50d3ef" containerID="63f4c8d7547710563e8c46aa47960bb49634024e98dccb9e25b7a488bc25c64d" exitCode=0 Apr 18 02:56:34.102418 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.102341 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" Apr 18 02:56:34.102418 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.102355 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" event={"ID":"19ed5799-5217-49d3-aa5c-af1e2e50d3ef","Type":"ContainerDied","Data":"63f4c8d7547710563e8c46aa47960bb49634024e98dccb9e25b7a488bc25c64d"} Apr 18 02:56:34.102418 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.102399 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-rth65" event={"ID":"19ed5799-5217-49d3-aa5c-af1e2e50d3ef","Type":"ContainerDied","Data":"7602a8ebeeae2bb62651d9d2264227e343451498026281675d15906d4d1fb995"} Apr 18 02:56:34.102591 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.102420 2575 scope.go:117] "RemoveContainer" containerID="63f4c8d7547710563e8c46aa47960bb49634024e98dccb9e25b7a488bc25c64d" Apr 18 02:56:34.112176 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.112141 2575 scope.go:117] "RemoveContainer" containerID="63f4c8d7547710563e8c46aa47960bb49634024e98dccb9e25b7a488bc25c64d" Apr 18 02:56:34.112465 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:56:34.112443 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63f4c8d7547710563e8c46aa47960bb49634024e98dccb9e25b7a488bc25c64d\": container with ID starting with 63f4c8d7547710563e8c46aa47960bb49634024e98dccb9e25b7a488bc25c64d not found: ID does not exist" containerID="63f4c8d7547710563e8c46aa47960bb49634024e98dccb9e25b7a488bc25c64d" Apr 18 02:56:34.112537 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.112471 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63f4c8d7547710563e8c46aa47960bb49634024e98dccb9e25b7a488bc25c64d"} err="failed to get container status 
\"63f4c8d7547710563e8c46aa47960bb49634024e98dccb9e25b7a488bc25c64d\": rpc error: code = NotFound desc = could not find container \"63f4c8d7547710563e8c46aa47960bb49634024e98dccb9e25b7a488bc25c64d\": container with ID starting with 63f4c8d7547710563e8c46aa47960bb49634024e98dccb9e25b7a488bc25c64d not found: ID does not exist" Apr 18 02:56:34.122597 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.122565 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rth65"] Apr 18 02:56:34.126016 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.125993 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rth65"] Apr 18 02:56:34.245491 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.245454 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0bc7b60d-c168-441f-91fe-a611a5cab969-data\") pod \"postgres-868db5846d-d8gqj\" (UID: \"0bc7b60d-c168-441f-91fe-a611a5cab969\") " pod="opendatahub/postgres-868db5846d-d8gqj" Apr 18 02:56:34.245674 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.245520 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdlnn\" (UniqueName: \"kubernetes.io/projected/0bc7b60d-c168-441f-91fe-a611a5cab969-kube-api-access-qdlnn\") pod \"postgres-868db5846d-d8gqj\" (UID: \"0bc7b60d-c168-441f-91fe-a611a5cab969\") " pod="opendatahub/postgres-868db5846d-d8gqj" Apr 18 02:56:34.346855 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.346777 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0bc7b60d-c168-441f-91fe-a611a5cab969-data\") pod \"postgres-868db5846d-d8gqj\" (UID: \"0bc7b60d-c168-441f-91fe-a611a5cab969\") " pod="opendatahub/postgres-868db5846d-d8gqj" Apr 18 02:56:34.346855 ip-10-0-140-103 kubenswrapper[2575]: 
I0418 02:56:34.346844 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdlnn\" (UniqueName: \"kubernetes.io/projected/0bc7b60d-c168-441f-91fe-a611a5cab969-kube-api-access-qdlnn\") pod \"postgres-868db5846d-d8gqj\" (UID: \"0bc7b60d-c168-441f-91fe-a611a5cab969\") " pod="opendatahub/postgres-868db5846d-d8gqj" Apr 18 02:56:34.347175 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.347155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0bc7b60d-c168-441f-91fe-a611a5cab969-data\") pod \"postgres-868db5846d-d8gqj\" (UID: \"0bc7b60d-c168-441f-91fe-a611a5cab969\") " pod="opendatahub/postgres-868db5846d-d8gqj" Apr 18 02:56:34.354292 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.354262 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdlnn\" (UniqueName: \"kubernetes.io/projected/0bc7b60d-c168-441f-91fe-a611a5cab969-kube-api-access-qdlnn\") pod \"postgres-868db5846d-d8gqj\" (UID: \"0bc7b60d-c168-441f-91fe-a611a5cab969\") " pod="opendatahub/postgres-868db5846d-d8gqj" Apr 18 02:56:34.402869 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.402843 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-d8gqj" Apr 18 02:56:34.524153 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:34.524118 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-d8gqj"] Apr 18 02:56:34.527130 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:56:34.527103 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bc7b60d_c168_441f_91fe_a611a5cab969.slice/crio-cfb89cbd40c649a94defdb241575b577ff286abe41d0c32e0365e3e7a9ecb80c WatchSource:0}: Error finding container cfb89cbd40c649a94defdb241575b577ff286abe41d0c32e0365e3e7a9ecb80c: Status 404 returned error can't find the container with id cfb89cbd40c649a94defdb241575b577ff286abe41d0c32e0365e3e7a9ecb80c Apr 18 02:56:35.106306 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:35.106268 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-d8gqj" event={"ID":"0bc7b60d-c168-441f-91fe-a611a5cab969","Type":"ContainerStarted","Data":"cfb89cbd40c649a94defdb241575b577ff286abe41d0c32e0365e3e7a9ecb80c"} Apr 18 02:56:35.175723 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:35.175694 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19ed5799-5217-49d3-aa5c-af1e2e50d3ef" path="/var/lib/kubelet/pods/19ed5799-5217-49d3-aa5c-af1e2e50d3ef/volumes" Apr 18 02:56:40.125446 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:40.125406 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-d8gqj" event={"ID":"0bc7b60d-c168-441f-91fe-a611a5cab969","Type":"ContainerStarted","Data":"2865bfde2cee953c450e426a06dddd8fb01addbe87dbeca67acce01d1b11a3e6"} Apr 18 02:56:40.125860 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:40.125559 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-d8gqj" Apr 18 02:56:40.140999 ip-10-0-140-103 
kubenswrapper[2575]: I0418 02:56:40.140953 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-d8gqj" podStartSLOduration=0.944541939 podStartE2EDuration="6.140939534s" podCreationTimestamp="2026-04-18 02:56:34 +0000 UTC" firstStartedPulling="2026-04-18 02:56:34.528708143 +0000 UTC m=+654.035170395" lastFinishedPulling="2026-04-18 02:56:39.725105738 +0000 UTC m=+659.231567990" observedRunningTime="2026-04-18 02:56:40.138614113 +0000 UTC m=+659.645076389" watchObservedRunningTime="2026-04-18 02:56:40.140939534 +0000 UTC m=+659.647401808"
Apr 18 02:56:46.156385 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:46.156349 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-d8gqj"
Apr 18 02:56:46.688134 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:46.688101 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-z48lq"]
Apr 18 02:56:46.694388 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:46.694365 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-z48lq"
Apr 18 02:56:46.696743 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:46.696717 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-z48lq"]
Apr 18 02:56:46.697013 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:46.696995 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-gkf5k\""
Apr 18 02:56:46.750991 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:46.750962 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxrgv\" (UniqueName: \"kubernetes.io/projected/4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95-kube-api-access-nxrgv\") pod \"authorino-8b475cf9f-z48lq\" (UID: \"4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95\") " pod="kuadrant-system/authorino-8b475cf9f-z48lq"
Apr 18 02:56:46.851681 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:46.851652 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxrgv\" (UniqueName: \"kubernetes.io/projected/4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95-kube-api-access-nxrgv\") pod \"authorino-8b475cf9f-z48lq\" (UID: \"4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95\") " pod="kuadrant-system/authorino-8b475cf9f-z48lq"
Apr 18 02:56:46.858969 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:46.858941 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxrgv\" (UniqueName: \"kubernetes.io/projected/4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95-kube-api-access-nxrgv\") pod \"authorino-8b475cf9f-z48lq\" (UID: \"4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95\") " pod="kuadrant-system/authorino-8b475cf9f-z48lq"
Apr 18 02:56:46.920515 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:46.920488 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-z48lq"]
Apr 18 02:56:46.920712 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:46.920699 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-z48lq"
Apr 18 02:56:46.946194 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:46.946128 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7b7fb46454-4fr88"]
Apr 18 02:56:46.951414 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:46.951395 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7b7fb46454-4fr88"
Apr 18 02:56:46.953979 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:46.953958 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 18 02:56:46.955410 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:46.955387 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7b7fb46454-4fr88"]
Apr 18 02:56:47.036992 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:47.036916 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-z48lq"]
Apr 18 02:56:47.039212 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:56:47.039182 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f9bc7f9_7b20_4b0b_96c3_cdcf5b254c95.slice/crio-aaaa304a6c650cc804b384059cd2b5a578e848f44b1d44bcd3d58b762bfbb6f9 WatchSource:0}: Error finding container aaaa304a6c650cc804b384059cd2b5a578e848f44b1d44bcd3d58b762bfbb6f9: Status 404 returned error can't find the container with id aaaa304a6c650cc804b384059cd2b5a578e848f44b1d44bcd3d58b762bfbb6f9
Apr 18 02:56:47.054060 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:47.054034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-474j5\" (UniqueName: \"kubernetes.io/projected/1bd41f15-d9ba-415b-b475-f1f3fe80795f-kube-api-access-474j5\") pod \"authorino-7b7fb46454-4fr88\" (UID: \"1bd41f15-d9ba-415b-b475-f1f3fe80795f\") " pod="kuadrant-system/authorino-7b7fb46454-4fr88"
Apr 18 02:56:47.054156 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:47.054097 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1bd41f15-d9ba-415b-b475-f1f3fe80795f-tls-cert\") pod \"authorino-7b7fb46454-4fr88\" (UID: \"1bd41f15-d9ba-415b-b475-f1f3fe80795f\") " pod="kuadrant-system/authorino-7b7fb46454-4fr88"
Apr 18 02:56:47.148276 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:47.148244 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-z48lq" event={"ID":"4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95","Type":"ContainerStarted","Data":"aaaa304a6c650cc804b384059cd2b5a578e848f44b1d44bcd3d58b762bfbb6f9"}
Apr 18 02:56:47.154671 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:47.154646 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1bd41f15-d9ba-415b-b475-f1f3fe80795f-tls-cert\") pod \"authorino-7b7fb46454-4fr88\" (UID: \"1bd41f15-d9ba-415b-b475-f1f3fe80795f\") " pod="kuadrant-system/authorino-7b7fb46454-4fr88"
Apr 18 02:56:47.154738 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:47.154700 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-474j5\" (UniqueName: \"kubernetes.io/projected/1bd41f15-d9ba-415b-b475-f1f3fe80795f-kube-api-access-474j5\") pod \"authorino-7b7fb46454-4fr88\" (UID: \"1bd41f15-d9ba-415b-b475-f1f3fe80795f\") " pod="kuadrant-system/authorino-7b7fb46454-4fr88"
Apr 18 02:56:47.156975 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:47.156953 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1bd41f15-d9ba-415b-b475-f1f3fe80795f-tls-cert\") pod \"authorino-7b7fb46454-4fr88\" (UID: \"1bd41f15-d9ba-415b-b475-f1f3fe80795f\") " pod="kuadrant-system/authorino-7b7fb46454-4fr88"
Apr 18 02:56:47.162129 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:47.162105 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-474j5\" (UniqueName: \"kubernetes.io/projected/1bd41f15-d9ba-415b-b475-f1f3fe80795f-kube-api-access-474j5\") pod \"authorino-7b7fb46454-4fr88\" (UID: \"1bd41f15-d9ba-415b-b475-f1f3fe80795f\") " pod="kuadrant-system/authorino-7b7fb46454-4fr88"
Apr 18 02:56:47.263237 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:47.263208 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7b7fb46454-4fr88"
Apr 18 02:56:47.378203 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:47.378173 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7b7fb46454-4fr88"]
Apr 18 02:56:47.379443 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:56:47.379417 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bd41f15_d9ba_415b_b475_f1f3fe80795f.slice/crio-059534f9bd91e61a7691094f5d1f604b0453d4e0e917d452c4ba9cebb9534506 WatchSource:0}: Error finding container 059534f9bd91e61a7691094f5d1f604b0453d4e0e917d452c4ba9cebb9534506: Status 404 returned error can't find the container with id 059534f9bd91e61a7691094f5d1f604b0453d4e0e917d452c4ba9cebb9534506
Apr 18 02:56:48.153019 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:48.152978 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-z48lq" event={"ID":"4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95","Type":"ContainerStarted","Data":"4423f89c6b27421d31c07cb0fc705ffb3f3193be5c229f17c5c5c98b7f03b6ba"}
Apr 18 02:56:48.153019 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:48.153009 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-z48lq" podUID="4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95" containerName="authorino" containerID="cri-o://4423f89c6b27421d31c07cb0fc705ffb3f3193be5c229f17c5c5c98b7f03b6ba" gracePeriod=30
Apr 18 02:56:48.154324 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:48.154301 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7b7fb46454-4fr88" event={"ID":"1bd41f15-d9ba-415b-b475-f1f3fe80795f","Type":"ContainerStarted","Data":"f10ecd8e7e026da9c6f0922242f9e9d1a49c81932f48fc3506368888662228f3"}
Apr 18 02:56:48.154432 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:48.154327 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7b7fb46454-4fr88" event={"ID":"1bd41f15-d9ba-415b-b475-f1f3fe80795f","Type":"ContainerStarted","Data":"059534f9bd91e61a7691094f5d1f604b0453d4e0e917d452c4ba9cebb9534506"}
Apr 18 02:56:48.167710 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:48.167658 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-z48lq" podStartSLOduration=1.8108468530000001 podStartE2EDuration="2.167621933s" podCreationTimestamp="2026-04-18 02:56:46 +0000 UTC" firstStartedPulling="2026-04-18 02:56:47.040453662 +0000 UTC m=+666.546915915" lastFinishedPulling="2026-04-18 02:56:47.397228726 +0000 UTC m=+666.903690995" observedRunningTime="2026-04-18 02:56:48.16569781 +0000 UTC m=+667.672160084" watchObservedRunningTime="2026-04-18 02:56:48.167621933 +0000 UTC m=+667.674084211"
Apr 18 02:56:48.180432 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:48.180385 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7b7fb46454-4fr88" podStartSLOduration=1.84382871 podStartE2EDuration="2.180372575s" podCreationTimestamp="2026-04-18 02:56:46 +0000 UTC" firstStartedPulling="2026-04-18 02:56:47.38080353 +0000 UTC m=+666.887265782" lastFinishedPulling="2026-04-18 02:56:47.717347382 +0000 UTC m=+667.223809647" observedRunningTime="2026-04-18 02:56:48.178938319 +0000 UTC m=+667.685400594" watchObservedRunningTime="2026-04-18 02:56:48.180372575 +0000 UTC m=+667.686834849"
Apr 18 02:56:48.386912 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:48.386887 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-z48lq"
Apr 18 02:56:48.466439 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:48.466365 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxrgv\" (UniqueName: \"kubernetes.io/projected/4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95-kube-api-access-nxrgv\") pod \"4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95\" (UID: \"4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95\") "
Apr 18 02:56:48.468452 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:48.468414 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95-kube-api-access-nxrgv" (OuterVolumeSpecName: "kube-api-access-nxrgv") pod "4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95" (UID: "4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95"). InnerVolumeSpecName "kube-api-access-nxrgv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 18 02:56:48.567328 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:48.567283 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nxrgv\" (UniqueName: \"kubernetes.io/projected/4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95-kube-api-access-nxrgv\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\""
Apr 18 02:56:49.159144 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:49.159112 2575 generic.go:358] "Generic (PLEG): container finished" podID="4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95" containerID="4423f89c6b27421d31c07cb0fc705ffb3f3193be5c229f17c5c5c98b7f03b6ba" exitCode=0
Apr 18 02:56:49.159302 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:49.159163 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-z48lq"
Apr 18 02:56:49.159302 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:49.159196 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-z48lq" event={"ID":"4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95","Type":"ContainerDied","Data":"4423f89c6b27421d31c07cb0fc705ffb3f3193be5c229f17c5c5c98b7f03b6ba"}
Apr 18 02:56:49.159302 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:49.159230 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-z48lq" event={"ID":"4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95","Type":"ContainerDied","Data":"aaaa304a6c650cc804b384059cd2b5a578e848f44b1d44bcd3d58b762bfbb6f9"}
Apr 18 02:56:49.159302 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:49.159246 2575 scope.go:117] "RemoveContainer" containerID="4423f89c6b27421d31c07cb0fc705ffb3f3193be5c229f17c5c5c98b7f03b6ba"
Apr 18 02:56:49.168147 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:49.168130 2575 scope.go:117] "RemoveContainer" containerID="4423f89c6b27421d31c07cb0fc705ffb3f3193be5c229f17c5c5c98b7f03b6ba"
Apr 18 02:56:49.168407 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:56:49.168389 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4423f89c6b27421d31c07cb0fc705ffb3f3193be5c229f17c5c5c98b7f03b6ba\": container with ID starting with 4423f89c6b27421d31c07cb0fc705ffb3f3193be5c229f17c5c5c98b7f03b6ba not found: ID does not exist" containerID="4423f89c6b27421d31c07cb0fc705ffb3f3193be5c229f17c5c5c98b7f03b6ba"
Apr 18 02:56:49.168450 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:49.168416 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4423f89c6b27421d31c07cb0fc705ffb3f3193be5c229f17c5c5c98b7f03b6ba"} err="failed to get container status \"4423f89c6b27421d31c07cb0fc705ffb3f3193be5c229f17c5c5c98b7f03b6ba\": rpc error: code = NotFound desc = could not find container \"4423f89c6b27421d31c07cb0fc705ffb3f3193be5c229f17c5c5c98b7f03b6ba\": container with ID starting with 4423f89c6b27421d31c07cb0fc705ffb3f3193be5c229f17c5c5c98b7f03b6ba not found: ID does not exist"
Apr 18 02:56:49.179050 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:49.179030 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-z48lq"]
Apr 18 02:56:49.180842 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:49.180821 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-z48lq"]
Apr 18 02:56:51.175093 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:56:51.175059 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95" path="/var/lib/kubelet/pods/4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95/volumes"
Apr 18 02:57:47.567974 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.567939 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"]
Apr 18 02:57:47.568374 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.568253 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95" containerName="authorino"
Apr 18 02:57:47.568374 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.568264 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95" containerName="authorino"
Apr 18 02:57:47.568374 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.568327 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f9bc7f9-7b20-4b0b-96c3-cdcf5b254c95" containerName="authorino"
Apr 18 02:57:47.572499 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.572481 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.574852 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.574824 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-62ckn\""
Apr 18 02:57:47.574999 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.574834 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 18 02:57:47.575804 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.575779 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 18 02:57:47.575922 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.575819 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 18 02:57:47.578967 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.578942 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"]
Apr 18 02:57:47.672826 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.672798 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/eeebad60-9e1d-4213-9ea7-dd47ba121203-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.672971 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.672833 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eeebad60-9e1d-4213-9ea7-dd47ba121203-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.672971 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.672924 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz4ww\" (UniqueName: \"kubernetes.io/projected/eeebad60-9e1d-4213-9ea7-dd47ba121203-kube-api-access-kz4ww\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.673086 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.672995 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eeebad60-9e1d-4213-9ea7-dd47ba121203-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.673086 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.673021 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eeebad60-9e1d-4213-9ea7-dd47ba121203-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.673086 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.673042 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eeebad60-9e1d-4213-9ea7-dd47ba121203-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.773683 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.773653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eeebad60-9e1d-4213-9ea7-dd47ba121203-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.773831 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.773688 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eeebad60-9e1d-4213-9ea7-dd47ba121203-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.773831 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.773706 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eeebad60-9e1d-4213-9ea7-dd47ba121203-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.773831 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.773737 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/eeebad60-9e1d-4213-9ea7-dd47ba121203-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.773831 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.773766 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eeebad60-9e1d-4213-9ea7-dd47ba121203-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.774031 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.773837 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kz4ww\" (UniqueName: \"kubernetes.io/projected/eeebad60-9e1d-4213-9ea7-dd47ba121203-kube-api-access-kz4ww\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.774112 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.774088 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eeebad60-9e1d-4213-9ea7-dd47ba121203-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.774173 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.774128 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eeebad60-9e1d-4213-9ea7-dd47ba121203-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.774211 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.774186 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eeebad60-9e1d-4213-9ea7-dd47ba121203-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.776067 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.776039 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/eeebad60-9e1d-4213-9ea7-dd47ba121203-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.776270 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.776255 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eeebad60-9e1d-4213-9ea7-dd47ba121203-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.782108 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.782090 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz4ww\" (UniqueName: \"kubernetes.io/projected/eeebad60-9e1d-4213-9ea7-dd47ba121203-kube-api-access-kz4ww\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th\" (UID: \"eeebad60-9e1d-4213-9ea7-dd47ba121203\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:47.885419 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:47.885339 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
Apr 18 02:57:48.004826 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:48.004801 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"]
Apr 18 02:57:48.007015 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:57:48.006984 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeebad60_9e1d_4213_9ea7_dd47ba121203.slice/crio-bde68623bfd61292e48c59af733d9203b1084ce699e6fb10f9ef40b89ef396dc WatchSource:0}: Error finding container bde68623bfd61292e48c59af733d9203b1084ce699e6fb10f9ef40b89ef396dc: Status 404 returned error can't find the container with id bde68623bfd61292e48c59af733d9203b1084ce699e6fb10f9ef40b89ef396dc
Apr 18 02:57:48.339881 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:48.339838 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th" event={"ID":"eeebad60-9e1d-4213-9ea7-dd47ba121203","Type":"ContainerStarted","Data":"bde68623bfd61292e48c59af733d9203b1084ce699e6fb10f9ef40b89ef396dc"}
Apr 18 02:57:53.270772 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.270728 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"]
Apr 18 02:57:53.274594 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.274573 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.276934 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.276914 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\""
Apr 18 02:57:53.281047 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.281018 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"]
Apr 18 02:57:53.323280 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.323237 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb3cd95c-286c-42d4-9720-4aa95005508d-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.323444 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.323359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb3cd95c-286c-42d4-9720-4aa95005508d-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.323444 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.323416 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fb3cd95c-286c-42d4-9720-4aa95005508d-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.323444 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.323437 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3cd95c-286c-42d4-9720-4aa95005508d-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.323561 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.323465 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fb3cd95c-286c-42d4-9720-4aa95005508d-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.323602 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.323575 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crjk9\" (UniqueName: \"kubernetes.io/projected/fb3cd95c-286c-42d4-9720-4aa95005508d-kube-api-access-crjk9\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.359418 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.359370 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th" event={"ID":"eeebad60-9e1d-4213-9ea7-dd47ba121203","Type":"ContainerStarted","Data":"2494da727aee1634ad62a2778c46d1b8684b5908f28f8ea6f8add70c62699a6b"}
Apr 18 02:57:53.425138 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.425098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crjk9\" (UniqueName: \"kubernetes.io/projected/fb3cd95c-286c-42d4-9720-4aa95005508d-kube-api-access-crjk9\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.425322 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.425161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb3cd95c-286c-42d4-9720-4aa95005508d-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.425322 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.425224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb3cd95c-286c-42d4-9720-4aa95005508d-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.425322 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.425253 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fb3cd95c-286c-42d4-9720-4aa95005508d-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.425715 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.425683 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb3cd95c-286c-42d4-9720-4aa95005508d-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.425841 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.425715 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb3cd95c-286c-42d4-9720-4aa95005508d-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.425841 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.425759 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fb3cd95c-286c-42d4-9720-4aa95005508d-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.425841 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.425790 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3cd95c-286c-42d4-9720-4aa95005508d-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.426019 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.425842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fb3cd95c-286c-42d4-9720-4aa95005508d-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.427998 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.427968 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fb3cd95c-286c-42d4-9720-4aa95005508d-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.428413 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.428394 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3cd95c-286c-42d4-9720-4aa95005508d-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.432606 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.432571 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crjk9\" (UniqueName: \"kubernetes.io/projected/fb3cd95c-286c-42d4-9720-4aa95005508d-kube-api-access-crjk9\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls\" (UID: \"fb3cd95c-286c-42d4-9720-4aa95005508d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.586524 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.586433 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"
Apr 18 02:57:53.710544 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:53.710518 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls"]
Apr 18 02:57:53.713506 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:57:53.713465 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb3cd95c_286c_42d4_9720_4aa95005508d.slice/crio-a34770c12423c9bdb7617e1252801f2cb20716b7133cbce8b0fc6be6c0b14aa3 WatchSource:0}: Error finding container a34770c12423c9bdb7617e1252801f2cb20716b7133cbce8b0fc6be6c0b14aa3: Status 404 returned error can't find the container with id a34770c12423c9bdb7617e1252801f2cb20716b7133cbce8b0fc6be6c0b14aa3
Apr 18 02:57:54.364693 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:54.364651 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls" event={"ID":"fb3cd95c-286c-42d4-9720-4aa95005508d","Type":"ContainerStarted","Data":"de7e2644378d38baf8d2839b979ec3fc52c7b8186a8e76721851f565cf14cfdb"}
Apr 18 02:57:54.365168 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:54.364698 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls" event={"ID":"fb3cd95c-286c-42d4-9720-4aa95005508d","Type":"ContainerStarted","Data":"a34770c12423c9bdb7617e1252801f2cb20716b7133cbce8b0fc6be6c0b14aa3"}
Apr 18 02:57:59.382058 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:59.381967 2575 generic.go:358] "Generic (PLEG): container finished" podID="eeebad60-9e1d-4213-9ea7-dd47ba121203" containerID="2494da727aee1634ad62a2778c46d1b8684b5908f28f8ea6f8add70c62699a6b" exitCode=0
Apr 18 02:57:59.382058 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:59.382037 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th" event={"ID":"eeebad60-9e1d-4213-9ea7-dd47ba121203","Type":"ContainerDied","Data":"2494da727aee1634ad62a2778c46d1b8684b5908f28f8ea6f8add70c62699a6b"}
Apr 18 02:57:59.383462 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:59.383242 2575 generic.go:358] "Generic (PLEG): container finished" podID="fb3cd95c-286c-42d4-9720-4aa95005508d" containerID="de7e2644378d38baf8d2839b979ec3fc52c7b8186a8e76721851f565cf14cfdb" exitCode=0
Apr 18 02:57:59.383462 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:57:59.383304 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls" event={"ID":"fb3cd95c-286c-42d4-9720-4aa95005508d","Type":"ContainerDied","Data":"de7e2644378d38baf8d2839b979ec3fc52c7b8186a8e76721851f565cf14cfdb"}
Apr 18 02:58:01.392283 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:01.392246 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th"
event={"ID":"eeebad60-9e1d-4213-9ea7-dd47ba121203","Type":"ContainerStarted","Data":"8ccc00e1bb4b03e1ab4463d211e0ea2baf816f8fffa70b5814be7539229dbece"} Apr 18 02:58:01.392783 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:01.392502 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th" Apr 18 02:58:01.393753 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:01.393731 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls" event={"ID":"fb3cd95c-286c-42d4-9720-4aa95005508d","Type":"ContainerStarted","Data":"597b6eb699dd2f0682c6f451b0f3f75d1b5399cd115e794e7d231c457202bb13"} Apr 18 02:58:01.393943 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:01.393926 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls" Apr 18 02:58:01.410305 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:01.410256 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th" podStartSLOduration=1.931839739 podStartE2EDuration="14.410243828s" podCreationTimestamp="2026-04-18 02:57:47 +0000 UTC" firstStartedPulling="2026-04-18 02:57:48.00887747 +0000 UTC m=+727.515339727" lastFinishedPulling="2026-04-18 02:58:00.487281564 +0000 UTC m=+739.993743816" observedRunningTime="2026-04-18 02:58:01.407740036 +0000 UTC m=+740.914202309" watchObservedRunningTime="2026-04-18 02:58:01.410243828 +0000 UTC m=+740.916706101" Apr 18 02:58:01.422833 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:01.422796 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls" podStartSLOduration=7.323831062 podStartE2EDuration="8.422783776s" podCreationTimestamp="2026-04-18 02:57:53 +0000 UTC" firstStartedPulling="2026-04-18 02:57:59.383926546 +0000 UTC 
m=+738.890388798" lastFinishedPulling="2026-04-18 02:58:00.48287926 +0000 UTC m=+739.989341512" observedRunningTime="2026-04-18 02:58:01.422086379 +0000 UTC m=+740.928548652" watchObservedRunningTime="2026-04-18 02:58:01.422783776 +0000 UTC m=+740.929246049" Apr 18 02:58:12.409952 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:12.409922 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls" Apr 18 02:58:12.410623 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:12.410600 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th" Apr 18 02:58:42.388580 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.388542 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w"] Apr 18 02:58:42.392114 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.392095 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.395719 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.395698 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 18 02:58:42.403289 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.402682 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w"] Apr 18 02:58:42.462198 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.462165 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjscn\" (UniqueName: \"kubernetes.io/projected/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-kube-api-access-fjscn\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.462342 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.462212 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.462342 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.462311 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.462421 
ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.462358 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.462421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.462375 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.462421 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.462395 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.563763 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.563719 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.563896 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.563784 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.563896 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.563803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.563963 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.563943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.564061 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.564043 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjscn\" (UniqueName: \"kubernetes.io/projected/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-kube-api-access-fjscn\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.564115 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.564099 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.564201 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.564182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.564267 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.564231 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.564267 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.564259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.566052 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.566028 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.566401 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.566384 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.570546 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.570520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjscn\" (UniqueName: \"kubernetes.io/projected/2cd373b0-e827-4b53-b7b7-db83b5ffdeeb-kube-api-access-fjscn\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-7js5w\" (UID: \"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:42.706749 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:42.706663 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:43.035348 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:43.035317 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w"] Apr 18 02:58:43.037708 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:58:43.037678 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cd373b0_e827_4b53_b7b7_db83b5ffdeeb.slice/crio-3e896934d3ff4efd3135fe7fb7ba60cd83e132cdaac5a85508d4cf4a7004bf3c WatchSource:0}: Error finding container 3e896934d3ff4efd3135fe7fb7ba60cd83e132cdaac5a85508d4cf4a7004bf3c: Status 404 returned error can't find the container with id 3e896934d3ff4efd3135fe7fb7ba60cd83e132cdaac5a85508d4cf4a7004bf3c Apr 18 02:58:43.042538 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:43.042513 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 18 02:58:43.531028 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:43.530983 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" event={"ID":"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb","Type":"ContainerStarted","Data":"0481b39518d23be8bb01930b4e361c203b2948767cd2a3cedeed171f4a8f6222"} Apr 18 02:58:43.531380 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:43.531035 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" event={"ID":"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb","Type":"ContainerStarted","Data":"3e896934d3ff4efd3135fe7fb7ba60cd83e132cdaac5a85508d4cf4a7004bf3c"} Apr 18 02:58:51.565425 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:51.565388 2575 generic.go:358] "Generic (PLEG): container finished" podID="2cd373b0-e827-4b53-b7b7-db83b5ffdeeb" 
containerID="0481b39518d23be8bb01930b4e361c203b2948767cd2a3cedeed171f4a8f6222" exitCode=0 Apr 18 02:58:51.566022 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:51.565461 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" event={"ID":"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb","Type":"ContainerDied","Data":"0481b39518d23be8bb01930b4e361c203b2948767cd2a3cedeed171f4a8f6222"} Apr 18 02:58:52.572393 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:52.572355 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" event={"ID":"2cd373b0-e827-4b53-b7b7-db83b5ffdeeb","Type":"ContainerStarted","Data":"34208918ce9ef875a8f46bce3b9f59027941fba2a384118774355511535a8391"} Apr 18 02:58:52.572826 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:52.572567 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:58:52.589802 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:58:52.589754 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" podStartSLOduration=10.4273752 podStartE2EDuration="10.5897401s" podCreationTimestamp="2026-04-18 02:58:42 +0000 UTC" firstStartedPulling="2026-04-18 02:58:51.566067535 +0000 UTC m=+791.072529788" lastFinishedPulling="2026-04-18 02:58:51.728432436 +0000 UTC m=+791.234894688" observedRunningTime="2026-04-18 02:58:52.588564912 +0000 UTC m=+792.095027185" watchObservedRunningTime="2026-04-18 02:58:52.5897401 +0000 UTC m=+792.096202375" Apr 18 02:59:03.588568 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:03.588493 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-7js5w" Apr 18 02:59:04.408077 ip-10-0-140-103 kubenswrapper[2575]: I0418 
02:59:04.408044 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-85f86c8bb7-mmjn2"] Apr 18 02:59:04.420351 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:04.420325 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-85f86c8bb7-mmjn2"] Apr 18 02:59:04.420475 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:04.420452 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-85f86c8bb7-mmjn2" Apr 18 02:59:04.462899 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:04.462868 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kgtj\" (UniqueName: \"kubernetes.io/projected/39bd10d2-3a91-4a62-b379-dbd113cecd80-kube-api-access-4kgtj\") pod \"authorino-85f86c8bb7-mmjn2\" (UID: \"39bd10d2-3a91-4a62-b379-dbd113cecd80\") " pod="kuadrant-system/authorino-85f86c8bb7-mmjn2" Apr 18 02:59:04.463053 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:04.462915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/39bd10d2-3a91-4a62-b379-dbd113cecd80-tls-cert\") pod \"authorino-85f86c8bb7-mmjn2\" (UID: \"39bd10d2-3a91-4a62-b379-dbd113cecd80\") " pod="kuadrant-system/authorino-85f86c8bb7-mmjn2" Apr 18 02:59:04.563757 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:04.563728 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kgtj\" (UniqueName: \"kubernetes.io/projected/39bd10d2-3a91-4a62-b379-dbd113cecd80-kube-api-access-4kgtj\") pod \"authorino-85f86c8bb7-mmjn2\" (UID: \"39bd10d2-3a91-4a62-b379-dbd113cecd80\") " pod="kuadrant-system/authorino-85f86c8bb7-mmjn2" Apr 18 02:59:04.563894 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:04.563771 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: 
\"kubernetes.io/secret/39bd10d2-3a91-4a62-b379-dbd113cecd80-tls-cert\") pod \"authorino-85f86c8bb7-mmjn2\" (UID: \"39bd10d2-3a91-4a62-b379-dbd113cecd80\") " pod="kuadrant-system/authorino-85f86c8bb7-mmjn2" Apr 18 02:59:04.566164 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:04.566138 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/39bd10d2-3a91-4a62-b379-dbd113cecd80-tls-cert\") pod \"authorino-85f86c8bb7-mmjn2\" (UID: \"39bd10d2-3a91-4a62-b379-dbd113cecd80\") " pod="kuadrant-system/authorino-85f86c8bb7-mmjn2" Apr 18 02:59:04.571007 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:04.570983 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kgtj\" (UniqueName: \"kubernetes.io/projected/39bd10d2-3a91-4a62-b379-dbd113cecd80-kube-api-access-4kgtj\") pod \"authorino-85f86c8bb7-mmjn2\" (UID: \"39bd10d2-3a91-4a62-b379-dbd113cecd80\") " pod="kuadrant-system/authorino-85f86c8bb7-mmjn2" Apr 18 02:59:04.730568 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:04.730495 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-85f86c8bb7-mmjn2" Apr 18 02:59:04.845184 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:04.845159 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-85f86c8bb7-mmjn2"] Apr 18 02:59:04.847101 ip-10-0-140-103 kubenswrapper[2575]: W0418 02:59:04.847073 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39bd10d2_3a91_4a62_b379_dbd113cecd80.slice/crio-f7040e05c8063fb276d1f9c2bad4010f8488c04c9a616aa84d6e3f0394601e06 WatchSource:0}: Error finding container f7040e05c8063fb276d1f9c2bad4010f8488c04c9a616aa84d6e3f0394601e06: Status 404 returned error can't find the container with id f7040e05c8063fb276d1f9c2bad4010f8488c04c9a616aa84d6e3f0394601e06 Apr 18 02:59:05.617274 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:05.617245 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85f86c8bb7-mmjn2" event={"ID":"39bd10d2-3a91-4a62-b379-dbd113cecd80","Type":"ContainerStarted","Data":"f7040e05c8063fb276d1f9c2bad4010f8488c04c9a616aa84d6e3f0394601e06"} Apr 18 02:59:06.621379 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:06.621342 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85f86c8bb7-mmjn2" event={"ID":"39bd10d2-3a91-4a62-b379-dbd113cecd80","Type":"ContainerStarted","Data":"85ccf2c80c7fc7bb9bb814b296cd5946d800e779adf2ce6ac7c86a8018c13ade"} Apr 18 02:59:06.635886 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:06.635838 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-85f86c8bb7-mmjn2" podStartSLOduration=1.96713903 podStartE2EDuration="2.635824949s" podCreationTimestamp="2026-04-18 02:59:04 +0000 UTC" firstStartedPulling="2026-04-18 02:59:04.848412651 +0000 UTC m=+804.354874902" lastFinishedPulling="2026-04-18 02:59:05.517098568 +0000 UTC m=+805.023560821" 
observedRunningTime="2026-04-18 02:59:06.634904108 +0000 UTC m=+806.141366416" watchObservedRunningTime="2026-04-18 02:59:06.635824949 +0000 UTC m=+806.142287259" Apr 18 02:59:06.660062 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:06.660025 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7b7fb46454-4fr88"] Apr 18 02:59:06.660286 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:06.660257 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7b7fb46454-4fr88" podUID="1bd41f15-d9ba-415b-b475-f1f3fe80795f" containerName="authorino" containerID="cri-o://f10ecd8e7e026da9c6f0922242f9e9d1a49c81932f48fc3506368888662228f3" gracePeriod=30 Apr 18 02:59:06.933701 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:06.933676 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7b7fb46454-4fr88" Apr 18 02:59:06.988618 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:06.988579 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1bd41f15-d9ba-415b-b475-f1f3fe80795f-tls-cert\") pod \"1bd41f15-d9ba-415b-b475-f1f3fe80795f\" (UID: \"1bd41f15-d9ba-415b-b475-f1f3fe80795f\") " Apr 18 02:59:06.988795 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:06.988660 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-474j5\" (UniqueName: \"kubernetes.io/projected/1bd41f15-d9ba-415b-b475-f1f3fe80795f-kube-api-access-474j5\") pod \"1bd41f15-d9ba-415b-b475-f1f3fe80795f\" (UID: \"1bd41f15-d9ba-415b-b475-f1f3fe80795f\") " Apr 18 02:59:06.990574 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:06.990538 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bd41f15-d9ba-415b-b475-f1f3fe80795f-kube-api-access-474j5" (OuterVolumeSpecName: "kube-api-access-474j5") pod 
"1bd41f15-d9ba-415b-b475-f1f3fe80795f" (UID: "1bd41f15-d9ba-415b-b475-f1f3fe80795f"). InnerVolumeSpecName "kube-api-access-474j5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:59:06.998255 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:06.998230 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd41f15-d9ba-415b-b475-f1f3fe80795f-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "1bd41f15-d9ba-415b-b475-f1f3fe80795f" (UID: "1bd41f15-d9ba-415b-b475-f1f3fe80795f"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 18 02:59:07.089755 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:07.089718 2575 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1bd41f15-d9ba-415b-b475-f1f3fe80795f-tls-cert\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\"" Apr 18 02:59:07.089755 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:07.089750 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-474j5\" (UniqueName: \"kubernetes.io/projected/1bd41f15-d9ba-415b-b475-f1f3fe80795f-kube-api-access-474j5\") on node \"ip-10-0-140-103.ec2.internal\" DevicePath \"\"" Apr 18 02:59:07.625137 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:07.625101 2575 generic.go:358] "Generic (PLEG): container finished" podID="1bd41f15-d9ba-415b-b475-f1f3fe80795f" containerID="f10ecd8e7e026da9c6f0922242f9e9d1a49c81932f48fc3506368888662228f3" exitCode=0 Apr 18 02:59:07.625588 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:07.625157 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7b7fb46454-4fr88"
Apr 18 02:59:07.625588 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:07.625190 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7b7fb46454-4fr88" event={"ID":"1bd41f15-d9ba-415b-b475-f1f3fe80795f","Type":"ContainerDied","Data":"f10ecd8e7e026da9c6f0922242f9e9d1a49c81932f48fc3506368888662228f3"}
Apr 18 02:59:07.625588 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:07.625227 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7b7fb46454-4fr88" event={"ID":"1bd41f15-d9ba-415b-b475-f1f3fe80795f","Type":"ContainerDied","Data":"059534f9bd91e61a7691094f5d1f604b0453d4e0e917d452c4ba9cebb9534506"}
Apr 18 02:59:07.625588 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:07.625243 2575 scope.go:117] "RemoveContainer" containerID="f10ecd8e7e026da9c6f0922242f9e9d1a49c81932f48fc3506368888662228f3"
Apr 18 02:59:07.633287 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:07.633271 2575 scope.go:117] "RemoveContainer" containerID="f10ecd8e7e026da9c6f0922242f9e9d1a49c81932f48fc3506368888662228f3"
Apr 18 02:59:07.633569 ip-10-0-140-103 kubenswrapper[2575]: E0418 02:59:07.633551 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f10ecd8e7e026da9c6f0922242f9e9d1a49c81932f48fc3506368888662228f3\": container with ID starting with f10ecd8e7e026da9c6f0922242f9e9d1a49c81932f48fc3506368888662228f3 not found: ID does not exist" containerID="f10ecd8e7e026da9c6f0922242f9e9d1a49c81932f48fc3506368888662228f3"
Apr 18 02:59:07.633617 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:07.633579 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f10ecd8e7e026da9c6f0922242f9e9d1a49c81932f48fc3506368888662228f3"} err="failed to get container status \"f10ecd8e7e026da9c6f0922242f9e9d1a49c81932f48fc3506368888662228f3\": rpc error: code = NotFound desc = could not find container \"f10ecd8e7e026da9c6f0922242f9e9d1a49c81932f48fc3506368888662228f3\": container with ID starting with f10ecd8e7e026da9c6f0922242f9e9d1a49c81932f48fc3506368888662228f3 not found: ID does not exist"
Apr 18 02:59:07.639626 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:07.639605 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7b7fb46454-4fr88"]
Apr 18 02:59:07.644295 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:07.644270 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7b7fb46454-4fr88"]
Apr 18 02:59:09.176292 ip-10-0-140-103 kubenswrapper[2575]: I0418 02:59:09.176248 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bd41f15-d9ba-415b-b475-f1f3fe80795f" path="/var/lib/kubelet/pods/1bd41f15-d9ba-415b-b475-f1f3fe80795f/volumes"
Apr 18 03:21:23.103463 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:23.103428 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-85f86c8bb7-mmjn2_39bd10d2-3a91-4a62-b379-dbd113cecd80/authorino/0.log"
Apr 18 03:21:27.426777 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:27.426741 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-b6bf46549-slrnc_1916a437-f847-43b2-8bce-a5201c06792c/manager/0.log"
Apr 18 03:21:27.794995 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:27.794943 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-d8gqj_0bc7b60d-c168-441f-91fe-a611a5cab969/postgres/0.log"
Apr 18 03:21:29.053133 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:29.053096 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-85f86c8bb7-mmjn2_39bd10d2-3a91-4a62-b379-dbd113cecd80/authorino/0.log"
Apr 18 03:21:29.383499 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:29.383427 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-6zg7v_d2fa858f-6f36-4082-8c99-5e5c7ae19770/kuadrant-console-plugin/0.log"
Apr 18 03:21:30.523184 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:30.523081 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-79f769b49f-lfmmq_6412bbcc-3769-49a0-9b29-2b79c0dfd386/kube-auth-proxy/0.log"
Apr 18 03:21:31.209682 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:31.209641 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls_fb3cd95c-286c-42d4-9720-4aa95005508d/storage-initializer/0.log"
Apr 18 03:21:31.217736 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:31.217707 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-hr2ls_fb3cd95c-286c-42d4-9720-4aa95005508d/main/0.log"
Apr 18 03:21:31.571329 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:31.571296 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th_eeebad60-9e1d-4213-9ea7-dd47ba121203/storage-initializer/0.log"
Apr 18 03:21:31.578659 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:31.578617 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-7s7th_eeebad60-9e1d-4213-9ea7-dd47ba121203/main/0.log"
Apr 18 03:21:31.690176 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:31.690143 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-7js5w_2cd373b0-e827-4b53-b7b7-db83b5ffdeeb/storage-initializer/0.log"
Apr 18 03:21:31.697683 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:31.697659 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-7js5w_2cd373b0-e827-4b53-b7b7-db83b5ffdeeb/main/0.log"
Apr 18 03:21:38.239347 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:38.239314 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-66knh_3797aadb-3c8b-4eda-9fbe-dce172b0710e/global-pull-secret-syncer/0.log"
Apr 18 03:21:38.491420 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:38.491310 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-c22bt_681e4af6-1800-4683-b967-ed0bb5c1f635/konnectivity-agent/0.log"
Apr 18 03:21:38.598406 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:38.598375 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-103.ec2.internal_881cc9f87b49496f030449959b9b3b21/haproxy/0.log"
Apr 18 03:21:43.077685 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:43.077650 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-85f86c8bb7-mmjn2_39bd10d2-3a91-4a62-b379-dbd113cecd80/authorino/0.log"
Apr 18 03:21:43.161323 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:43.161294 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-6zg7v_d2fa858f-6f36-4082-8c99-5e5c7ae19770/kuadrant-console-plugin/0.log"
Apr 18 03:21:45.029248 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:45.029221 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7ltnz_0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26/kube-state-metrics/0.log"
Apr 18 03:21:45.046565 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:45.046543 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7ltnz_0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26/kube-rbac-proxy-main/0.log"
Apr 18 03:21:45.073150 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:45.073130 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7ltnz_0338ba2e-ceb9-4c4d-a133-5f3eb8a5ba26/kube-rbac-proxy-self/0.log"
Apr 18 03:21:45.135139 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:45.135112 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-65gtb_daf6871f-e9b0-4fbf-a098-7d7d62a2bf28/monitoring-plugin/0.log"
Apr 18 03:21:45.282775 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:45.282645 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nqk4b_e821f22e-c172-40b8-95f5-11d1e334faae/node-exporter/0.log"
Apr 18 03:21:45.322148 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:45.322104 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nqk4b_e821f22e-c172-40b8-95f5-11d1e334faae/kube-rbac-proxy/0.log"
Apr 18 03:21:45.341142 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:45.341117 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nqk4b_e821f22e-c172-40b8-95f5-11d1e334faae/init-textfile/0.log"
Apr 18 03:21:45.509123 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:45.509090 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c4340655-e704-4963-8cbc-e0834cc2dc62/prometheus/0.log"
Apr 18 03:21:45.527430 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:45.527408 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c4340655-e704-4963-8cbc-e0834cc2dc62/config-reloader/0.log"
Apr 18 03:21:45.545941 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:45.545884 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c4340655-e704-4963-8cbc-e0834cc2dc62/thanos-sidecar/0.log"
Apr 18 03:21:45.564914 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:45.564887 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c4340655-e704-4963-8cbc-e0834cc2dc62/kube-rbac-proxy-web/0.log"
Apr 18 03:21:45.583872 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:45.583854 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c4340655-e704-4963-8cbc-e0834cc2dc62/kube-rbac-proxy/0.log"
Apr 18 03:21:45.602032 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:45.602008 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c4340655-e704-4963-8cbc-e0834cc2dc62/kube-rbac-proxy-thanos/0.log"
Apr 18 03:21:45.621573 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:45.621557 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c4340655-e704-4963-8cbc-e0834cc2dc62/init-config-reloader/0.log"
Apr 18 03:21:46.533232 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.533190 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"]
Apr 18 03:21:46.533710 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.533646 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1bd41f15-d9ba-415b-b475-f1f3fe80795f" containerName="authorino"
Apr 18 03:21:46.533710 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.533668 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd41f15-d9ba-415b-b475-f1f3fe80795f" containerName="authorino"
Apr 18 03:21:46.533782 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.533765 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1bd41f15-d9ba-415b-b475-f1f3fe80795f" containerName="authorino"
Apr 18 03:21:46.536777 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.536756 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:46.539215 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.539192 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8bclg\"/\"openshift-service-ca.crt\""
Apr 18 03:21:46.540073 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.540053 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8bclg\"/\"default-dockercfg-mpz9c\""
Apr 18 03:21:46.540159 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.540069 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8bclg\"/\"kube-root-ca.crt\""
Apr 18 03:21:46.546453 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.546427 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"]
Apr 18 03:21:46.581208 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.581184 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/555a83ea-9156-4c9d-b38a-245c1ce045a9-lib-modules\") pod \"perf-node-gather-daemonset-smxl8\" (UID: \"555a83ea-9156-4c9d-b38a-245c1ce045a9\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:46.581308 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.581214 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqwps\" (UniqueName: \"kubernetes.io/projected/555a83ea-9156-4c9d-b38a-245c1ce045a9-kube-api-access-bqwps\") pod \"perf-node-gather-daemonset-smxl8\" (UID: \"555a83ea-9156-4c9d-b38a-245c1ce045a9\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:46.581308 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.581237 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/555a83ea-9156-4c9d-b38a-245c1ce045a9-podres\") pod \"perf-node-gather-daemonset-smxl8\" (UID: \"555a83ea-9156-4c9d-b38a-245c1ce045a9\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:46.581436 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.581410 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/555a83ea-9156-4c9d-b38a-245c1ce045a9-proc\") pod \"perf-node-gather-daemonset-smxl8\" (UID: \"555a83ea-9156-4c9d-b38a-245c1ce045a9\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:46.581471 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.581455 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/555a83ea-9156-4c9d-b38a-245c1ce045a9-sys\") pod \"perf-node-gather-daemonset-smxl8\" (UID: \"555a83ea-9156-4c9d-b38a-245c1ce045a9\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:46.682099 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.682065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/555a83ea-9156-4c9d-b38a-245c1ce045a9-proc\") pod \"perf-node-gather-daemonset-smxl8\" (UID: \"555a83ea-9156-4c9d-b38a-245c1ce045a9\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:46.682255 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.682111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/555a83ea-9156-4c9d-b38a-245c1ce045a9-sys\") pod \"perf-node-gather-daemonset-smxl8\" (UID: \"555a83ea-9156-4c9d-b38a-245c1ce045a9\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:46.682255 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.682162 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/555a83ea-9156-4c9d-b38a-245c1ce045a9-lib-modules\") pod \"perf-node-gather-daemonset-smxl8\" (UID: \"555a83ea-9156-4c9d-b38a-245c1ce045a9\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:46.682255 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.682187 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqwps\" (UniqueName: \"kubernetes.io/projected/555a83ea-9156-4c9d-b38a-245c1ce045a9-kube-api-access-bqwps\") pod \"perf-node-gather-daemonset-smxl8\" (UID: \"555a83ea-9156-4c9d-b38a-245c1ce045a9\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:46.682255 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.682200 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/555a83ea-9156-4c9d-b38a-245c1ce045a9-proc\") pod \"perf-node-gather-daemonset-smxl8\" (UID: \"555a83ea-9156-4c9d-b38a-245c1ce045a9\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:46.682255 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.682227 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/555a83ea-9156-4c9d-b38a-245c1ce045a9-podres\") pod \"perf-node-gather-daemonset-smxl8\" (UID: \"555a83ea-9156-4c9d-b38a-245c1ce045a9\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:46.682501 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.682340 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/555a83ea-9156-4c9d-b38a-245c1ce045a9-lib-modules\") pod \"perf-node-gather-daemonset-smxl8\" (UID: \"555a83ea-9156-4c9d-b38a-245c1ce045a9\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:46.682501 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.682382 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/555a83ea-9156-4c9d-b38a-245c1ce045a9-sys\") pod \"perf-node-gather-daemonset-smxl8\" (UID: \"555a83ea-9156-4c9d-b38a-245c1ce045a9\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:46.682501 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.682390 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/555a83ea-9156-4c9d-b38a-245c1ce045a9-podres\") pod \"perf-node-gather-daemonset-smxl8\" (UID: \"555a83ea-9156-4c9d-b38a-245c1ce045a9\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:46.689488 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.689464 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqwps\" (UniqueName: \"kubernetes.io/projected/555a83ea-9156-4c9d-b38a-245c1ce045a9-kube-api-access-bqwps\") pod \"perf-node-gather-daemonset-smxl8\" (UID: \"555a83ea-9156-4c9d-b38a-245c1ce045a9\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:46.848378 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.848286 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:46.965226 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.965140 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"]
Apr 18 03:21:46.967758 ip-10-0-140-103 kubenswrapper[2575]: W0418 03:21:46.967727 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod555a83ea_9156_4c9d_b38a_245c1ce045a9.slice/crio-9c5e789f2bb0c988b9be7d245a7fc7056c9cbd84b682c96cc7008f6a380148d7 WatchSource:0}: Error finding container 9c5e789f2bb0c988b9be7d245a7fc7056c9cbd84b682c96cc7008f6a380148d7: Status 404 returned error can't find the container with id 9c5e789f2bb0c988b9be7d245a7fc7056c9cbd84b682c96cc7008f6a380148d7
Apr 18 03:21:46.969323 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.969298 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 18 03:21:46.976587 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:46.976559 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-bfhrs_3c20ea1d-8a16-44d1-8cb9-0310ba00246c/networking-console-plugin/0.log"
Apr 18 03:21:47.108221 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:47.108151 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8" event={"ID":"555a83ea-9156-4c9d-b38a-245c1ce045a9","Type":"ContainerStarted","Data":"51d7c06a51ae906272aac189394be1120dc7bbfb2fce307c64fc1c92b0538578"}
Apr 18 03:21:47.108221 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:47.108187 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8" event={"ID":"555a83ea-9156-4c9d-b38a-245c1ce045a9","Type":"ContainerStarted","Data":"9c5e789f2bb0c988b9be7d245a7fc7056c9cbd84b682c96cc7008f6a380148d7"}
Apr 18 03:21:47.108221 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:47.108214 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:47.122604 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:47.122556 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8" podStartSLOduration=1.122542799 podStartE2EDuration="1.122542799s" podCreationTimestamp="2026-04-18 03:21:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 03:21:47.121248162 +0000 UTC m=+2166.627710437" watchObservedRunningTime="2026-04-18 03:21:47.122542799 +0000 UTC m=+2166.629005072"
Apr 18 03:21:49.201777 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:49.201746 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-svnzk_70a58891-387a-4f51-a0bf-e2abf38cf891/dns/0.log"
Apr 18 03:21:49.220207 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:49.220184 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-svnzk_70a58891-387a-4f51-a0bf-e2abf38cf891/kube-rbac-proxy/0.log"
Apr 18 03:21:49.240080 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:49.240059 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5nsl5_36d98916-52f7-469a-a004-8f19f9739057/dns-node-resolver/0.log"
Apr 18 03:21:49.709524 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:49.709486 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5cb9d77ddb-7p2zw_4e60772a-9186-4b5d-a11e-dd851e220d6f/registry/0.log"
Apr 18 03:21:49.769139 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:49.769102 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kbc68_a03893d3-b1b4-4eac-93ff-24aa692c02ec/node-ca/0.log"
Apr 18 03:21:50.667671 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:50.667642 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-79f769b49f-lfmmq_6412bbcc-3769-49a0-9b29-2b79c0dfd386/kube-auth-proxy/0.log"
Apr 18 03:21:51.278688 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:51.278658 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-znc79_6e67776e-f785-425c-9c6b-56b4c10fccaa/serve-healthcheck-canary/0.log"
Apr 18 03:21:51.826718 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:51.826689 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qct7q_e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae/kube-rbac-proxy/0.log"
Apr 18 03:21:51.844361 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:51.844332 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qct7q_e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae/exporter/0.log"
Apr 18 03:21:51.866204 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:51.866176 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qct7q_e31d5b95-e8ca-43c7-8ab9-bfe113cc6eae/extractor/0.log"
Apr 18 03:21:53.121560 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:53.121533 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-smxl8"
Apr 18 03:21:53.787464 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:53.787422 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-b6bf46549-slrnc_1916a437-f847-43b2-8bce-a5201c06792c/manager/0.log"
Apr 18 03:21:53.871138 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:53.871100 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-d8gqj_0bc7b60d-c168-441f-91fe-a611a5cab969/postgres/0.log"
Apr 18 03:21:55.124388 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:55.124350 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5dd789dc9-vhqp7_bde75da2-0078-424b-bc43-b20c14e7b952/manager/0.log"
Apr 18 03:21:59.251564 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:59.251522 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-9bz6g_a240b4fc-e94f-406b-a350-71efd88049dc/migrator/0.log"
Apr 18 03:21:59.268263 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:59.268240 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-9bz6g_a240b4fc-e94f-406b-a350-71efd88049dc/graceful-termination/0.log"
Apr 18 03:21:59.612917 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:59.612884 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-chlsj_b798241c-d3d5-4425-8f62-3533926b33a3/kube-storage-version-migrator-operator/1.log"
Apr 18 03:21:59.614428 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:21:59.614403 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-chlsj_b798241c-d3d5-4425-8f62-3533926b33a3/kube-storage-version-migrator-operator/0.log"
Apr 18 03:22:00.930104 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:00.930066 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rcpsj_34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b/kube-multus-additional-cni-plugins/0.log"
Apr 18 03:22:00.950569 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:00.950536 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rcpsj_34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b/egress-router-binary-copy/0.log"
Apr 18 03:22:00.968638 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:00.968614 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rcpsj_34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b/cni-plugins/0.log"
Apr 18 03:22:00.986266 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:01.003962 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rcpsj_34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b/bond-cni-plugin/0.log"
Apr 18 03:22:01.003988 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:01.003962 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rcpsj_34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b/routeoverride-cni/0.log"
Apr 18 03:22:01.022254 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:01.022228 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rcpsj_34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b/whereabouts-cni-bincopy/0.log"
Apr 18 03:22:01.039916 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:01.039889 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rcpsj_34e9beab-b2dc-42ba-a4ca-f08fa85d4f7b/whereabouts-cni/0.log"
Apr 18 03:22:01.070708 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:01.070678 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ckptk_fb81bf64-9170-43bc-a130-f32b09124d60/kube-multus/0.log"
Apr 18 03:22:01.197179 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:01.197106 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-l8m94_06d32427-f8ec-4151-bb49-8eaef8308f79/network-metrics-daemon/0.log"
Apr 18 03:22:01.214259 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:01.214234 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-l8m94_06d32427-f8ec-4151-bb49-8eaef8308f79/kube-rbac-proxy/0.log"
Apr 18 03:22:02.598087 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:02.598030 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lsz2j_79a997b9-1bc8-4876-8383-936b5403b5e5/ovn-controller/0.log"
Apr 18 03:22:02.632087 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:02.632058 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lsz2j_79a997b9-1bc8-4876-8383-936b5403b5e5/ovn-acl-logging/0.log"
Apr 18 03:22:02.651812 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:02.651785 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lsz2j_79a997b9-1bc8-4876-8383-936b5403b5e5/kube-rbac-proxy-node/0.log"
Apr 18 03:22:02.671129 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:02.671108 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lsz2j_79a997b9-1bc8-4876-8383-936b5403b5e5/kube-rbac-proxy-ovn-metrics/0.log"
Apr 18 03:22:02.687498 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:02.687470 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lsz2j_79a997b9-1bc8-4876-8383-936b5403b5e5/northd/0.log"
Apr 18 03:22:02.705234 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:02.705197 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lsz2j_79a997b9-1bc8-4876-8383-936b5403b5e5/nbdb/0.log"
Apr 18 03:22:02.723516 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:02.723493 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lsz2j_79a997b9-1bc8-4876-8383-936b5403b5e5/sbdb/0.log"
Apr 18 03:22:02.883401 ip-10-0-140-103 kubenswrapper[2575]: I0418 03:22:02.883317 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lsz2j_79a997b9-1bc8-4876-8383-936b5403b5e5/ovnkube-controller/0.log"