Apr 18 02:43:29.302596 ip-10-0-129-229 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 18 02:43:29.302607 ip-10-0-129-229 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 18 02:43:29.302614 ip-10-0-129-229 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 18 02:43:29.302864 ip-10-0-129-229 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 18 02:43:39.405414 ip-10-0-129-229 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 18 02:43:39.405429 ip-10-0-129-229 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 56e8568703944c8298e5bc39b1795172 --
Apr 18 02:46:02.150687 ip-10-0-129-229 systemd[1]: Starting Kubernetes Kubelet...
Apr 18 02:46:02.630488 ip-10-0-129-229 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 18 02:46:02.630488 ip-10-0-129-229 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 18 02:46:02.630488 ip-10-0-129-229 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 18 02:46:02.630488 ip-10-0-129-229 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 18 02:46:02.630488 ip-10-0-129-229 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 18 02:46:02.631363 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.631258    2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 18 02:46:02.633674 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633653    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 18 02:46:02.633674 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633671    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 18 02:46:02.633674 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633676    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 18 02:46:02.633674 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633682    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633688    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633701    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633706    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633711    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633715    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633719    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633723    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633727    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633732    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633736    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633740    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633744    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633748    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633752    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633764    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633769    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633773    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633776    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 18 02:46:02.633925 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633780    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633784    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633788    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633792    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633796    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633801    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633805    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633810    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633814    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633818    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633822    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633827    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633831    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633835    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633839    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633846    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633851    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633855    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633860    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633864    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 18 02:46:02.634784 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633868    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633876    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633882    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633887    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633891    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633895    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633900    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633905    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633909    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633913    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633918    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633922    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633927    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633931    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633936    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633940    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633944    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633950    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633954    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 18 02:46:02.635653 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633957    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633962    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633966    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633970    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633975    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633979    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633984    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633988    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633993    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.633997    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634002    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634006    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634011    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634015    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634020    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634025    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634029    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634033    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634037    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634041    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 18 02:46:02.636482 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634046    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634050    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634054    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634058    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634063    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634705    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634716    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634721    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634726    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634730    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634734    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634738    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634743    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634747    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634751    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634756    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634761    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634765    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634769    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634774    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 18 02:46:02.637107 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634778    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634782    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634787    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634791    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634795    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634799    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634804    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634808    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634812    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634816    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634821    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634825    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634830    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634834    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634839    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634843    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634847    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634851    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634855    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634860    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 18 02:46:02.637686 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634864    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634868    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634872    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634876    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634881    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634885    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634889    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634893    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634898    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634903    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634907    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634911    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634918    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634923    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634927    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634931    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634934    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634939    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634943    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 18 02:46:02.638501 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634949    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634954    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634958    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634963    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634967    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634971    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634979    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634986    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634991    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.634996    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635000    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635005    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635009    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635013    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635017    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635022    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635026    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635030    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635035    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 18 02:46:02.638978 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635039    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635043    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635047    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635051    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635055    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635061    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635066    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635070    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635076    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635082    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635087    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635094    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.635099    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635211    2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635223    2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635231    2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635238    2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635245    2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635250    2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635257    2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635264    2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 18 02:46:02.639574 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635270    2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635275    2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635280    2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635286    2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635291    2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635295    2574 flags.go:64] FLAG: --cgroup-root=""
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635317    2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635322    2574 flags.go:64] FLAG: --client-ca-file=""
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635327    2574 flags.go:64] FLAG: --cloud-config=""
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635332    2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635337    2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635345    2574 flags.go:64] FLAG: --cluster-domain=""
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635351    2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635356    2574 flags.go:64] FLAG: --config-dir=""
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635360    2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635366    2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635372    2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635378    2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635383    2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635388    2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635393    2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635398    2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635405    2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635410    2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635415    2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 18 02:46:02.640099 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635422    2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635429    2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635434    2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635438    2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635443    2574 flags.go:64] FLAG: --enable-server="true"
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635448    2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635454    2574 flags.go:64] FLAG: --event-burst="100"
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635459    2574 flags.go:64] FLAG: --event-qps="50"
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635464    2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635469    2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635474    2574 flags.go:64] FLAG: --eviction-hard=""
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635480    2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635485    2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635490    2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635494    2574 flags.go:64] FLAG: --eviction-soft=""
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635499    2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635504    2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635509    2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635514    2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635518    2574 flags.go:64] FLAG:
--fail-cgroupv1="false" Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635523 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635527 2574 flags.go:64] FLAG: --feature-gates="" Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635534 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635539 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635545 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 18 02:46:02.640749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635550 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635556 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635561 2574 flags.go:64] FLAG: --help="false" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635566 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-129-229.ec2.internal" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635573 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635578 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635583 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635589 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 
02:46:02.635595 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635600 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635605 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635610 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635615 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635620 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635625 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635630 2574 flags.go:64] FLAG: --kube-reserved="" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635635 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635640 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635645 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635649 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635654 2574 flags.go:64] FLAG: --lock-file="" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635658 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635662 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 18 02:46:02.641373 
ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635667 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 18 02:46:02.641373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635676 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635680 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635685 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635690 2574 flags.go:64] FLAG: --logging-format="text" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635695 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635701 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635706 2574 flags.go:64] FLAG: --manifest-url="" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635711 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635717 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635722 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635729 2574 flags.go:64] FLAG: --max-pods="110" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635735 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635745 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635749 2574 
flags.go:64] FLAG: --memory-manager-policy="None" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635754 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635759 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635764 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635769 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635783 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635788 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635793 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635798 2574 flags.go:64] FLAG: --pod-cidr="" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635803 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 18 02:46:02.641956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635812 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635817 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635822 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635827 2574 flags.go:64] FLAG: --port="10250" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 
02:46:02.635832 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635837 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01d1379d90e101e6c" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635842 2574 flags.go:64] FLAG: --qos-reserved="" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635847 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635852 2574 flags.go:64] FLAG: --register-node="true" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635856 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635861 2574 flags.go:64] FLAG: --register-with-taints="" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635870 2574 flags.go:64] FLAG: --registry-burst="10" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635875 2574 flags.go:64] FLAG: --registry-qps="5" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635880 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635884 2574 flags.go:64] FLAG: --reserved-memory="" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635890 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635896 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635901 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635905 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 18 02:46:02.642627 ip-10-0-129-229 
kubenswrapper[2574]: I0418 02:46:02.635910 2574 flags.go:64] FLAG: --runonce="false" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635915 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635922 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635927 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635932 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635937 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635942 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 18 02:46:02.642627 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635947 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635952 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635962 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635967 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635972 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635977 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635982 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635987 2574 
flags.go:64] FLAG: --system-cgroups="" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.635992 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.636000 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.636005 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.636009 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.636017 2574 flags.go:64] FLAG: --tls-min-version="" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.636022 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.636027 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.636031 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.636036 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.636043 2574 flags.go:64] FLAG: --v="2" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.636050 2574 flags.go:64] FLAG: --version="false" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.636057 2574 flags.go:64] FLAG: --vmodule="" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.636064 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.636070 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 18 02:46:02.643358 ip-10-0-129-229 
kubenswrapper[2574]: W0418 02:46:02.636242 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636249 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 18 02:46:02.643358 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636255 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636260 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636264 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636270 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636275 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636279 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636284 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636288 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636292 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636313 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636318 2574 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636323 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636327 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636332 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636337 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636341 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636345 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636349 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636354 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 18 02:46:02.644000 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636358 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636362 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636367 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636371 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: 
W0418 02:46:02.636375 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636380 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636386 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636391 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636395 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636399 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636403 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636407 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636412 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636416 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636420 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636426 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
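The long run of `feature_gate.go:328] unrecognized feature gate: ...` warnings above appears to come from OpenShift cluster-level gates being passed through to an upstream kubelet that does not know them, so they read as expected noise rather than failures. A minimal sketch for triaging them, using a three-line excerpt as assumed sample input (on a node the input would come from the saved journal instead):

```shell
# List the distinct feature gates the kubelet reported as unrecognized
# (the feature_gate.go:328 warnings). Sample excerpt is assumed here;
# substitute the captured journal text in practice.
excerpt='W0418 02:46:02.636242 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
W0418 02:46:02.636395 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
W0418 02:46:02.643861 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy'

# Strip everything up to the gate name, then deduplicate.
printf '%s\n' "$excerpt" | sed -n 's/.*unrecognized feature gate: //p' | sort -u
```

Deduplicating matters because, as the log shows, the same gates are warned about again on each pass over the configuration.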
Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636433 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636437 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636442 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636446 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 18 02:46:02.644598 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636451 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636456 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636460 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636465 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636469 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636474 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636479 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636483 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636486 
2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636490 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636494 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636500 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636506 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636511 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636516 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636520 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636525 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636529 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636535 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636540 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 18 02:46:02.645160 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636544 2574 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesvSphere Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636549 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636553 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636557 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636561 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636566 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636570 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636575 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636581 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636585 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636589 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636593 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636597 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 18 02:46:02.645775 ip-10-0-129-229 
kubenswrapper[2574]: W0418 02:46:02.636602 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636607 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636611 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636615 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636619 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636629 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636633 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 18 02:46:02.645775 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636637 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 18 02:46:02.646335 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636642 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 18 02:46:02.646335 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636646 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 18 02:46:02.646335 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636650 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 18 02:46:02.646335 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.636655 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 18 02:46:02.646335 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.636663 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false 
EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 18 02:46:02.646335 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.643650 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 18 02:46:02.646335 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.643760 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 18 02:46:02.646335 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643807 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 18 02:46:02.646335 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643813 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
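The `feature_gate.go:384] feature gates: {map[...]}` line above is the effective gate set after all warnings are discarded. A small sketch for pulling just the enabled gates out of that one-line map (the sample line here is a shortened, assumed excerpt of the real one):

```shell
# Extract gates set to "true" from the kubelet's effective
# "feature gates: {map[Gate:bool ...]}" log line. Sample line assumed.
line='feature gates: {map[DynamicResourceAllocation:false ImageVolume:true KMSv1:true NodeSwap:false]}'

# Drop the "map[" prefix and "]}" suffix, split entries onto lines,
# then keep only the names whose value is true.
printf '%s\n' "$line" | sed 's/.*map\[//; s/\]}//' | tr ' ' '\n' | awk -F: '$2=="true"{print $1}'
```

This gives a quick answer to "is gate X actually on?" without reading the whole map by eye.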
Apr 18 02:46:02.646335 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643818 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 18 02:46:02.646335 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643821 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 18 02:46:02.646335 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643825 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 18 02:46:02.646335 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643828 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 18 02:46:02.646335 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643831 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 18 02:46:02.646335 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643834 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643836 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643839 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643842 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643845 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643847 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643850 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643853 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643856 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643858 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643861 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643863 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643866 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643868 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643871 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643873 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643876 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643879 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643881 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643884 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 18 02:46:02.646735 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643887 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643890 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643892 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643895 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643899 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643902 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643905 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643907 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643910 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643912 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643915 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643918 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643920 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643923 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643926 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643928 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643931 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643934 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643936 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643939 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 18 02:46:02.647234 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643942 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643945 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643948 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643950 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643952 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643955 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643958 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643960 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643963 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643966 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643969 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643971 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643976 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643980 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643983 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643986 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643989 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643992 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643995 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 18 02:46:02.647777 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.643998 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644000 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644003 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644006 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644008 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644011 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644013 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644016 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644019 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644022 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644024 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644027 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644029 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644032 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644034 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644037 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644040 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644042 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644045 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 18 02:46:02.648286 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644048 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 18 02:46:02.648794 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.644053 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 18 02:46:02.648794 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644153 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 18 02:46:02.648794 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644158 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 18 02:46:02.648794 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644161 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 18 02:46:02.648794 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644164 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 18 02:46:02.648794 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644168 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 18 02:46:02.648794 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644171 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 18 02:46:02.648794 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644174 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 18 02:46:02.648794 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644177 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 18 02:46:02.648794 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644180 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 18 02:46:02.648794 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644182 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 18 02:46:02.648794 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644186 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 18 02:46:02.648794 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644188 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 18 02:46:02.648794 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644192 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 18 02:46:02.648794 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644195 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644197 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644200 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644202 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644205 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644207 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644210 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644213 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644215 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644218 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644220 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644222 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644225 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644228 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644230 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644233 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644236 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644238 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644240 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644243 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 18 02:46:02.649178 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644245 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644248 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644250 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644253 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644256 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644259 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644261 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644264 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644266 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644269 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644272 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644275 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644277 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644281 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644284 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644287 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644290 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644294 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644313 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 18 02:46:02.649716 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644317 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644320 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644323 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644325 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644328 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644331 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644333 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644336 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644339 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644341 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644344 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644347 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644349 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644352 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644355 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644357 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644361 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644364 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644366 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644369 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 18 02:46:02.650172 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644371 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 18 02:46:02.650704 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644374 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 18 02:46:02.650704 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644376 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 18 02:46:02.650704 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644379 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 18 02:46:02.650704 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644383 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 18 02:46:02.650704 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644385 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 18 02:46:02.650704 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644388 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 18 02:46:02.650704 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644390 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 18 02:46:02.650704 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644393 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 18 02:46:02.650704 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644395 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 18 02:46:02.650704 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644398 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 18 02:46:02.650704 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644400 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 18 02:46:02.650704 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644403 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 18 02:46:02.650704 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:02.644406 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 18 02:46:02.650704 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.644411 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 18 02:46:02.650704 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.645184 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 18 02:46:02.651083 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.647490 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 18 02:46:02.651083 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.648550 2574 server.go:1019] "Starting client certificate rotation"
Apr 18 02:46:02.651083 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.648645 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 18 02:46:02.651083 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.649130 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 18 02:46:02.674677 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.674653 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 18 02:46:02.677377 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.677341 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 18 02:46:02.694722 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.694698 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 18 02:46:02.700709 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.700690 2574 log.go:25] "Validated CRI v1 image API"
Apr 18 02:46:02.702258 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.702236 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 18 02:46:02.702395 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.702378 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 18 02:46:02.709103 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.709077 2574 fs.go:135] Filesystem UUIDs: map[1ec23d99-0b4b-4d06-97ea-0866543f2dd2:/dev/nvme0n1p3 2f973111-850d-4b9d-9bc4-d346b9a3d6f8:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 18 02:46:02.709199 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.709102 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 18 02:46:02.714875 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.714756 2574 manager.go:217] Machine: {Timestamp:2026-04-18 02:46:02.712870385 +0000 UTC m=+0.436424271 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096721 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2298d596bb4b6918d20c4df81804df SystemUUID:ec2298d5-96bb-4b69-18d2-0c4df81804df BootID:56e85687-0394-4c82-98e5-bc39b1795172 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6c:a2:4b:36:cf Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6c:a2:4b:36:cf Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c6:cb:36:7d:13:50 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 18 02:46:02.714875 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.714869 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 18 02:46:02.715005 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.714994 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 18 02:46:02.716047 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.716024 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 18 02:46:02.716188 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.716049 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-229.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 18 02:46:02.716233 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.716196 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 18 02:46:02.716233 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.716206 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 18 02:46:02.716233 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.716224
2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 18 02:46:02.716361 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.716240 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 18 02:46:02.717137 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.717127 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 18 02:46:02.717237 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.717228 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 18 02:46:02.718124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.718104 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6nljs" Apr 18 02:46:02.719637 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.719627 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 18 02:46:02.719675 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.719642 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 18 02:46:02.719675 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.719654 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 18 02:46:02.719675 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.719664 2574 kubelet.go:397] "Adding apiserver pod source" Apr 18 02:46:02.719675 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.719672 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 18 02:46:02.720702 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.720687 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 18 02:46:02.720782 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.720706 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 18 02:46:02.723868 ip-10-0-129-229 
kubenswrapper[2574]: I0418 02:46:02.723836 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 18 02:46:02.724903 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.724842 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6nljs" Apr 18 02:46:02.725378 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.725363 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 18 02:46:02.727431 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.727406 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 18 02:46:02.727431 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.727430 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 18 02:46:02.727537 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.727437 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 18 02:46:02.727537 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.727442 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 18 02:46:02.727537 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.727448 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 18 02:46:02.727537 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.727455 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 18 02:46:02.727537 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.727461 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 18 02:46:02.727537 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.727466 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 18 02:46:02.727537 ip-10-0-129-229 
kubenswrapper[2574]: I0418 02:46:02.727481 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 18 02:46:02.727537 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.727489 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 18 02:46:02.727537 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.727499 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 18 02:46:02.727537 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.727511 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 18 02:46:02.729526 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.729514 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 18 02:46:02.729526 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.729527 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 18 02:46:02.735202 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.735078 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 18 02:46:02.735845 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.735812 2574 server.go:1295] "Started kubelet" Apr 18 02:46:02.736850 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.736808 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 18 02:46:02.737106 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.737054 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 18 02:46:02.737180 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.737134 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 18 02:46:02.737353 ip-10-0-129-229 systemd[1]: Started Kubernetes Kubelet. 
Apr 18 02:46:02.737943 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.737889 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 18 02:46:02.738999 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.738969 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 18 02:46:02.739597 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.739580 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 18 02:46:02.742208 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.742187 2574 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-229.ec2.internal" not found
Apr 18 02:46:02.743054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.743038 2574 server.go:317] "Adding debug handlers to kubelet server"
Apr 18 02:46:02.745075 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.745060 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 18 02:46:02.745075 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.745070 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 18 02:46:02.747043 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.746964 2574 factory.go:55] Registering systemd factory
Apr 18 02:46:02.747043 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.746990 2574 factory.go:223] Registration of the systemd container factory successfully
Apr 18 02:46:02.747367 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.747345 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 18 02:46:02.747444 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.747434 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 18 02:46:02.747444 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.747438 2574 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 18 02:46:02.747538 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.747457 2574 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 18 02:46:02.747587 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.747580 2574 reconstruct.go:97] "Volume reconstruction finished"
Apr 18 02:46:02.747641 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.747589 2574 reconciler.go:26] "Reconciler: start to sync state"
Apr 18 02:46:02.747689 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.747679 2574 factory.go:153] Registering CRI-O factory
Apr 18 02:46:02.747689 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.747689 2574 factory.go:223] Registration of the crio container factory successfully
Apr 18 02:46:02.747779 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.747754 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 18 02:46:02.747779 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:02.747760 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-229.ec2.internal\" not found"
Apr 18 02:46:02.747872 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.747784 2574 factory.go:103] Registering Raw factory
Apr 18 02:46:02.747872 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.747799 2574 manager.go:1196] Started watching for new ooms in manager
Apr 18 02:46:02.748491 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.748470 2574 manager.go:319] Starting recovery of all containers
Apr 18 02:46:02.750562 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:02.750540 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-229.ec2.internal\" not found" node="ip-10-0-129-229.ec2.internal"
Apr 18 02:46:02.752037 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:02.752011 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 18 02:46:02.759130 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.759111 2574 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-229.ec2.internal" not found
Apr 18 02:46:02.759292 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.759278 2574 manager.go:324] Recovery completed
Apr 18 02:46:02.760564 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:02.760524 2574 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 18 02:46:02.763523 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.763509 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 18 02:46:02.767750 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.767731 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 18 02:46:02.767817 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.767771 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 18 02:46:02.767817 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.767790 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-229.ec2.internal" event="NodeHasSufficientPID"
Apr 18 02:46:02.768466 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.768451 2574 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 18 02:46:02.768528 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.768474 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 18 02:46:02.768528 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.768491 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 18 02:46:02.771031 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.771019 2574 policy_none.go:49] "None policy: Start"
Apr 18 02:46:02.771068 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.771035 2574 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 18 02:46:02.771068 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.771046 2574 state_mem.go:35] "Initializing new in-memory state store"
Apr 18 02:46:02.814269 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.810875 2574 manager.go:341] "Starting Device Plugin manager"
Apr 18 02:46:02.814269 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:02.810936 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 18 02:46:02.814269 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.810946 2574 server.go:85] "Starting device plugin registration server"
Apr 18 02:46:02.814269 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.811204 2574 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 18 02:46:02.814269 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.811214 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 18 02:46:02.814269 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.811338 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 18 02:46:02.814269 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.811419 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 18 02:46:02.814269 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.811427 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 18 02:46:02.814269 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:02.812165 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 18 02:46:02.814269 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:02.812200 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-229.ec2.internal\" not found"
Apr 18 02:46:02.814269 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.814097 2574 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-229.ec2.internal" not found
Apr 18 02:46:02.875853 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.875816 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 18 02:46:02.876924 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.876908 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 18 02:46:02.877039 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.876931 2574 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 18 02:46:02.877039 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.876954 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 18 02:46:02.877039 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.876960 2574 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 18 02:46:02.877166 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:02.877044 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 18 02:46:02.879159 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.879140 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 18 02:46:02.911998 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.911906 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 18 02:46:02.913085 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.913058 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 18 02:46:02.913206 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.913094 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 18 02:46:02.913206 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.913111 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-229.ec2.internal" event="NodeHasSufficientPID"
Apr 18 02:46:02.913206 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.913141 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-229.ec2.internal"
Apr 18 02:46:02.922194 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.922174 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-229.ec2.internal"
Apr 18 02:46:02.977790 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.977746 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-229.ec2.internal"]
Apr 18 02:46:02.980251 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.980228 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-229.ec2.internal"
Apr 18 02:46:02.980251 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:02.980242 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal"
Apr 18 02:46:03.007103 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.007076 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-229.ec2.internal"
Apr 18 02:46:03.011510 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.011495 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal"
Apr 18 02:46:03.020422 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.020400 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 18 02:46:03.025791 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.025776 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 18 02:46:03.049155 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.049126 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4032b4d7aff1379a9c14d5fa7d3dea91-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal\" (UID: \"4032b4d7aff1379a9c14d5fa7d3dea91\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal"
Apr 18 02:46:03.049263 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.049163 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4032b4d7aff1379a9c14d5fa7d3dea91-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal\" (UID: \"4032b4d7aff1379a9c14d5fa7d3dea91\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal"
Apr 18 02:46:03.049263 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.049189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1d43c470c7e05ac28f68f1360471b9c3-config\") pod \"kube-apiserver-proxy-ip-10-0-129-229.ec2.internal\" (UID: \"1d43c470c7e05ac28f68f1360471b9c3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-229.ec2.internal"
Apr 18 02:46:03.149784 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.149760 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4032b4d7aff1379a9c14d5fa7d3dea91-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal\" (UID: \"4032b4d7aff1379a9c14d5fa7d3dea91\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal"
Apr 18 02:46:03.149784 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.149787 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4032b4d7aff1379a9c14d5fa7d3dea91-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal\" (UID: \"4032b4d7aff1379a9c14d5fa7d3dea91\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal"
Apr 18 02:46:03.149944 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.149805 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1d43c470c7e05ac28f68f1360471b9c3-config\") pod \"kube-apiserver-proxy-ip-10-0-129-229.ec2.internal\" (UID: \"1d43c470c7e05ac28f68f1360471b9c3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-229.ec2.internal"
Apr 18 02:46:03.149944 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.149867 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4032b4d7aff1379a9c14d5fa7d3dea91-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal\" (UID: \"4032b4d7aff1379a9c14d5fa7d3dea91\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal"
Apr 18 02:46:03.149944 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.149865 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4032b4d7aff1379a9c14d5fa7d3dea91-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal\" (UID: \"4032b4d7aff1379a9c14d5fa7d3dea91\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal"
Apr 18 02:46:03.149944 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.149868 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1d43c470c7e05ac28f68f1360471b9c3-config\") pod \"kube-apiserver-proxy-ip-10-0-129-229.ec2.internal\" (UID: \"1d43c470c7e05ac28f68f1360471b9c3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-229.ec2.internal"
Apr 18 02:46:03.323821 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.323740 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-229.ec2.internal"
Apr 18 02:46:03.328326 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.328310 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal"
Apr 18 02:46:03.648483 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.648371 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 18 02:46:03.649339 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.648588 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 18 02:46:03.649339 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.648600 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 18 02:46:03.649339 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.648590 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 18 02:46:03.720470 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.720430 2574 apiserver.go:52] "Watching apiserver"
Apr 18 02:46:03.726901 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.726871 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-17 02:41:02 +0000 UTC" deadline="2028-01-08 02:08:54.312199308 +0000 UTC"
Apr 18 02:46:03.726901 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.726898 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15119h22m50.585304448s"
Apr 18 02:46:03.729338 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.729320 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 18 02:46:03.729645 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.729625 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-c6w8h","openshift-network-operator/iptables-alerter-5s7gh","openshift-ovn-kubernetes/ovnkube-node-k8fsv","kube-system/konnectivity-agent-nctzt","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk","openshift-cluster-node-tuning-operator/tuned-bgzxp","openshift-image-registry/node-ca-4wx6b","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal","openshift-multus/multus-additional-cni-plugins-d4xn6","openshift-network-diagnostics/network-check-target-bjkvs","kube-system/kube-apiserver-proxy-ip-10-0-129-229.ec2.internal","openshift-multus/multus-dbmjj"]
Apr 18 02:46:03.732654 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.732641 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h"
Apr 18 02:46:03.732733 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:03.732701 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18"
Apr 18 02:46:03.732786 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.732751 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5s7gh"
Apr 18 02:46:03.734314 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.734288 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.734971 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.734956 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 18 02:46:03.735580 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.735559 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lv478\""
Apr 18 02:46:03.735665 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.735555 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 18 02:46:03.735721 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.735701 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nctzt"
Apr 18 02:46:03.736367 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.735566 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 18 02:46:03.736476 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.736447 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 18 02:46:03.737495 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.737444 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 18 02:46:03.737495 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.737458 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 18 02:46:03.737495 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.737468 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 18 02:46:03.737794 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.737714 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 18 02:46:03.737794 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.737757 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk"
Apr 18 02:46:03.738088 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.738006 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 18 02:46:03.738163 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.737760 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tqqp8\""
Apr 18 02:46:03.738674 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.738658 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gsrlj\""
Apr 18 02:46:03.738773 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.738736 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 18 02:46:03.738829 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.738799 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 18 02:46:03.739088 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.739070 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bgzxp"
Apr 18 02:46:03.740198 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.740175 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4wx6b"
Apr 18 02:46:03.740833 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.740794 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 18 02:46:03.741042 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.741027 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 18 02:46:03.741176 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.741155 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-6mdf7\""
Apr 18 02:46:03.741256 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.741201 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 18 02:46:03.741380 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.741272 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs"
Apr 18 02:46:03.741380 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:03.741346 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bjkvs" podUID="02ff8c43-768b-49b2-90a6-30f839c12ea3" Apr 18 02:46:03.741580 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.741565 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 18 02:46:03.741719 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.741705 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 18 02:46:03.741775 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.741708 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fqngg\"" Apr 18 02:46:03.742104 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.742088 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 18 02:46:03.742253 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.742239 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 18 02:46:03.742334 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.742270 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 18 02:46:03.742527 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.742512 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wff7q\"" Apr 18 02:46:03.742599 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.742582 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.743797 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.743780 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d4xn6" Apr 18 02:46:03.745188 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.745170 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 18 02:46:03.745524 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.745511 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 18 02:46:03.745690 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.745677 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 18 02:46:03.745761 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.745689 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qb6fj\"" Apr 18 02:46:03.745819 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.745762 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 18 02:46:03.745819 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.745775 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 18 02:46:03.746115 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.746100 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 18 02:46:03.746187 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.746106 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-klbqc\"" Apr 18 02:46:03.746241 ip-10-0-129-229 
kubenswrapper[2574]: I0418 02:46:03.746184 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 18 02:46:03.748195 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.748181 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 18 02:46:03.752665 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.752648 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-run-netns\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.752775 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.752672 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6" Apr 18 02:46:03.752775 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.752690 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jknc\" (UniqueName: \"kubernetes.io/projected/eaf422fa-fd33-491a-b182-991116468c18-kube-api-access-6jknc\") pod \"network-metrics-daemon-c6w8h\" (UID: \"eaf422fa-fd33-491a-b182-991116468c18\") " pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:46:03.752775 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.752723 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-var-lib-openvswitch\") pod \"ovnkube-node-k8fsv\" (UID: 
\"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.752775 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.752755 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-run-ovn\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.752920 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.752784 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-etc-selinux\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk" Apr 18 02:46:03.752920 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.752815 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/df7f443e-28b4-49bc-ad27-c0360b16827c-serviceca\") pod \"node-ca-4wx6b\" (UID: \"df7f443e-28b4-49bc-ad27-c0360b16827c\") " pod="openshift-image-registry/node-ca-4wx6b" Apr 18 02:46:03.752920 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.752849 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-sys\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.752920 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.752869 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-lib-modules\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.752920 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.752883 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-host\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.752920 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.752918 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-systemd-units\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.753142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.752938 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.753142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.752955 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d159849f-3b4d-45b7-8f49-9f9f11d96088-env-overrides\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.753142 ip-10-0-129-229 kubenswrapper[2574]: I0418 
02:46:03.752975 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d159849f-3b4d-45b7-8f49-9f9f11d96088-ovnkube-script-lib\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.753142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753013 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-device-dir\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk" Apr 18 02:46:03.753142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753028 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-sysctl-conf\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.753142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753072 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-multus-cni-dir\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.753142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753118 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f8ff51d7-b5a5-4955-a5f0-fcb644339ab3-host-slash\") pod \"iptables-alerter-5s7gh\" (UID: \"f8ff51d7-b5a5-4955-a5f0-fcb644339ab3\") " 
pod="openshift-network-operator/iptables-alerter-5s7gh" Apr 18 02:46:03.753502 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753151 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-cni-netd\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.753502 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753180 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbb2j\" (UniqueName: \"kubernetes.io/projected/df7f443e-28b4-49bc-ad27-c0360b16827c-kube-api-access-bbb2j\") pod \"node-ca-4wx6b\" (UID: \"df7f443e-28b4-49bc-ad27-c0360b16827c\") " pod="openshift-image-registry/node-ca-4wx6b" Apr 18 02:46:03.753502 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753206 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-system-cni-dir\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6" Apr 18 02:46:03.753502 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753232 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-cni-binary-copy\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6" Apr 18 02:46:03.753502 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753260 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pq64z\" (UniqueName: \"kubernetes.io/projected/f8ff51d7-b5a5-4955-a5f0-fcb644339ab3-kube-api-access-pq64z\") pod \"iptables-alerter-5s7gh\" (UID: \"f8ff51d7-b5a5-4955-a5f0-fcb644339ab3\") " pod="openshift-network-operator/iptables-alerter-5s7gh" Apr 18 02:46:03.753502 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753284 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-kubelet\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.753502 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753331 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-run-systemd\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.753502 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753364 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-etc-openvswitch\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.753502 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753387 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-modprobe-d\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.753502 ip-10-0-129-229 
kubenswrapper[2574]: I0418 02:46:03.753438 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-system-cni-dir\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.753502 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753464 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-var-lib-cni-bin\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.753502 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753490 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-slash\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.753979 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753513 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c2517d1b-5d5f-4341-a8d1-b4646105d5ba-agent-certs\") pod \"konnectivity-agent-nctzt\" (UID: \"c2517d1b-5d5f-4341-a8d1-b4646105d5ba\") " pod="kube-system/konnectivity-agent-nctzt" Apr 18 02:46:03.753979 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753542 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c2517d1b-5d5f-4341-a8d1-b4646105d5ba-konnectivity-ca\") pod \"konnectivity-agent-nctzt\" (UID: \"c2517d1b-5d5f-4341-a8d1-b4646105d5ba\") " 
pod="kube-system/konnectivity-agent-nctzt" Apr 18 02:46:03.753979 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753565 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-var-lib-kubelet\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.753979 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753591 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6" Apr 18 02:46:03.753979 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753650 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-node-log\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.753979 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753673 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-sysctl-d\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.753979 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753710 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/841058db-3583-4d11-853d-8a1d444e8ea6-multus-daemon-config\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.753979 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753738 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6" Apr 18 02:46:03.753979 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753757 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vlwd\" (UniqueName: \"kubernetes.io/projected/d159849f-3b4d-45b7-8f49-9f9f11d96088-kube-api-access-5vlwd\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.753979 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753776 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk" Apr 18 02:46:03.753979 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753791 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ksh6\" (UniqueName: \"kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6\") pod \"network-check-target-bjkvs\" (UID: \"02ff8c43-768b-49b2-90a6-30f839c12ea3\") " pod="openshift-network-diagnostics/network-check-target-bjkvs" Apr 18 
02:46:03.753979 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753807 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-systemd\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.753979 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753825 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-cnibin\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.753979 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753845 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-run-k8s-cni-cncf-io\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.753979 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753868 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-run-ovn-kubernetes\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.753979 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753902 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-cnibin\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: 
\"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6" Apr 18 02:46:03.754549 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753935 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbmrh\" (UniqueName: \"kubernetes.io/projected/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-kube-api-access-sbmrh\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6" Apr 18 02:46:03.754549 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.753991 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-sys-fs\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk" Apr 18 02:46:03.754549 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754009 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-etc-kubernetes\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.754549 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754023 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-log-socket\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.754549 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754037 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qp7rf\" (UniqueName: \"kubernetes.io/projected/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-kube-api-access-qp7rf\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.754549 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754075 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4j6g\" (UniqueName: \"kubernetes.io/projected/841058db-3583-4d11-853d-8a1d444e8ea6-kube-api-access-v4j6g\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.754549 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754101 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f8ff51d7-b5a5-4955-a5f0-fcb644339ab3-iptables-alerter-script\") pod \"iptables-alerter-5s7gh\" (UID: \"f8ff51d7-b5a5-4955-a5f0-fcb644339ab3\") " pod="openshift-network-operator/iptables-alerter-5s7gh" Apr 18 02:46:03.754549 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754121 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d159849f-3b4d-45b7-8f49-9f9f11d96088-ovnkube-config\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.754549 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754142 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d159849f-3b4d-45b7-8f49-9f9f11d96088-ovn-node-metrics-cert\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.754549 
ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754169 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-tmp\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.754549 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754184 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-hostroot\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.754549 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754226 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-multus-conf-dir\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.754549 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754246 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-os-release\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6" Apr 18 02:46:03.754549 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754267 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-cni-bin\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.754549 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df7f443e-28b4-49bc-ad27-c0360b16827c-host\") pod \"node-ca-4wx6b\" (UID: \"df7f443e-28b4-49bc-ad27-c0360b16827c\") " pod="openshift-image-registry/node-ca-4wx6b" Apr 18 02:46:03.754549 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754349 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-var-lib-cni-multus\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.755142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754389 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-run-netns\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.755142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754416 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-run-openvswitch\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.755142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754446 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-registration-dir\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk" Apr 18 02:46:03.755142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754495 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-multus-socket-dir-parent\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.755142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754535 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs\") pod \"network-metrics-daemon-c6w8h\" (UID: \"eaf422fa-fd33-491a-b182-991116468c18\") " pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:46:03.755142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754560 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-socket-dir\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk" Apr 18 02:46:03.755142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754586 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwbnn\" (UniqueName: \"kubernetes.io/projected/dd5c6619-29f0-4dd3-913e-6532510e87b4-kube-api-access-vwbnn\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk" Apr 18 
02:46:03.755142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754610 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-kubernetes\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.755142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754640 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-tuned\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.755142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754668 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-var-lib-kubelet\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.755142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754691 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-run-multus-certs\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.755142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754708 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-sysconfig\") pod \"tuned-bgzxp\" (UID: 
\"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.755142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754721 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-run\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.755142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754744 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-os-release\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.755142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754762 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 18 02:46:03.755142 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.754768 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/841058db-3583-4d11-853d-8a1d444e8ea6-cni-binary-copy\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.774496 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.774476 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-tp4m2" Apr 18 02:46:03.780847 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.780822 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-tp4m2" Apr 18 02:46:03.855177 
ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855150 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-tuned\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.855346 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855190 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-var-lib-kubelet\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.855346 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855213 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-run-multus-certs\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.855346 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855237 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-sysconfig\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.855346 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855277 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-var-lib-kubelet\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.855346 ip-10-0-129-229 
kubenswrapper[2574]: I0418 02:46:03.855294 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-run-multus-certs\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.855595 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855352 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-sysconfig\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.855595 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855361 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-run\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.855595 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855404 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-run\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.855595 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855393 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-os-release\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.855595 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855432 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/841058db-3583-4d11-853d-8a1d444e8ea6-cni-binary-copy\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.855595 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855448 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-run-netns\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.855595 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855471 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-os-release\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.855595 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855476 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6" Apr 18 02:46:03.855595 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855481 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 18 02:46:03.855595 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855497 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jknc\" (UniqueName: \"kubernetes.io/projected/eaf422fa-fd33-491a-b182-991116468c18-kube-api-access-6jknc\") pod \"network-metrics-daemon-c6w8h\" (UID: \"eaf422fa-fd33-491a-b182-991116468c18\") " pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:46:03.855595 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855514 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-var-lib-openvswitch\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.855595 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855534 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-run-ovn\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.855595 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855560 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-etc-selinux\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk" Apr 18 02:46:03.855595 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855581 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/df7f443e-28b4-49bc-ad27-c0360b16827c-serviceca\") pod \"node-ca-4wx6b\" (UID: \"df7f443e-28b4-49bc-ad27-c0360b16827c\") " pod="openshift-image-registry/node-ca-4wx6b" Apr 18 02:46:03.855595 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855594 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-sys\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.856273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855609 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-lib-modules\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.856273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855626 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-host\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.856273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855648 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-systemd-units\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.856273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855673 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.856273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855695 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6" Apr 18 02:46:03.856273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855698 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d159849f-3b4d-45b7-8f49-9f9f11d96088-env-overrides\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.856273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855735 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d159849f-3b4d-45b7-8f49-9f9f11d96088-ovnkube-script-lib\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.856273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855752 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-device-dir\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk" Apr 18 02:46:03.856273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855772 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-sysctl-conf\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.856273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855787 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-multus-cni-dir\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.856273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855805 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f8ff51d7-b5a5-4955-a5f0-fcb644339ab3-host-slash\") pod \"iptables-alerter-5s7gh\" (UID: \"f8ff51d7-b5a5-4955-a5f0-fcb644339ab3\") " pod="openshift-network-operator/iptables-alerter-5s7gh" Apr 18 02:46:03.856273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855820 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-cni-netd\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.856273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855836 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbb2j\" (UniqueName: \"kubernetes.io/projected/df7f443e-28b4-49bc-ad27-c0360b16827c-kube-api-access-bbb2j\") pod \"node-ca-4wx6b\" (UID: \"df7f443e-28b4-49bc-ad27-c0360b16827c\") " pod="openshift-image-registry/node-ca-4wx6b" Apr 18 02:46:03.856273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855854 
2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-system-cni-dir\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6" Apr 18 02:46:03.856273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855875 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-cni-binary-copy\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6" Apr 18 02:46:03.856273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855902 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pq64z\" (UniqueName: \"kubernetes.io/projected/f8ff51d7-b5a5-4955-a5f0-fcb644339ab3-kube-api-access-pq64z\") pod \"iptables-alerter-5s7gh\" (UID: \"f8ff51d7-b5a5-4955-a5f0-fcb644339ab3\") " pod="openshift-network-operator/iptables-alerter-5s7gh" Apr 18 02:46:03.856273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855922 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-kubelet\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855937 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-run-systemd\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855951 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-etc-openvswitch\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855966 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-modprobe-d\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.855983 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-system-cni-dir\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856012 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-var-lib-cni-bin\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856027 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-slash\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856042 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c2517d1b-5d5f-4341-a8d1-b4646105d5ba-agent-certs\") pod \"konnectivity-agent-nctzt\" (UID: \"c2517d1b-5d5f-4341-a8d1-b4646105d5ba\") " pod="kube-system/konnectivity-agent-nctzt" Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856065 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c2517d1b-5d5f-4341-a8d1-b4646105d5ba-konnectivity-ca\") pod \"konnectivity-agent-nctzt\" (UID: \"c2517d1b-5d5f-4341-a8d1-b4646105d5ba\") " pod="kube-system/konnectivity-agent-nctzt" Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-var-lib-kubelet\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856103 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6" Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856117 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d159849f-3b4d-45b7-8f49-9f9f11d96088-env-overrides\") pod 
\"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856141 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/841058db-3583-4d11-853d-8a1d444e8ea6-cni-binary-copy\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856153 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-run-netns\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj" Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856184 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-run-systemd\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-kubelet\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856200 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-node-log\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856203 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-var-lib-openvswitch\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.857054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856121 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-node-log\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.857935 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856261 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-system-cni-dir\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.857935 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856266 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-multus-cni-dir\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.857935 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856274 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-etc-selinux\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk"
Apr 18 02:46:03.857935 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856290 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-sysctl-d\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp"
Apr 18 02:46:03.857935 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856318 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-modprobe-d\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp"
Apr 18 02:46:03.857935 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856330 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-run-ovn\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.857935 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856332 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-slash\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.857935 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856342 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/841058db-3583-4d11-853d-8a1d444e8ea6-multus-daemon-config\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.857935 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856376 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6"
Apr 18 02:46:03.857935 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856415 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-sysctl-conf\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp"
Apr 18 02:46:03.857935 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856209 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-etc-openvswitch\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.857935 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856403 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vlwd\" (UniqueName: \"kubernetes.io/projected/d159849f-3b4d-45b7-8f49-9f9f11d96088-kube-api-access-5vlwd\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.857935 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856464 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk"
Apr 18 02:46:03.857935 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856487 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ksh6\" (UniqueName: \"kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6\") pod \"network-check-target-bjkvs\" (UID: \"02ff8c43-768b-49b2-90a6-30f839c12ea3\") " pod="openshift-network-diagnostics/network-check-target-bjkvs"
Apr 18 02:46:03.857935 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856510 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-systemd\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp"
Apr 18 02:46:03.857935 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856533 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-cnibin\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.857935 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856557 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-run-k8s-cni-cncf-io\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.858809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856584 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-run-ovn-kubernetes\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.858809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856611 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-cnibin\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6"
Apr 18 02:46:03.858809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856639 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbmrh\" (UniqueName: \"kubernetes.io/projected/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-kube-api-access-sbmrh\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6"
Apr 18 02:46:03.858809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856642 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-device-dir\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk"
Apr 18 02:46:03.858809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856670 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-sys-fs\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk"
Apr 18 02:46:03.858809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856695 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-etc-kubernetes\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.858809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856701 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/df7f443e-28b4-49bc-ad27-c0360b16827c-serviceca\") pod \"node-ca-4wx6b\" (UID: \"df7f443e-28b4-49bc-ad27-c0360b16827c\") " pod="openshift-image-registry/node-ca-4wx6b"
Apr 18 02:46:03.858809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856720 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-log-socket\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.858809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856705 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f8ff51d7-b5a5-4955-a5f0-fcb644339ab3-host-slash\") pod \"iptables-alerter-5s7gh\" (UID: \"f8ff51d7-b5a5-4955-a5f0-fcb644339ab3\") " pod="openshift-network-operator/iptables-alerter-5s7gh"
Apr 18 02:46:03.858809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856758 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qp7rf\" (UniqueName: \"kubernetes.io/projected/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-kube-api-access-qp7rf\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp"
Apr 18 02:46:03.858809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856783 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-system-cni-dir\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6"
Apr 18 02:46:03.858809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856791 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4j6g\" (UniqueName: \"kubernetes.io/projected/841058db-3583-4d11-853d-8a1d444e8ea6-kube-api-access-v4j6g\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.858809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856797 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-var-lib-cni-bin\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.858809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856853 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-host\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp"
Apr 18 02:46:03.858809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856886 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f8ff51d7-b5a5-4955-a5f0-fcb644339ab3-iptables-alerter-script\") pod \"iptables-alerter-5s7gh\" (UID: \"f8ff51d7-b5a5-4955-a5f0-fcb644339ab3\") " pod="openshift-network-operator/iptables-alerter-5s7gh"
Apr 18 02:46:03.858809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856917 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d159849f-3b4d-45b7-8f49-9f9f11d96088-ovnkube-config\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.858809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856947 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/841058db-3583-4d11-853d-8a1d444e8ea6-multus-daemon-config\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.859596 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856949 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d159849f-3b4d-45b7-8f49-9f9f11d96088-ovn-node-metrics-cert\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.859596 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856996 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-tmp\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp"
Apr 18 02:46:03.859596 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857026 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-hostroot\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.859596 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857053 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-multus-conf-dir\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.859596 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857078 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d159849f-3b4d-45b7-8f49-9f9f11d96088-ovnkube-script-lib\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.859596 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857086 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-os-release\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6"
Apr 18 02:46:03.859596 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857116 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-cni-bin\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.859596 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857132 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-run-k8s-cni-cncf-io\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.859596 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857142 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df7f443e-28b4-49bc-ad27-c0360b16827c-host\") pod \"node-ca-4wx6b\" (UID: \"df7f443e-28b4-49bc-ad27-c0360b16827c\") " pod="openshift-image-registry/node-ca-4wx6b"
Apr 18 02:46:03.859596 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857170 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-var-lib-cni-multus\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.859596 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857199 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-run-netns\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.859596 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857225 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-run-openvswitch\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.859596 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857251 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-registration-dir\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk"
Apr 18 02:46:03.859596 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857279 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-multus-socket-dir-parent\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.859596 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857327 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs\") pod \"network-metrics-daemon-c6w8h\" (UID: \"eaf422fa-fd33-491a-b182-991116468c18\") " pod="openshift-multus/network-metrics-daemon-c6w8h"
Apr 18 02:46:03.859596 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857361 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-socket-dir\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk"
Apr 18 02:46:03.859596 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.856888 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-lib-modules\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp"
Apr 18 02:46:03.860124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857400 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwbnn\" (UniqueName: \"kubernetes.io/projected/dd5c6619-29f0-4dd3-913e-6532510e87b4-kube-api-access-vwbnn\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk"
Apr 18 02:46:03.860124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857429 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-kubernetes\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp"
Apr 18 02:46:03.860124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857523 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-kubernetes\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp"
Apr 18 02:46:03.860124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857571 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-systemd-units\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.860124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857614 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.860124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857709 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-sysctl-d\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp"
Apr 18 02:46:03.860124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857732 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6"
Apr 18 02:46:03.860124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857820 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk"
Apr 18 02:46:03.860124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857878 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f8ff51d7-b5a5-4955-a5f0-fcb644339ab3-iptables-alerter-script\") pod \"iptables-alerter-5s7gh\" (UID: \"f8ff51d7-b5a5-4955-a5f0-fcb644339ab3\") " pod="openshift-network-operator/iptables-alerter-5s7gh"
Apr 18 02:46:03.860124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857952 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-sys-fs\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk"
Apr 18 02:46:03.860124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858005 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-run-ovn-kubernetes\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.860124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858042 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-cnibin\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6"
Apr 18 02:46:03.860124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858048 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-systemd\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp"
Apr 18 02:46:03.860124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858104 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-cnibin\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.860124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.857135 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-sys\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp"
Apr 18 02:46:03.860124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858258 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-etc-tuned\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp"
Apr 18 02:46:03.860124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858283 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-var-lib-kubelet\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp"
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858355 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-cni-netd\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858363 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-run-netns\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858392 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-log-socket\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858426 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-hostroot\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858556 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-host-cni-bin\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858596 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d159849f-3b4d-45b7-8f49-9f9f11d96088-ovnkube-config\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858599 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-multus-conf-dir\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858653 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-os-release\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6"
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858691 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-etc-kubernetes\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:03.858778 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:03.858866 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs podName:eaf422fa-fd33-491a-b182-991116468c18 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:04.358828136 +0000 UTC m=+2.082382029 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs") pod "network-metrics-daemon-c6w8h" (UID: "eaf422fa-fd33-491a-b182-991116468c18") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858882 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c2517d1b-5d5f-4341-a8d1-b4646105d5ba-konnectivity-ca\") pod \"konnectivity-agent-nctzt\" (UID: \"c2517d1b-5d5f-4341-a8d1-b4646105d5ba\") " pod="kube-system/konnectivity-agent-nctzt"
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858938 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df7f443e-28b4-49bc-ad27-c0360b16827c-host\") pod \"node-ca-4wx6b\" (UID: \"df7f443e-28b4-49bc-ad27-c0360b16827c\") " pod="openshift-image-registry/node-ca-4wx6b"
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.858981 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-host-var-lib-cni-multus\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.859031 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d159849f-3b4d-45b7-8f49-9f9f11d96088-run-openvswitch\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.859041 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-registration-dir\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk"
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.859068 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c2517d1b-5d5f-4341-a8d1-b4646105d5ba-agent-certs\") pod \"konnectivity-agent-nctzt\" (UID: \"c2517d1b-5d5f-4341-a8d1-b4646105d5ba\") " pod="kube-system/konnectivity-agent-nctzt"
Apr 18 02:46:03.860617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.859078 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/841058db-3583-4d11-853d-8a1d444e8ea6-multus-socket-dir-parent\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.861097 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.859186 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dd5c6619-29f0-4dd3-913e-6532510e87b4-socket-dir\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk"
Apr 18 02:46:03.861097 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.859558 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6"
Apr 18 02:46:03.861097 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.859658 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d159849f-3b4d-45b7-8f49-9f9f11d96088-ovn-node-metrics-cert\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv"
Apr 18 02:46:03.861097 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.859995 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-tmp\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " pod="openshift-cluster-node-tuning-operator/tuned-bgzxp"
Apr 18 02:46:03.861928 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.861906 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-cni-binary-copy\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6"
Apr 18 02:46:03.862977 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:03.862850 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 18 02:46:03.862977 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:03.862875 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 18 02:46:03.862977 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:03.862889 2574 projected.go:194] Error preparing data for projected volume kube-api-access-9ksh6 for pod openshift-network-diagnostics/network-check-target-bjkvs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:46:03.862977 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:03.862952 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6 podName:02ff8c43-768b-49b2-90a6-30f839c12ea3 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:04.362936135 +0000 UTC m=+2.086490009 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9ksh6" (UniqueName: "kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6") pod "network-check-target-bjkvs" (UID: "02ff8c43-768b-49b2-90a6-30f839c12ea3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:46:03.863880 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.863853 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jknc\" (UniqueName: \"kubernetes.io/projected/eaf422fa-fd33-491a-b182-991116468c18-kube-api-access-6jknc\") pod \"network-metrics-daemon-c6w8h\" (UID: \"eaf422fa-fd33-491a-b182-991116468c18\") " pod="openshift-multus/network-metrics-daemon-c6w8h"
Apr 18 02:46:03.864196 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.864176 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq64z\" (UniqueName: \"kubernetes.io/projected/f8ff51d7-b5a5-4955-a5f0-fcb644339ab3-kube-api-access-pq64z\") pod \"iptables-alerter-5s7gh\" (UID: \"f8ff51d7-b5a5-4955-a5f0-fcb644339ab3\") " pod="openshift-network-operator/iptables-alerter-5s7gh"
Apr 18 02:46:03.864373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.864355 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4j6g\" (UniqueName: \"kubernetes.io/projected/841058db-3583-4d11-853d-8a1d444e8ea6-kube-api-access-v4j6g\") pod \"multus-dbmjj\" (UID: \"841058db-3583-4d11-853d-8a1d444e8ea6\") " pod="openshift-multus/multus-dbmjj"
Apr 18 02:46:03.864915 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.864896 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp7rf\" (UniqueName: \"kubernetes.io/projected/aaaab92e-a9b5-4b30-bd97-ffa98fd1a904-kube-api-access-qp7rf\") pod \"tuned-bgzxp\" (UID: \"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:03.865604 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.865583 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbmrh\" (UniqueName: \"kubernetes.io/projected/ba7fee3b-25c5-45b5-93bd-fe87ba08395f-kube-api-access-sbmrh\") pod \"multus-additional-cni-plugins-d4xn6\" (UID: \"ba7fee3b-25c5-45b5-93bd-fe87ba08395f\") " pod="openshift-multus/multus-additional-cni-plugins-d4xn6" Apr 18 02:46:03.866904 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.866880 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbb2j\" (UniqueName: \"kubernetes.io/projected/df7f443e-28b4-49bc-ad27-c0360b16827c-kube-api-access-bbb2j\") pod \"node-ca-4wx6b\" (UID: \"df7f443e-28b4-49bc-ad27-c0360b16827c\") " pod="openshift-image-registry/node-ca-4wx6b" Apr 18 02:46:03.867160 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.867142 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwbnn\" (UniqueName: \"kubernetes.io/projected/dd5c6619-29f0-4dd3-913e-6532510e87b4-kube-api-access-vwbnn\") pod \"aws-ebs-csi-driver-node-bh9hk\" (UID: \"dd5c6619-29f0-4dd3-913e-6532510e87b4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk" Apr 18 02:46:03.868750 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.868726 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vlwd\" (UniqueName: \"kubernetes.io/projected/d159849f-3b4d-45b7-8f49-9f9f11d96088-kube-api-access-5vlwd\") pod \"ovnkube-node-k8fsv\" (UID: \"d159849f-3b4d-45b7-8f49-9f9f11d96088\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:03.882227 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.882210 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d4xn6" Apr 18 02:46:03.922943 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:03.922721 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d43c470c7e05ac28f68f1360471b9c3.slice/crio-fe09b1b59517b08ef30bd2c6672cbdb706dd8dff66f75f24ae4bc3bb81f992ea WatchSource:0}: Error finding container fe09b1b59517b08ef30bd2c6672cbdb706dd8dff66f75f24ae4bc3bb81f992ea: Status 404 returned error can't find the container with id fe09b1b59517b08ef30bd2c6672cbdb706dd8dff66f75f24ae4bc3bb81f992ea Apr 18 02:46:03.923428 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:03.923394 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4032b4d7aff1379a9c14d5fa7d3dea91.slice/crio-50b280f00e10fee208688a568ee4a179eb7998dbb35608645008ab87f707bbcb WatchSource:0}: Error finding container 50b280f00e10fee208688a568ee4a179eb7998dbb35608645008ab87f707bbcb: Status 404 returned error can't find the container with id 50b280f00e10fee208688a568ee4a179eb7998dbb35608645008ab87f707bbcb Apr 18 02:46:03.924216 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:03.924197 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba7fee3b_25c5_45b5_93bd_fe87ba08395f.slice/crio-5a16e4deaf1e7d053de2874a461c34a9153816c45f3b2c96044042abf64d9aef WatchSource:0}: Error finding container 5a16e4deaf1e7d053de2874a461c34a9153816c45f3b2c96044042abf64d9aef: Status 404 returned error can't find the container with id 5a16e4deaf1e7d053de2874a461c34a9153816c45f3b2c96044042abf64d9aef Apr 18 02:46:03.927707 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:03.927686 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 18 02:46:04.065183 ip-10-0-129-229 kubenswrapper[2574]: I0418 
02:46:04.065154 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5s7gh" Apr 18 02:46:04.070931 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:04.070898 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8ff51d7_b5a5_4955_a5f0_fcb644339ab3.slice/crio-bd9864d72b7da31a6a9ece9bfc70f486c17b5d63978dacbe3a8f58553f2e284a WatchSource:0}: Error finding container bd9864d72b7da31a6a9ece9bfc70f486c17b5d63978dacbe3a8f58553f2e284a: Status 404 returned error can't find the container with id bd9864d72b7da31a6a9ece9bfc70f486c17b5d63978dacbe3a8f58553f2e284a Apr 18 02:46:04.083396 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.083376 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:04.090755 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:04.090726 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd159849f_3b4d_45b7_8f49_9f9f11d96088.slice/crio-04354628fd6c7e900aea6d56b0d27c66dc9e0e32e9ff4837d8b2dcd40e25e0f6 WatchSource:0}: Error finding container 04354628fd6c7e900aea6d56b0d27c66dc9e0e32e9ff4837d8b2dcd40e25e0f6: Status 404 returned error can't find the container with id 04354628fd6c7e900aea6d56b0d27c66dc9e0e32e9ff4837d8b2dcd40e25e0f6 Apr 18 02:46:04.108495 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.108473 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-nctzt" Apr 18 02:46:04.113967 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:04.113947 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2517d1b_5d5f_4341_a8d1_b4646105d5ba.slice/crio-d2424eae9f5f9c7b021a03df3fb1c90f800e394e07299a445ca2801c1c3d5dd5 WatchSource:0}: Error finding container d2424eae9f5f9c7b021a03df3fb1c90f800e394e07299a445ca2801c1c3d5dd5: Status 404 returned error can't find the container with id d2424eae9f5f9c7b021a03df3fb1c90f800e394e07299a445ca2801c1c3d5dd5 Apr 18 02:46:04.119249 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.119231 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk" Apr 18 02:46:04.125901 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:04.125879 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd5c6619_29f0_4dd3_913e_6532510e87b4.slice/crio-ac7c5a39dafc4b917eb9fde7a63c17c448b9f7e7925eecf975ef9ed39a951a90 WatchSource:0}: Error finding container ac7c5a39dafc4b917eb9fde7a63c17c448b9f7e7925eecf975ef9ed39a951a90: Status 404 returned error can't find the container with id ac7c5a39dafc4b917eb9fde7a63c17c448b9f7e7925eecf975ef9ed39a951a90 Apr 18 02:46:04.131947 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.131930 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" Apr 18 02:46:04.136782 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:04.136765 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaaab92e_a9b5_4b30_bd97_ffa98fd1a904.slice/crio-a214c34a80783d5b54d64ebd3ae7ecffa69779b9fc45476bc7645d62c2ac2e34 WatchSource:0}: Error finding container a214c34a80783d5b54d64ebd3ae7ecffa69779b9fc45476bc7645d62c2ac2e34: Status 404 returned error can't find the container with id a214c34a80783d5b54d64ebd3ae7ecffa69779b9fc45476bc7645d62c2ac2e34 Apr 18 02:46:04.145667 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.145652 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4wx6b" Apr 18 02:46:04.150965 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:04.150945 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf7f443e_28b4_49bc_ad27_c0360b16827c.slice/crio-bb85ecc38358171a634a274e050fc7331c0e297493f54a874b85af7f3405d982 WatchSource:0}: Error finding container bb85ecc38358171a634a274e050fc7331c0e297493f54a874b85af7f3405d982: Status 404 returned error can't find the container with id bb85ecc38358171a634a274e050fc7331c0e297493f54a874b85af7f3405d982 Apr 18 02:46:04.160399 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.160382 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-dbmjj" Apr 18 02:46:04.166104 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:04.166083 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod841058db_3583_4d11_853d_8a1d444e8ea6.slice/crio-8cbcfaf1a6fff17a4fdd27eb48d1e4dc45d673cc2e46f44d569d9efa4ac6c3fa WatchSource:0}: Error finding container 8cbcfaf1a6fff17a4fdd27eb48d1e4dc45d673cc2e46f44d569d9efa4ac6c3fa: Status 404 returned error can't find the container with id 8cbcfaf1a6fff17a4fdd27eb48d1e4dc45d673cc2e46f44d569d9efa4ac6c3fa Apr 18 02:46:04.360632 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.360587 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs\") pod \"network-metrics-daemon-c6w8h\" (UID: \"eaf422fa-fd33-491a-b182-991116468c18\") " pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:46:04.360796 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:04.360739 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:46:04.360853 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:04.360804 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs podName:eaf422fa-fd33-491a-b182-991116468c18 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:05.360784905 +0000 UTC m=+3.084338781 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs") pod "network-metrics-daemon-c6w8h" (UID: "eaf422fa-fd33-491a-b182-991116468c18") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:46:04.462495 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.461787 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ksh6\" (UniqueName: \"kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6\") pod \"network-check-target-bjkvs\" (UID: \"02ff8c43-768b-49b2-90a6-30f839c12ea3\") " pod="openshift-network-diagnostics/network-check-target-bjkvs" Apr 18 02:46:04.462495 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:04.461969 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 18 02:46:04.462495 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:04.461987 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 18 02:46:04.462495 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:04.462000 2574 projected.go:194] Error preparing data for projected volume kube-api-access-9ksh6 for pod openshift-network-diagnostics/network-check-target-bjkvs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 18 02:46:04.462495 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:04.462060 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6 podName:02ff8c43-768b-49b2-90a6-30f839c12ea3 nodeName:}" failed. 
No retries permitted until 2026-04-18 02:46:05.462040713 +0000 UTC m=+3.185594609 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9ksh6" (UniqueName: "kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6") pod "network-check-target-bjkvs" (UID: "02ff8c43-768b-49b2-90a6-30f839c12ea3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 18 02:46:04.472677 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.472646 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 18 02:46:04.781913 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.781779 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-17 02:41:03 +0000 UTC" deadline="2027-11-08 10:19:40.109259738 +0000 UTC" Apr 18 02:46:04.781913 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.781813 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13663h33m35.327450528s" Apr 18 02:46:04.889175 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.888414 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-jrk2r"] Apr 18 02:46:04.892081 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.891326 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-jrk2r" Apr 18 02:46:04.895362 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.894459 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-g2fpl\"" Apr 18 02:46:04.895362 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.894701 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 18 02:46:04.904183 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.903940 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 18 02:46:04.906837 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.905903 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4xn6" event={"ID":"ba7fee3b-25c5-45b5-93bd-fe87ba08395f","Type":"ContainerStarted","Data":"5a16e4deaf1e7d053de2874a461c34a9153816c45f3b2c96044042abf64d9aef"} Apr 18 02:46:04.912340 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.912235 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dbmjj" event={"ID":"841058db-3583-4d11-853d-8a1d444e8ea6","Type":"ContainerStarted","Data":"8cbcfaf1a6fff17a4fdd27eb48d1e4dc45d673cc2e46f44d569d9efa4ac6c3fa"} Apr 18 02:46:04.922956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.922917 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk" event={"ID":"dd5c6619-29f0-4dd3-913e-6532510e87b4","Type":"ContainerStarted","Data":"ac7c5a39dafc4b917eb9fde7a63c17c448b9f7e7925eecf975ef9ed39a951a90"} Apr 18 02:46:04.936506 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.936466 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5s7gh" 
event={"ID":"f8ff51d7-b5a5-4955-a5f0-fcb644339ab3","Type":"ContainerStarted","Data":"bd9864d72b7da31a6a9ece9bfc70f486c17b5d63978dacbe3a8f58553f2e284a"} Apr 18 02:46:04.940207 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.940155 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-229.ec2.internal" event={"ID":"1d43c470c7e05ac28f68f1360471b9c3","Type":"ContainerStarted","Data":"fe09b1b59517b08ef30bd2c6672cbdb706dd8dff66f75f24ae4bc3bb81f992ea"} Apr 18 02:46:04.945493 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.945470 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 18 02:46:04.948652 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.948547 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4wx6b" event={"ID":"df7f443e-28b4-49bc-ad27-c0360b16827c","Type":"ContainerStarted","Data":"bb85ecc38358171a634a274e050fc7331c0e297493f54a874b85af7f3405d982"} Apr 18 02:46:04.959907 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.959877 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" event={"ID":"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904","Type":"ContainerStarted","Data":"a214c34a80783d5b54d64ebd3ae7ecffa69779b9fc45476bc7645d62c2ac2e34"} Apr 18 02:46:04.966813 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.966595 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flr98\" (UniqueName: \"kubernetes.io/projected/8976a474-462c-4893-ac54-7572b4e92f46-kube-api-access-flr98\") pod \"node-resolver-jrk2r\" (UID: \"8976a474-462c-4893-ac54-7572b4e92f46\") " pod="openshift-dns/node-resolver-jrk2r" Apr 18 02:46:04.966813 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.966639 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8976a474-462c-4893-ac54-7572b4e92f46-hosts-file\") pod \"node-resolver-jrk2r\" (UID: \"8976a474-462c-4893-ac54-7572b4e92f46\") " pod="openshift-dns/node-resolver-jrk2r" Apr 18 02:46:04.966813 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.966728 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8976a474-462c-4893-ac54-7572b4e92f46-tmp-dir\") pod \"node-resolver-jrk2r\" (UID: \"8976a474-462c-4893-ac54-7572b4e92f46\") " pod="openshift-dns/node-resolver-jrk2r" Apr 18 02:46:04.969711 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.969630 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nctzt" event={"ID":"c2517d1b-5d5f-4341-a8d1-b4646105d5ba","Type":"ContainerStarted","Data":"d2424eae9f5f9c7b021a03df3fb1c90f800e394e07299a445ca2801c1c3d5dd5"} Apr 18 02:46:04.973070 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.973029 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" event={"ID":"d159849f-3b4d-45b7-8f49-9f9f11d96088","Type":"ContainerStarted","Data":"04354628fd6c7e900aea6d56b0d27c66dc9e0e32e9ff4837d8b2dcd40e25e0f6"} Apr 18 02:46:04.988432 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.988361 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal" event={"ID":"4032b4d7aff1379a9c14d5fa7d3dea91","Type":"ContainerStarted","Data":"50b280f00e10fee208688a568ee4a179eb7998dbb35608645008ab87f707bbcb"} Apr 18 02:46:04.992033 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:04.992009 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 18 02:46:05.067370 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:05.067215 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-flr98\" (UniqueName: \"kubernetes.io/projected/8976a474-462c-4893-ac54-7572b4e92f46-kube-api-access-flr98\") pod \"node-resolver-jrk2r\" (UID: \"8976a474-462c-4893-ac54-7572b4e92f46\") " pod="openshift-dns/node-resolver-jrk2r" Apr 18 02:46:05.067370 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:05.067264 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8976a474-462c-4893-ac54-7572b4e92f46-hosts-file\") pod \"node-resolver-jrk2r\" (UID: \"8976a474-462c-4893-ac54-7572b4e92f46\") " pod="openshift-dns/node-resolver-jrk2r" Apr 18 02:46:05.067370 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:05.067296 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8976a474-462c-4893-ac54-7572b4e92f46-tmp-dir\") pod \"node-resolver-jrk2r\" (UID: \"8976a474-462c-4893-ac54-7572b4e92f46\") " pod="openshift-dns/node-resolver-jrk2r" Apr 18 02:46:05.067842 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:05.067754 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8976a474-462c-4893-ac54-7572b4e92f46-hosts-file\") pod \"node-resolver-jrk2r\" (UID: \"8976a474-462c-4893-ac54-7572b4e92f46\") " pod="openshift-dns/node-resolver-jrk2r" Apr 18 02:46:05.069217 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:05.068993 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8976a474-462c-4893-ac54-7572b4e92f46-tmp-dir\") pod \"node-resolver-jrk2r\" (UID: \"8976a474-462c-4893-ac54-7572b4e92f46\") " pod="openshift-dns/node-resolver-jrk2r" Apr 18 02:46:05.091120 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:05.091085 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flr98\" 
(UniqueName: \"kubernetes.io/projected/8976a474-462c-4893-ac54-7572b4e92f46-kube-api-access-flr98\") pod \"node-resolver-jrk2r\" (UID: \"8976a474-462c-4893-ac54-7572b4e92f46\") " pod="openshift-dns/node-resolver-jrk2r" Apr 18 02:46:05.211261 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:05.211224 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jrk2r" Apr 18 02:46:05.369972 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:05.369879 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs\") pod \"network-metrics-daemon-c6w8h\" (UID: \"eaf422fa-fd33-491a-b182-991116468c18\") " pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:46:05.370138 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:05.370061 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:46:05.370138 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:05.370124 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs podName:eaf422fa-fd33-491a-b182-991116468c18 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:07.370106524 +0000 UTC m=+5.093660397 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs") pod "network-metrics-daemon-c6w8h" (UID: "eaf422fa-fd33-491a-b182-991116468c18") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:46:05.471356 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:05.471318 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ksh6\" (UniqueName: \"kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6\") pod \"network-check-target-bjkvs\" (UID: \"02ff8c43-768b-49b2-90a6-30f839c12ea3\") " pod="openshift-network-diagnostics/network-check-target-bjkvs" Apr 18 02:46:05.471519 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:05.471478 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 18 02:46:05.471519 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:05.471497 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 18 02:46:05.471519 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:05.471509 2574 projected.go:194] Error preparing data for projected volume kube-api-access-9ksh6 for pod openshift-network-diagnostics/network-check-target-bjkvs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 18 02:46:05.471740 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:05.471567 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6 podName:02ff8c43-768b-49b2-90a6-30f839c12ea3 nodeName:}" failed. 
No retries permitted until 2026-04-18 02:46:07.471548242 +0000 UTC m=+5.195102132 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-9ksh6" (UniqueName: "kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6") pod "network-check-target-bjkvs" (UID: "02ff8c43-768b-49b2-90a6-30f839c12ea3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 18 02:46:05.782539 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:05.782453 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-17 02:41:03 +0000 UTC" deadline="2027-12-15 03:48:09.787047874 +0000 UTC" Apr 18 02:46:05.782539 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:05.782491 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14545h2m4.004560279s" Apr 18 02:46:05.877624 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:05.877566 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:46:05.877815 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:05.877700 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18" Apr 18 02:46:05.878140 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:05.878111 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs"
Apr 18 02:46:05.878239 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:05.878216 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjkvs" podUID="02ff8c43-768b-49b2-90a6-30f839c12ea3"
Apr 18 02:46:06.000836 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:06.000737 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jrk2r" event={"ID":"8976a474-462c-4893-ac54-7572b4e92f46","Type":"ContainerStarted","Data":"63d2ef245ec24ef3d43bc1809860bcf0948b1debfd217f4f7beab95aea4d0ae8"}
Apr 18 02:46:07.386677 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:07.386613 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs\") pod \"network-metrics-daemon-c6w8h\" (UID: \"eaf422fa-fd33-491a-b182-991116468c18\") " pod="openshift-multus/network-metrics-daemon-c6w8h"
Apr 18 02:46:07.387141 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:07.386799 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:46:07.387141 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:07.386866 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs podName:eaf422fa-fd33-491a-b182-991116468c18 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:11.386847193 +0000 UTC m=+9.110401072 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs") pod "network-metrics-daemon-c6w8h" (UID: "eaf422fa-fd33-491a-b182-991116468c18") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:46:07.487282 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:07.487245 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ksh6\" (UniqueName: \"kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6\") pod \"network-check-target-bjkvs\" (UID: \"02ff8c43-768b-49b2-90a6-30f839c12ea3\") " pod="openshift-network-diagnostics/network-check-target-bjkvs"
Apr 18 02:46:07.487486 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:07.487438 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 18 02:46:07.487486 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:07.487464 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 18 02:46:07.487486 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:07.487478 2574 projected.go:194] Error preparing data for projected volume kube-api-access-9ksh6 for pod openshift-network-diagnostics/network-check-target-bjkvs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:46:07.487672 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:07.487540 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6 podName:02ff8c43-768b-49b2-90a6-30f839c12ea3 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:11.487520669 +0000 UTC m=+9.211074546 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-9ksh6" (UniqueName: "kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6") pod "network-check-target-bjkvs" (UID: "02ff8c43-768b-49b2-90a6-30f839c12ea3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:46:07.878098 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:07.878062 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h"
Apr 18 02:46:07.878289 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:07.878217 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18"
Apr 18 02:46:07.878501 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:07.878351 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs"
Apr 18 02:46:07.878501 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:07.878463 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjkvs" podUID="02ff8c43-768b-49b2-90a6-30f839c12ea3"
Apr 18 02:46:09.877945 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:09.877908 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs"
Apr 18 02:46:09.878494 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:09.878044 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjkvs" podUID="02ff8c43-768b-49b2-90a6-30f839c12ea3"
Apr 18 02:46:09.878494 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:09.878131 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h"
Apr 18 02:46:09.878494 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:09.878232 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18"
Apr 18 02:46:11.417734 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:11.417704 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs\") pod \"network-metrics-daemon-c6w8h\" (UID: \"eaf422fa-fd33-491a-b182-991116468c18\") " pod="openshift-multus/network-metrics-daemon-c6w8h"
Apr 18 02:46:11.418195 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:11.417870 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:46:11.418195 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:11.417942 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs podName:eaf422fa-fd33-491a-b182-991116468c18 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:19.417922361 +0000 UTC m=+17.141476234 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs") pod "network-metrics-daemon-c6w8h" (UID: "eaf422fa-fd33-491a-b182-991116468c18") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:46:11.518602 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:11.518531 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ksh6\" (UniqueName: \"kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6\") pod \"network-check-target-bjkvs\" (UID: \"02ff8c43-768b-49b2-90a6-30f839c12ea3\") " pod="openshift-network-diagnostics/network-check-target-bjkvs"
Apr 18 02:46:11.518790 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:11.518727 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 18 02:46:11.518790 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:11.518749 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 18 02:46:11.518790 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:11.518759 2574 projected.go:194] Error preparing data for projected volume kube-api-access-9ksh6 for pod openshift-network-diagnostics/network-check-target-bjkvs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:46:11.518948 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:11.518816 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6 podName:02ff8c43-768b-49b2-90a6-30f839c12ea3 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:19.51880003 +0000 UTC m=+17.242353904 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-9ksh6" (UniqueName: "kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6") pod "network-check-target-bjkvs" (UID: "02ff8c43-768b-49b2-90a6-30f839c12ea3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:46:11.877800 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:11.877760 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs"
Apr 18 02:46:11.877935 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:11.877895 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjkvs" podUID="02ff8c43-768b-49b2-90a6-30f839c12ea3"
Apr 18 02:46:11.877996 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:11.877947 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h"
Apr 18 02:46:11.878087 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:11.878064 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18"
Apr 18 02:46:13.878121 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:13.878046 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h"
Apr 18 02:46:13.878503 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:13.878046 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs"
Apr 18 02:46:13.878503 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:13.878182 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18"
Apr 18 02:46:13.878503 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:13.878218 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjkvs" podUID="02ff8c43-768b-49b2-90a6-30f839c12ea3"
Apr 18 02:46:15.878173 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:15.878139 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs"
Apr 18 02:46:15.878637 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:15.878269 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjkvs" podUID="02ff8c43-768b-49b2-90a6-30f839c12ea3"
Apr 18 02:46:15.878637 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:15.878145 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h"
Apr 18 02:46:15.878637 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:15.878395 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18"
Apr 18 02:46:17.877795 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:17.877755 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs"
Apr 18 02:46:17.878237 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:17.877772 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h"
Apr 18 02:46:17.878237 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:17.877874 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjkvs" podUID="02ff8c43-768b-49b2-90a6-30f839c12ea3"
Apr 18 02:46:17.878237 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:17.877965 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18"
Apr 18 02:46:19.478243 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:19.478196 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs\") pod \"network-metrics-daemon-c6w8h\" (UID: \"eaf422fa-fd33-491a-b182-991116468c18\") " pod="openshift-multus/network-metrics-daemon-c6w8h"
Apr 18 02:46:19.478719 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:19.478364 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:46:19.478719 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:19.478442 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs podName:eaf422fa-fd33-491a-b182-991116468c18 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:35.478420871 +0000 UTC m=+33.201974745 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs") pod "network-metrics-daemon-c6w8h" (UID: "eaf422fa-fd33-491a-b182-991116468c18") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:46:19.579502 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:19.579466 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ksh6\" (UniqueName: \"kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6\") pod \"network-check-target-bjkvs\" (UID: \"02ff8c43-768b-49b2-90a6-30f839c12ea3\") " pod="openshift-network-diagnostics/network-check-target-bjkvs"
Apr 18 02:46:19.579717 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:19.579617 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 18 02:46:19.579717 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:19.579637 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 18 02:46:19.579717 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:19.579650 2574 projected.go:194] Error preparing data for projected volume kube-api-access-9ksh6 for pod openshift-network-diagnostics/network-check-target-bjkvs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:46:19.579717 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:19.579714 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6 podName:02ff8c43-768b-49b2-90a6-30f839c12ea3 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:35.579696091 +0000 UTC m=+33.303249990 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-9ksh6" (UniqueName: "kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6") pod "network-check-target-bjkvs" (UID: "02ff8c43-768b-49b2-90a6-30f839c12ea3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:46:19.878059 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:19.877979 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs"
Apr 18 02:46:19.878217 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:19.877983 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h"
Apr 18 02:46:19.878217 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:19.878094 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjkvs" podUID="02ff8c43-768b-49b2-90a6-30f839c12ea3"
Apr 18 02:46:19.878217 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:19.878201 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18"
Apr 18 02:46:21.877432 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:21.877402 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs"
Apr 18 02:46:21.877816 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:21.877408 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h"
Apr 18 02:46:21.877816 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:21.877517 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjkvs" podUID="02ff8c43-768b-49b2-90a6-30f839c12ea3"
Apr 18 02:46:21.877816 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:21.877571 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18"
Apr 18 02:46:23.032357 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.031968 2574 generic.go:358] "Generic (PLEG): container finished" podID="ba7fee3b-25c5-45b5-93bd-fe87ba08395f" containerID="23cf0592819c0db15e9a35695e16fe80211de5d473ace94a22b1f06bb8a853cf" exitCode=0
Apr 18 02:46:23.033190 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.032055 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4xn6" event={"ID":"ba7fee3b-25c5-45b5-93bd-fe87ba08395f","Type":"ContainerDied","Data":"23cf0592819c0db15e9a35695e16fe80211de5d473ace94a22b1f06bb8a853cf"}
Apr 18 02:46:23.033973 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.033698 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dbmjj" event={"ID":"841058db-3583-4d11-853d-8a1d444e8ea6","Type":"ContainerStarted","Data":"2ab2e6353fa642cf0515b27e75f205df78410ec6095846bec937580fe22523b4"}
Apr 18 02:46:23.034889 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.034872 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk" event={"ID":"dd5c6619-29f0-4dd3-913e-6532510e87b4","Type":"ContainerStarted","Data":"bc0ccc5c92c99636655079f5e085d9cbc5aaa382f650c389d8720b0206fb8312"}
Apr 18 02:46:23.035992 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.035963 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-229.ec2.internal" event={"ID":"1d43c470c7e05ac28f68f1360471b9c3","Type":"ContainerStarted","Data":"7550547f179e6ad6364470b1423d3efb62cf514bda1b10ee206c6d392e8f10ed"}
Apr 18 02:46:23.037079 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.037059 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jrk2r" event={"ID":"8976a474-462c-4893-ac54-7572b4e92f46","Type":"ContainerStarted","Data":"703e88873702c3f353e4038b3f314e230012abd3552c261b49f3dae76f93372e"}
Apr 18 02:46:23.038186 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.038166 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4wx6b" event={"ID":"df7f443e-28b4-49bc-ad27-c0360b16827c","Type":"ContainerStarted","Data":"83719004258fd0765f6b4b3b14ab04797ae99335ae6519e2dd6dfe16973a125f"}
Apr 18 02:46:23.039682 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.039662 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" event={"ID":"aaaab92e-a9b5-4b30-bd97-ffa98fd1a904","Type":"ContainerStarted","Data":"5f5e8a641b3d84261f7dcabe2b97a2e1132840891a924196171700e52ba60c83"}
Apr 18 02:46:23.040747 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.040728 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nctzt" event={"ID":"c2517d1b-5d5f-4341-a8d1-b4646105d5ba","Type":"ContainerStarted","Data":"0eb06e345ca7117ddde46496ab091148ae77b0f342d8b81086d95707896607ee"}
Apr 18 02:46:23.042622 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.042605 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log"
Apr 18 02:46:23.042884 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.042868 2574 generic.go:358] "Generic (PLEG): container finished" podID="d159849f-3b4d-45b7-8f49-9f9f11d96088" containerID="3332c20012bf0eba82ac82341eef4252083f0279dfffcb0430560a0ac8efed58" exitCode=1
Apr 18 02:46:23.042939 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.042915 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" event={"ID":"d159849f-3b4d-45b7-8f49-9f9f11d96088","Type":"ContainerStarted","Data":"7bd6c79ea0413399729f6fb1865d44f3c0fde64c254a6f6ef3911d1aa47d7d31"}
Apr 18 02:46:23.042939 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.042931 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" event={"ID":"d159849f-3b4d-45b7-8f49-9f9f11d96088","Type":"ContainerStarted","Data":"f6e94a50b5b01fe97df67d3491a7ba41b956f53cbc16ae8f78f155c4aed4bb80"}
Apr 18 02:46:23.043039 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.042942 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" event={"ID":"d159849f-3b4d-45b7-8f49-9f9f11d96088","Type":"ContainerStarted","Data":"f75e16f5634d6216d986d3cafca517bf02f20bb7e59bfa6110484e75de60e3a5"}
Apr 18 02:46:23.043039 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.042954 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" event={"ID":"d159849f-3b4d-45b7-8f49-9f9f11d96088","Type":"ContainerStarted","Data":"4ce8f046aecb908d55182cdd1aeb05033378f1810b1affa3adb412f76c45d9e7"}
Apr 18 02:46:23.043039 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.042961 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" event={"ID":"d159849f-3b4d-45b7-8f49-9f9f11d96088","Type":"ContainerDied","Data":"3332c20012bf0eba82ac82341eef4252083f0279dfffcb0430560a0ac8efed58"}
Apr 18 02:46:23.043039 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.042971 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" event={"ID":"d159849f-3b4d-45b7-8f49-9f9f11d96088","Type":"ContainerStarted","Data":"1e19a28c0cbcb09b5b5eff5bcd458e59f001a66e615177dddd12b99e0ee336f6"}
Apr 18 02:46:23.044068 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.044050 2574 generic.go:358] "Generic (PLEG): container finished" podID="4032b4d7aff1379a9c14d5fa7d3dea91" containerID="c53cdcf9c298e4af3afc2579afac16ad6e535900edaa191d0af396165cf75960" exitCode=0
Apr 18 02:46:23.044145 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.044076 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal" event={"ID":"4032b4d7aff1379a9c14d5fa7d3dea91","Type":"ContainerDied","Data":"c53cdcf9c298e4af3afc2579afac16ad6e535900edaa191d0af396165cf75960"}
Apr 18 02:46:23.066799 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.066738 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4wx6b" podStartSLOduration=3.363218813 podStartE2EDuration="21.06672017s" podCreationTimestamp="2026-04-18 02:46:02 +0000 UTC" firstStartedPulling="2026-04-18 02:46:04.152262687 +0000 UTC m=+1.875816560" lastFinishedPulling="2026-04-18 02:46:21.855764029 +0000 UTC m=+19.579317917" observedRunningTime="2026-04-18 02:46:23.066577329 +0000 UTC m=+20.790131223" watchObservedRunningTime="2026-04-18 02:46:23.06672017 +0000 UTC m=+20.790274066"
Apr 18 02:46:23.087933 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.087878 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dbmjj" podStartSLOduration=3.21501679 podStartE2EDuration="21.087860582s" podCreationTimestamp="2026-04-18 02:46:02 +0000 UTC" firstStartedPulling="2026-04-18 02:46:04.168009201 +0000 UTC m=+1.891563074" lastFinishedPulling="2026-04-18 02:46:22.040852991 +0000 UTC m=+19.764406866" observedRunningTime="2026-04-18 02:46:23.087497774 +0000 UTC m=+20.811051670" watchObservedRunningTime="2026-04-18 02:46:23.087860582 +0000 UTC m=+20.811414477"
Apr 18 02:46:23.140849 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.140798 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nctzt" podStartSLOduration=8.111125871 podStartE2EDuration="21.140779974s" podCreationTimestamp="2026-04-18 02:46:02 +0000 UTC" firstStartedPulling="2026-04-18 02:46:04.115183532 +0000 UTC m=+1.838737405" lastFinishedPulling="2026-04-18 02:46:17.144837618 +0000 UTC m=+14.868391508" observedRunningTime="2026-04-18 02:46:23.106700504 +0000 UTC m=+20.830254400" watchObservedRunningTime="2026-04-18 02:46:23.140779974 +0000 UTC m=+20.864333874"
Apr 18 02:46:23.141092 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.140970 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bgzxp" podStartSLOduration=3.423400381 podStartE2EDuration="21.140960643s" podCreationTimestamp="2026-04-18 02:46:02 +0000 UTC" firstStartedPulling="2026-04-18 02:46:04.138172538 +0000 UTC m=+1.861726411" lastFinishedPulling="2026-04-18 02:46:21.855732789 +0000 UTC m=+19.579286673" observedRunningTime="2026-04-18 02:46:23.140625858 +0000 UTC m=+20.864179754" watchObservedRunningTime="2026-04-18 02:46:23.140960643 +0000 UTC m=+20.864514749"
Apr 18 02:46:23.160705 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.160663 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jrk2r" podStartSLOduration=2.5316496109999997 podStartE2EDuration="19.160650076s" podCreationTimestamp="2026-04-18 02:46:04 +0000 UTC" firstStartedPulling="2026-04-18 02:46:05.25405212 +0000 UTC m=+2.977605999" lastFinishedPulling="2026-04-18 02:46:21.883052591 +0000 UTC m=+19.606606464" observedRunningTime="2026-04-18 02:46:23.160295632 +0000 UTC m=+20.883849527" watchObservedRunningTime="2026-04-18 02:46:23.160650076 +0000 UTC m=+20.884203971"
Apr 18 02:46:23.178541 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.178501 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-229.ec2.internal" podStartSLOduration=20.178486392 podStartE2EDuration="20.178486392s" podCreationTimestamp="2026-04-18 02:46:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:46:23.17821728 +0000 UTC m=+20.901771175" watchObservedRunningTime="2026-04-18 02:46:23.178486392 +0000 UTC m=+20.902040286"
Apr 18 02:46:23.620757 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.620529 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 18 02:46:23.821748 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.821538 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-18T02:46:23.620755494Z","UUID":"1be21d9f-4dac-4c14-9379-b787c136c450","Handler":null,"Name":"","Endpoint":""}
Apr 18 02:46:23.824735 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.824695 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 18 02:46:23.824735 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.824731 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 18 02:46:23.877369 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.877333 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h"
Apr 18 02:46:23.877531 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:23.877333 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs"
Apr 18 02:46:23.877531 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:23.877467 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18"
Apr 18 02:46:23.877531 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:23.877518 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjkvs" podUID="02ff8c43-768b-49b2-90a6-30f839c12ea3"
Apr 18 02:46:24.047699 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:24.047656 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk" event={"ID":"dd5c6619-29f0-4dd3-913e-6532510e87b4","Type":"ContainerStarted","Data":"ad3ddf441da12c382561f6b17ee2b5806caf976ff96c2ccc0a4df701f433749c"}
Apr 18 02:46:24.048804 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:24.048780 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5s7gh" event={"ID":"f8ff51d7-b5a5-4955-a5f0-fcb644339ab3","Type":"ContainerStarted","Data":"a85ecc023b375093574db74a6b7e6a280af9e458e4b65f3b251f4e5d41c150bd"}
Apr 18 02:46:24.050423 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:24.050401 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal" event={"ID":"4032b4d7aff1379a9c14d5fa7d3dea91","Type":"ContainerStarted","Data":"f7f2be0f16aa4e1bfab3370672b9392abdc04ced6ef6225ad4b7df76c5f47719"}
Apr 18 02:46:24.062539 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:24.062503 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5s7gh" podStartSLOduration=4.279119464 podStartE2EDuration="22.062488172s" podCreationTimestamp="2026-04-18 02:46:02 +0000 UTC" firstStartedPulling="2026-04-18 02:46:04.072386148 +0000 UTC m=+1.795940021" lastFinishedPulling="2026-04-18 02:46:21.855754853 +0000 UTC m=+19.579308729" observedRunningTime="2026-04-18 02:46:24.061937037 +0000 UTC m=+21.785490933" watchObservedRunningTime="2026-04-18 02:46:24.062488172 +0000 UTC m=+21.786042069"
Apr 18 02:46:24.074860 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:24.074781 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-229.ec2.internal" podStartSLOduration=21.074766513 podStartE2EDuration="21.074766513s" podCreationTimestamp="2026-04-18 02:46:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:46:24.074670488 +0000 UTC m=+21.798224383" watchObservedRunningTime="2026-04-18 02:46:24.074766513 +0000 UTC m=+21.798320408"
Apr 18 02:46:24.734984 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:24.734952 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nctzt"
Apr 18 02:46:24.735669 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:24.735649 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nctzt"
Apr 18 02:46:25.055965 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:25.055898 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log"
Apr 18 02:46:25.056649 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:25.056270 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" event={"ID":"d159849f-3b4d-45b7-8f49-9f9f11d96088","Type":"ContainerStarted","Data":"587041db2856cac3650b3b89460a4c681d880a96fb8c956bd8c270601bfed4a6"}
Apr 18 02:46:25.058445 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:25.058415 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk" event={"ID":"dd5c6619-29f0-4dd3-913e-6532510e87b4","Type":"ContainerStarted","Data":"8c3114e9efce3cf6aed8473c82aeec7ccfb611ecd6c0cf7a9408c3f05f57b7fc"}
Apr 18 02:46:25.059121 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:25.059082 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nctzt"
Apr 18 02:46:25.059600 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:25.059583 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nctzt"
Apr 18 02:46:25.074077 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:25.074029 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bh9hk" podStartSLOduration=2.723982485 podStartE2EDuration="23.074013631s" podCreationTimestamp="2026-04-18 02:46:02 +0000 UTC" firstStartedPulling="2026-04-18 02:46:04.12753515 +0000 UTC m=+1.851089023" lastFinishedPulling="2026-04-18 02:46:24.477566283 +0000 UTC m=+22.201120169" observedRunningTime="2026-04-18 02:46:25.073755597 +0000 UTC m=+22.797309493" watchObservedRunningTime="2026-04-18 02:46:25.074013631 +0000 UTC m=+22.797567527"
Apr 18 02:46:25.878185 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:25.878145 2574 util.go:30] "No sandbox for pod can
be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs" Apr 18 02:46:25.878397 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:25.878150 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:46:25.878397 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:25.878268 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjkvs" podUID="02ff8c43-768b-49b2-90a6-30f839c12ea3" Apr 18 02:46:25.878397 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:25.878387 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18" Apr 18 02:46:27.877512 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:27.877481 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs" Apr 18 02:46:27.877512 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:27.877507 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:46:27.878279 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:27.877602 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjkvs" podUID="02ff8c43-768b-49b2-90a6-30f839c12ea3" Apr 18 02:46:27.878279 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:27.877728 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18" Apr 18 02:46:28.068579 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:28.068424 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log" Apr 18 02:46:29.073808 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:29.073779 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log" Apr 18 02:46:29.074296 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:29.074113 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" event={"ID":"d159849f-3b4d-45b7-8f49-9f9f11d96088","Type":"ContainerStarted","Data":"211f014439af8dcae63097931baf39f56f9e9c2acbbe6ac7915044ef9e9b8c88"} Apr 18 02:46:29.074449 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:29.074430 2574 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:29.074764 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:29.074741 2574 scope.go:117] "RemoveContainer" containerID="3332c20012bf0eba82ac82341eef4252083f0279dfffcb0430560a0ac8efed58" Apr 18 02:46:29.075985 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:29.075825 2574 generic.go:358] "Generic (PLEG): container finished" podID="ba7fee3b-25c5-45b5-93bd-fe87ba08395f" containerID="c1fb2f806001d1a321313a9c7b4b40644f06155c7eb1eae4798e32b1c8b45cd7" exitCode=0 Apr 18 02:46:29.075985 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:29.075863 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4xn6" event={"ID":"ba7fee3b-25c5-45b5-93bd-fe87ba08395f","Type":"ContainerDied","Data":"c1fb2f806001d1a321313a9c7b4b40644f06155c7eb1eae4798e32b1c8b45cd7"} Apr 18 02:46:29.089369 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:29.089350 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:29.877907 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:29.877696 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs" Apr 18 02:46:29.878067 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:29.877778 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:46:29.878067 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:29.878033 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bjkvs" podUID="02ff8c43-768b-49b2-90a6-30f839c12ea3" Apr 18 02:46:29.878166 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:29.878109 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18" Apr 18 02:46:30.009130 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:30.009100 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bjkvs"] Apr 18 02:46:30.011881 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:30.011859 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c6w8h"] Apr 18 02:46:30.080774 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:30.080704 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log" Apr 18 02:46:30.081338 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:30.081092 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:46:30.081338 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:30.081092 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" event={"ID":"d159849f-3b4d-45b7-8f49-9f9f11d96088","Type":"ContainerStarted","Data":"f34904d5dfef9512ac589fa12afc90ae1183b8b224eb967d6a4028ac591c5150"} Apr 18 02:46:30.081338 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:30.081203 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs" Apr 18 02:46:30.081338 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:30.081213 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18" Apr 18 02:46:30.081338 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:30.081238 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:30.081338 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:30.081273 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:30.081585 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:30.081381 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bjkvs" podUID="02ff8c43-768b-49b2-90a6-30f839c12ea3" Apr 18 02:46:30.095193 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:30.095171 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:46:30.107069 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:30.107020 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" podStartSLOduration=10.275091468 podStartE2EDuration="28.10700791s" podCreationTimestamp="2026-04-18 02:46:02 +0000 UTC" firstStartedPulling="2026-04-18 02:46:04.092157684 +0000 UTC m=+1.815711557" lastFinishedPulling="2026-04-18 02:46:21.924074125 +0000 UTC m=+19.647627999" observedRunningTime="2026-04-18 02:46:30.105336263 +0000 UTC m=+27.828890159" watchObservedRunningTime="2026-04-18 02:46:30.10700791 +0000 UTC m=+27.830561805" Apr 18 02:46:31.087604 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:31.087572 2574 generic.go:358] "Generic (PLEG): container finished" podID="ba7fee3b-25c5-45b5-93bd-fe87ba08395f" containerID="728b95f2f2afa5a7445254eb3d75d037a419b1e376a17515d8b4b262c426035f" exitCode=0 Apr 18 02:46:31.088090 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:31.087646 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4xn6" event={"ID":"ba7fee3b-25c5-45b5-93bd-fe87ba08395f","Type":"ContainerDied","Data":"728b95f2f2afa5a7445254eb3d75d037a419b1e376a17515d8b4b262c426035f"} Apr 18 02:46:31.877196 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:31.877163 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:46:31.877396 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:31.877163 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs" Apr 18 02:46:31.877396 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:31.877327 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18" Apr 18 02:46:31.877396 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:31.877377 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjkvs" podUID="02ff8c43-768b-49b2-90a6-30f839c12ea3" Apr 18 02:46:32.091386 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:32.091350 2574 generic.go:358] "Generic (PLEG): container finished" podID="ba7fee3b-25c5-45b5-93bd-fe87ba08395f" containerID="9dbadec30e0412e9953234d6043f43844674ce3c30155381c6f80596d8e0ccec" exitCode=0 Apr 18 02:46:32.091768 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:32.091438 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4xn6" event={"ID":"ba7fee3b-25c5-45b5-93bd-fe87ba08395f","Type":"ContainerDied","Data":"9dbadec30e0412e9953234d6043f43844674ce3c30155381c6f80596d8e0ccec"} Apr 18 02:46:33.877740 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:33.877708 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs" Apr 18 02:46:33.878168 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:33.877836 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjkvs" podUID="02ff8c43-768b-49b2-90a6-30f839c12ea3" Apr 18 02:46:33.878168 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:33.877901 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:46:33.878168 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:33.878035 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18" Apr 18 02:46:34.051085 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.051054 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-229.ec2.internal" event="NodeReady" Apr 18 02:46:34.051276 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.051224 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 18 02:46:34.093592 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.093558 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jmr6g"] Apr 18 02:46:34.098371 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.098339 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-59c6n"] Apr 18 02:46:34.098518 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.098489 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jmr6g" Apr 18 02:46:34.101174 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.101107 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 18 02:46:34.101174 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.101148 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-djfrv\"" Apr 18 02:46:34.101174 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.101161 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 18 02:46:34.101617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.101599 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-59c6n" Apr 18 02:46:34.104377 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.104152 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 18 02:46:34.104377 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.104187 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dwvkt\"" Apr 18 02:46:34.104558 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.104437 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 18 02:46:34.104637 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.104623 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 18 02:46:34.105467 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.105426 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jmr6g"] Apr 18 02:46:34.107604 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.107577 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-59c6n"] Apr 18 02:46:34.193650 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.193618 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g" Apr 18 02:46:34.193650 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.193652 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf5032f8-0827-4cc5-8381-d39ca8db84ee-tmp-dir\") pod 
\"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g" Apr 18 02:46:34.193899 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.193764 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert\") pod \"ingress-canary-59c6n\" (UID: \"3be6e6c6-5134-4428-888c-4efe46336918\") " pod="openshift-ingress-canary/ingress-canary-59c6n" Apr 18 02:46:34.193899 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.193794 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lp68\" (UniqueName: \"kubernetes.io/projected/3be6e6c6-5134-4428-888c-4efe46336918-kube-api-access-6lp68\") pod \"ingress-canary-59c6n\" (UID: \"3be6e6c6-5134-4428-888c-4efe46336918\") " pod="openshift-ingress-canary/ingress-canary-59c6n" Apr 18 02:46:34.193899 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.193813 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69h2r\" (UniqueName: \"kubernetes.io/projected/cf5032f8-0827-4cc5-8381-d39ca8db84ee-kube-api-access-69h2r\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g" Apr 18 02:46:34.193899 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.193869 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf5032f8-0827-4cc5-8381-d39ca8db84ee-config-volume\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g" Apr 18 02:46:34.294956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.294915 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lp68\" (UniqueName: 
\"kubernetes.io/projected/3be6e6c6-5134-4428-888c-4efe46336918-kube-api-access-6lp68\") pod \"ingress-canary-59c6n\" (UID: \"3be6e6c6-5134-4428-888c-4efe46336918\") " pod="openshift-ingress-canary/ingress-canary-59c6n" Apr 18 02:46:34.294956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.294960 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69h2r\" (UniqueName: \"kubernetes.io/projected/cf5032f8-0827-4cc5-8381-d39ca8db84ee-kube-api-access-69h2r\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g" Apr 18 02:46:34.295186 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.294999 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf5032f8-0827-4cc5-8381-d39ca8db84ee-config-volume\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g" Apr 18 02:46:34.295186 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.295036 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g" Apr 18 02:46:34.295186 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.295055 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf5032f8-0827-4cc5-8381-d39ca8db84ee-tmp-dir\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g" Apr 18 02:46:34.295186 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:34.295142 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 18 02:46:34.295444 
ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:34.295201 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls podName:cf5032f8-0827-4cc5-8381-d39ca8db84ee nodeName:}" failed. No retries permitted until 2026-04-18 02:46:34.795181799 +0000 UTC m=+32.518735672 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls") pod "dns-default-jmr6g" (UID: "cf5032f8-0827-4cc5-8381-d39ca8db84ee") : secret "dns-default-metrics-tls" not found Apr 18 02:46:34.295444 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.295240 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert\") pod \"ingress-canary-59c6n\" (UID: \"3be6e6c6-5134-4428-888c-4efe46336918\") " pod="openshift-ingress-canary/ingress-canary-59c6n" Apr 18 02:46:34.295444 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:34.295369 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 18 02:46:34.295444 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:34.295416 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert podName:3be6e6c6-5134-4428-888c-4efe46336918 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:34.795402538 +0000 UTC m=+32.518956424 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert") pod "ingress-canary-59c6n" (UID: "3be6e6c6-5134-4428-888c-4efe46336918") : secret "canary-serving-cert" not found Apr 18 02:46:34.295669 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.295561 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf5032f8-0827-4cc5-8381-d39ca8db84ee-tmp-dir\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g" Apr 18 02:46:34.295723 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.295671 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf5032f8-0827-4cc5-8381-d39ca8db84ee-config-volume\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g" Apr 18 02:46:34.306681 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.306553 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69h2r\" (UniqueName: \"kubernetes.io/projected/cf5032f8-0827-4cc5-8381-d39ca8db84ee-kube-api-access-69h2r\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g" Apr 18 02:46:34.306681 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.306657 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lp68\" (UniqueName: \"kubernetes.io/projected/3be6e6c6-5134-4428-888c-4efe46336918-kube-api-access-6lp68\") pod \"ingress-canary-59c6n\" (UID: \"3be6e6c6-5134-4428-888c-4efe46336918\") " pod="openshift-ingress-canary/ingress-canary-59c6n" Apr 18 02:46:34.799038 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.798992 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert\") pod \"ingress-canary-59c6n\" (UID: \"3be6e6c6-5134-4428-888c-4efe46336918\") " pod="openshift-ingress-canary/ingress-canary-59c6n" Apr 18 02:46:34.799038 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:34.799075 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g" Apr 18 02:46:34.799553 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:34.799165 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 18 02:46:34.799553 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:34.799204 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 18 02:46:34.799553 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:34.799248 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert podName:3be6e6c6-5134-4428-888c-4efe46336918 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:35.799222976 +0000 UTC m=+33.522776861 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert") pod "ingress-canary-59c6n" (UID: "3be6e6c6-5134-4428-888c-4efe46336918") : secret "canary-serving-cert" not found Apr 18 02:46:34.799553 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:34.799267 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls podName:cf5032f8-0827-4cc5-8381-d39ca8db84ee nodeName:}" failed. 
No retries permitted until 2026-04-18 02:46:35.799258192 +0000 UTC m=+33.522812070 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls") pod "dns-default-jmr6g" (UID: "cf5032f8-0827-4cc5-8381-d39ca8db84ee") : secret "dns-default-metrics-tls" not found Apr 18 02:46:35.503852 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:35.503811 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs\") pod \"network-metrics-daemon-c6w8h\" (UID: \"eaf422fa-fd33-491a-b182-991116468c18\") " pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:46:35.504292 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:35.503957 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 18 02:46:35.504292 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:35.504026 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs podName:eaf422fa-fd33-491a-b182-991116468c18 nodeName:}" failed. No retries permitted until 2026-04-18 02:47:07.504008177 +0000 UTC m=+65.227562054 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs") pod "network-metrics-daemon-c6w8h" (UID: "eaf422fa-fd33-491a-b182-991116468c18") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 18 02:46:35.605525 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:35.605477 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ksh6\" (UniqueName: \"kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6\") pod \"network-check-target-bjkvs\" (UID: \"02ff8c43-768b-49b2-90a6-30f839c12ea3\") " pod="openshift-network-diagnostics/network-check-target-bjkvs"
Apr 18 02:46:35.605736 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:35.605718 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 18 02:46:35.605794 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:35.605745 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 18 02:46:35.605794 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:35.605788 2574 projected.go:194] Error preparing data for projected volume kube-api-access-9ksh6 for pod openshift-network-diagnostics/network-check-target-bjkvs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:46:35.605891 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:35.605856 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6 podName:02ff8c43-768b-49b2-90a6-30f839c12ea3 nodeName:}" failed. No retries permitted until 2026-04-18 02:47:07.605829657 +0000 UTC m=+65.329383533 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-9ksh6" (UniqueName: "kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6") pod "network-check-target-bjkvs" (UID: "02ff8c43-768b-49b2-90a6-30f839c12ea3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 18 02:46:35.807354 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:35.807256 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g"
Apr 18 02:46:35.807354 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:35.807353 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert\") pod \"ingress-canary-59c6n\" (UID: \"3be6e6c6-5134-4428-888c-4efe46336918\") " pod="openshift-ingress-canary/ingress-canary-59c6n"
Apr 18 02:46:35.807543 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:35.807419 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 18 02:46:35.807543 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:35.807452 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 18 02:46:35.807543 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:35.807491 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert podName:3be6e6c6-5134-4428-888c-4efe46336918 nodeName:}" failed.
No retries permitted until 2026-04-18 02:46:37.807474601 +0000 UTC m=+35.531028474 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert") pod "ingress-canary-59c6n" (UID: "3be6e6c6-5134-4428-888c-4efe46336918") : secret "canary-serving-cert" not found
Apr 18 02:46:35.807543 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:35.807506 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls podName:cf5032f8-0827-4cc5-8381-d39ca8db84ee nodeName:}" failed. No retries permitted until 2026-04-18 02:46:37.807498779 +0000 UTC m=+35.531052654 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls") pod "dns-default-jmr6g" (UID: "cf5032f8-0827-4cc5-8381-d39ca8db84ee") : secret "dns-default-metrics-tls" not found
Apr 18 02:46:35.877871 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:35.877833 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h"
Apr 18 02:46:35.878052 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:35.877833 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs"
Apr 18 02:46:35.882152 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:35.882128 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 18 02:46:35.882293 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:35.882150 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h2hff\""
Apr 18 02:46:35.882659 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:35.882478 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 18 02:46:35.882659 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:35.882514 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5nn98\""
Apr 18 02:46:35.882659 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:35.882555 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 18 02:46:37.824007 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:37.823983 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert\") pod \"ingress-canary-59c6n\" (UID: \"3be6e6c6-5134-4428-888c-4efe46336918\") " pod="openshift-ingress-canary/ingress-canary-59c6n"
Apr 18 02:46:37.824385 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:37.824033 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g"
Apr 18 02:46:37.824385 ip-10-0-129-229
kubenswrapper[2574]: E0418 02:46:37.824137 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 18 02:46:37.824385 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:37.824150 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 18 02:46:37.824385 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:37.824197 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls podName:cf5032f8-0827-4cc5-8381-d39ca8db84ee nodeName:}" failed. No retries permitted until 2026-04-18 02:46:41.824182542 +0000 UTC m=+39.547736419 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls") pod "dns-default-jmr6g" (UID: "cf5032f8-0827-4cc5-8381-d39ca8db84ee") : secret "dns-default-metrics-tls" not found
Apr 18 02:46:37.824385 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:37.824211 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert podName:3be6e6c6-5134-4428-888c-4efe46336918 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:41.824205671 +0000 UTC m=+39.547759544 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert") pod "ingress-canary-59c6n" (UID: "3be6e6c6-5134-4428-888c-4efe46336918") : secret "canary-serving-cert" not found
Apr 18 02:46:38.104281 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:38.104251 2574 generic.go:358] "Generic (PLEG): container finished" podID="ba7fee3b-25c5-45b5-93bd-fe87ba08395f" containerID="1f5657e21736384bc412582c16f97231e0c995af2368ecd013630b4561457841" exitCode=0
Apr 18 02:46:38.104456 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:38.104315 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4xn6" event={"ID":"ba7fee3b-25c5-45b5-93bd-fe87ba08395f","Type":"ContainerDied","Data":"1f5657e21736384bc412582c16f97231e0c995af2368ecd013630b4561457841"}
Apr 18 02:46:39.108587 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:39.108556 2574 generic.go:358] "Generic (PLEG): container finished" podID="ba7fee3b-25c5-45b5-93bd-fe87ba08395f" containerID="43ec8eacbc880e08c372fdd66716494782eddeb178fedf4163f0c59097c9c71a" exitCode=0
Apr 18 02:46:39.109009 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:39.108612 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4xn6" event={"ID":"ba7fee3b-25c5-45b5-93bd-fe87ba08395f","Type":"ContainerDied","Data":"43ec8eacbc880e08c372fdd66716494782eddeb178fedf4163f0c59097c9c71a"}
Apr 18 02:46:40.113342 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:40.113136 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4xn6" event={"ID":"ba7fee3b-25c5-45b5-93bd-fe87ba08395f","Type":"ContainerStarted","Data":"8c5d6a1b327e5314322739e25df78f59002c0f7c3fd4fc4d47b58394b72458a3"}
Apr 18 02:46:40.135318 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:40.135250 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-multus/multus-additional-cni-plugins-d4xn6" podStartSLOduration=3.326537233 podStartE2EDuration="37.135237018s" podCreationTimestamp="2026-04-18 02:46:03 +0000 UTC" firstStartedPulling="2026-04-18 02:46:03.928076292 +0000 UTC m=+1.651630168" lastFinishedPulling="2026-04-18 02:46:37.736776076 +0000 UTC m=+35.460329953" observedRunningTime="2026-04-18 02:46:40.133561629 +0000 UTC m=+37.857115525" watchObservedRunningTime="2026-04-18 02:46:40.135237018 +0000 UTC m=+37.858790912"
Apr 18 02:46:41.854982 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:41.854947 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g"
Apr 18 02:46:41.855409 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:41.855006 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert\") pod \"ingress-canary-59c6n\" (UID: \"3be6e6c6-5134-4428-888c-4efe46336918\") " pod="openshift-ingress-canary/ingress-canary-59c6n"
Apr 18 02:46:41.855409 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:41.855115 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 18 02:46:41.855409 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:41.855172 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert podName:3be6e6c6-5134-4428-888c-4efe46336918 nodeName:}" failed. No retries permitted until 2026-04-18 02:46:49.855159093 +0000 UTC m=+47.578712966 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert") pod "ingress-canary-59c6n" (UID: "3be6e6c6-5134-4428-888c-4efe46336918") : secret "canary-serving-cert" not found
Apr 18 02:46:41.855409 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:41.855115 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 18 02:46:41.855409 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:41.855245 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls podName:cf5032f8-0827-4cc5-8381-d39ca8db84ee nodeName:}" failed. No retries permitted until 2026-04-18 02:46:49.855231966 +0000 UTC m=+47.578785852 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls") pod "dns-default-jmr6g" (UID: "cf5032f8-0827-4cc5-8381-d39ca8db84ee") : secret "dns-default-metrics-tls" not found
Apr 18 02:46:49.345786 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.345750 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp"]
Apr 18 02:46:49.387054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.387023 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp"]
Apr 18 02:46:49.387054 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.387051 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"]
Apr 18 02:46:49.387235 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.387154 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp"
Apr 18 02:46:49.389896 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.389876 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 18 02:46:49.390019 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.389938 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-s97bt\""
Apr 18 02:46:49.390071 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.390022 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 18 02:46:49.391341 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.391325 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 18 02:46:49.391401 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.391383 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 18 02:46:49.404715 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.404688 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"]
Apr 18 02:46:49.404838 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.404801 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.407248 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.407231 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 18 02:46:49.407551 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.407534 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 18 02:46:49.407620 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.407559 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 18 02:46:49.407620 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.407534 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 18 02:46:49.512187 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.512144 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-hub\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.512388 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.512210 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") "
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.512388 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.512232 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqlmz\" (UniqueName: \"kubernetes.io/projected/b4022571-563d-4009-8aa3-b53018b1f875-kube-api-access-bqlmz\") pod \"managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp\" (UID: \"b4022571-563d-4009-8aa3-b53018b1f875\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp"
Apr 18 02:46:49.512388 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.512324 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.512388 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.512360 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw2vq\" (UniqueName: \"kubernetes.io/projected/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-kube-api-access-cw2vq\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.512548 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.512394 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-ca\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.512548 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.512410 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.512548 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.512428 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b4022571-563d-4009-8aa3-b53018b1f875-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp\" (UID: \"b4022571-563d-4009-8aa3-b53018b1f875\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp"
Apr 18 02:46:49.613573 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.613480 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-ca\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.613573 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.613518 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.613573 ip-10-0-129-229
kubenswrapper[2574]: I0418 02:46:49.613539 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b4022571-563d-4009-8aa3-b53018b1f875-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp\" (UID: \"b4022571-563d-4009-8aa3-b53018b1f875\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp"
Apr 18 02:46:49.613852 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.613674 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-hub\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.613852 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.613738 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.613852 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.613777 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqlmz\" (UniqueName: \"kubernetes.io/projected/b4022571-563d-4009-8aa3-b53018b1f875-kube-api-access-bqlmz\") pod \"managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp\" (UID: \"b4022571-563d-4009-8aa3-b53018b1f875\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp"
Apr 18 02:46:49.613989 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.613937 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.614037 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.613990 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cw2vq\" (UniqueName: \"kubernetes.io/projected/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-kube-api-access-cw2vq\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.614676 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.614647 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.617173 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.617149 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-ca\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.617272 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.617209 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-hub\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") "
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.617351 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.617335 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.617351 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.617340 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b4022571-563d-4009-8aa3-b53018b1f875-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp\" (UID: \"b4022571-563d-4009-8aa3-b53018b1f875\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp"
Apr 18 02:46:49.617422 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.617340 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.621062 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.621035 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqlmz\" (UniqueName: \"kubernetes.io/projected/b4022571-563d-4009-8aa3-b53018b1f875-kube-api-access-bqlmz\") pod \"managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp\" (UID: \"b4022571-563d-4009-8aa3-b53018b1f875\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp"
Apr 18 02:46:49.621169 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.621155 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw2vq\" (UniqueName: \"kubernetes.io/projected/fa6d6c53-f01d-4cd5-9500-77ed9a842e54-kube-api-access-cw2vq\") pod \"cluster-proxy-proxy-agent-55477c477-tsqd6\" (UID: \"fa6d6c53-f01d-4cd5-9500-77ed9a842e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.707105 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.707066 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp"
Apr 18 02:46:49.713975 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.713949 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
Apr 18 02:46:49.868182 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.868099 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"]
Apr 18 02:46:49.871207 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:46:49.871178 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa6d6c53_f01d_4cd5_9500_77ed9a842e54.slice/crio-d6944c965366c14d0760c07b98eea358798ab249a0e8807726a5a3889d2854d9 WatchSource:0}: Error finding container d6944c965366c14d0760c07b98eea358798ab249a0e8807726a5a3889d2854d9: Status 404 returned error can't find the container with id d6944c965366c14d0760c07b98eea358798ab249a0e8807726a5a3889d2854d9
Apr 18 02:46:49.881605 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.881581 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp"]
Apr 18 02:46:49.884460 ip-10-0-129-229 kubenswrapper[2574]:
W0418 02:46:49.884435 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4022571_563d_4009_8aa3_b53018b1f875.slice/crio-a346425a0bc835588349c65959417cc54e2d1f61a166108d246e59a6c96cbeca WatchSource:0}: Error finding container a346425a0bc835588349c65959417cc54e2d1f61a166108d246e59a6c96cbeca: Status 404 returned error can't find the container with id a346425a0bc835588349c65959417cc54e2d1f61a166108d246e59a6c96cbeca
Apr 18 02:46:49.916545 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.916511 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert\") pod \"ingress-canary-59c6n\" (UID: \"3be6e6c6-5134-4428-888c-4efe46336918\") " pod="openshift-ingress-canary/ingress-canary-59c6n"
Apr 18 02:46:49.916700 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:49.916565 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g"
Apr 18 02:46:49.916700 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:49.916650 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 18 02:46:49.916700 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:49.916654 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 18 02:46:49.916810 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:49.916701 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls podName:cf5032f8-0827-4cc5-8381-d39ca8db84ee nodeName:}" failed. No retries permitted until 2026-04-18 02:47:05.916687013 +0000 UTC m=+63.640240886 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls") pod "dns-default-jmr6g" (UID: "cf5032f8-0827-4cc5-8381-d39ca8db84ee") : secret "dns-default-metrics-tls" not found
Apr 18 02:46:49.916810 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:46:49.916714 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert podName:3be6e6c6-5134-4428-888c-4efe46336918 nodeName:}" failed. No retries permitted until 2026-04-18 02:47:05.916708818 +0000 UTC m=+63.640262691 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert") pod "ingress-canary-59c6n" (UID: "3be6e6c6-5134-4428-888c-4efe46336918") : secret "canary-serving-cert" not found
Apr 18 02:46:50.131319 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:50.131210 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6" event={"ID":"fa6d6c53-f01d-4cd5-9500-77ed9a842e54","Type":"ContainerStarted","Data":"d6944c965366c14d0760c07b98eea358798ab249a0e8807726a5a3889d2854d9"}
Apr 18 02:46:50.132149 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:50.132118 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp" event={"ID":"b4022571-563d-4009-8aa3-b53018b1f875","Type":"ContainerStarted","Data":"a346425a0bc835588349c65959417cc54e2d1f61a166108d246e59a6c96cbeca"}
Apr 18 02:46:54.142881 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:54.142842 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6"
event={"ID":"fa6d6c53-f01d-4cd5-9500-77ed9a842e54","Type":"ContainerStarted","Data":"ec04487dc7f0424a9f415f6cf3afb748a4be11493ee6843498b1eaabdbffa620"} Apr 18 02:46:54.143953 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:54.143931 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp" event={"ID":"b4022571-563d-4009-8aa3-b53018b1f875","Type":"ContainerStarted","Data":"87eb5b256f36e362a81bdd1c7721c55c500581dc9f185d8c6c48150379ce51db"} Apr 18 02:46:54.158417 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:54.158367 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp" podStartSLOduration=1.5315859509999998 podStartE2EDuration="5.158353779s" podCreationTimestamp="2026-04-18 02:46:49 +0000 UTC" firstStartedPulling="2026-04-18 02:46:49.886247604 +0000 UTC m=+47.609801477" lastFinishedPulling="2026-04-18 02:46:53.513015429 +0000 UTC m=+51.236569305" observedRunningTime="2026-04-18 02:46:54.157632069 +0000 UTC m=+51.881185964" watchObservedRunningTime="2026-04-18 02:46:54.158353779 +0000 UTC m=+51.881907673" Apr 18 02:46:56.148673 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:56.148589 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6" event={"ID":"fa6d6c53-f01d-4cd5-9500-77ed9a842e54","Type":"ContainerStarted","Data":"a4e435830195848828b6ee246e8e7544cc266b9cf0be9df8857aaa486fea7fc0"} Apr 18 02:46:56.148673 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:56.148627 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6" event={"ID":"fa6d6c53-f01d-4cd5-9500-77ed9a842e54","Type":"ContainerStarted","Data":"d12f7c172077f521716f6ace2951ff2139bcf7db9bdc969c39c4f0604fef26e0"} Apr 18 02:46:56.165225 
ip-10-0-129-229 kubenswrapper[2574]: I0418 02:46:56.165179 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6" podStartSLOduration=1.192450809 podStartE2EDuration="7.165165673s" podCreationTimestamp="2026-04-18 02:46:49 +0000 UTC" firstStartedPulling="2026-04-18 02:46:49.873169229 +0000 UTC m=+47.596723103" lastFinishedPulling="2026-04-18 02:46:55.845884091 +0000 UTC m=+53.569437967" observedRunningTime="2026-04-18 02:46:56.164939663 +0000 UTC m=+53.888493558" watchObservedRunningTime="2026-04-18 02:46:56.165165673 +0000 UTC m=+53.888719567" Apr 18 02:47:02.101801 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:02.101772 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k8fsv" Apr 18 02:47:05.932541 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:05.932487 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g" Apr 18 02:47:05.932954 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:05.932561 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert\") pod \"ingress-canary-59c6n\" (UID: \"3be6e6c6-5134-4428-888c-4efe46336918\") " pod="openshift-ingress-canary/ingress-canary-59c6n" Apr 18 02:47:05.932954 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:47:05.932655 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 18 02:47:05.932954 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:47:05.932656 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 18 02:47:05.932954 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:47:05.932705 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert podName:3be6e6c6-5134-4428-888c-4efe46336918 nodeName:}" failed. No retries permitted until 2026-04-18 02:47:37.932692687 +0000 UTC m=+95.656246560 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert") pod "ingress-canary-59c6n" (UID: "3be6e6c6-5134-4428-888c-4efe46336918") : secret "canary-serving-cert" not found Apr 18 02:47:05.932954 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:47:05.932720 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls podName:cf5032f8-0827-4cc5-8381-d39ca8db84ee nodeName:}" failed. No retries permitted until 2026-04-18 02:47:37.932712252 +0000 UTC m=+95.656266126 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls") pod "dns-default-jmr6g" (UID: "cf5032f8-0827-4cc5-8381-d39ca8db84ee") : secret "dns-default-metrics-tls" not found Apr 18 02:47:07.542531 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:07.542480 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs\") pod \"network-metrics-daemon-c6w8h\" (UID: \"eaf422fa-fd33-491a-b182-991116468c18\") " pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:47:07.545677 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:07.545658 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 18 02:47:07.552800 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:47:07.552777 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 18 02:47:07.552893 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:47:07.552847 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs podName:eaf422fa-fd33-491a-b182-991116468c18 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:11.552825122 +0000 UTC m=+129.276378994 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs") pod "network-metrics-daemon-c6w8h" (UID: "eaf422fa-fd33-491a-b182-991116468c18") : secret "metrics-daemon-secret" not found Apr 18 02:47:07.643520 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:07.643481 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ksh6\" (UniqueName: \"kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6\") pod \"network-check-target-bjkvs\" (UID: \"02ff8c43-768b-49b2-90a6-30f839c12ea3\") " pod="openshift-network-diagnostics/network-check-target-bjkvs" Apr 18 02:47:07.646144 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:07.646123 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 18 02:47:07.657148 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:07.657121 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 18 02:47:07.667665 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:07.667638 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ksh6\" (UniqueName: \"kubernetes.io/projected/02ff8c43-768b-49b2-90a6-30f839c12ea3-kube-api-access-9ksh6\") pod \"network-check-target-bjkvs\" (UID: \"02ff8c43-768b-49b2-90a6-30f839c12ea3\") " pod="openshift-network-diagnostics/network-check-target-bjkvs" Apr 18 02:47:07.698070 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:07.698036 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5nn98\"" Apr 18 02:47:07.706184 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:07.706160 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjkvs" Apr 18 02:47:07.843540 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:07.843507 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bjkvs"] Apr 18 02:47:07.847760 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:47:07.847733 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02ff8c43_768b_49b2_90a6_30f839c12ea3.slice/crio-cce7680be530b9570fd71e24724c5bbfacdbeac8a3e8e3f5d2172e49d4e2e8dd WatchSource:0}: Error finding container cce7680be530b9570fd71e24724c5bbfacdbeac8a3e8e3f5d2172e49d4e2e8dd: Status 404 returned error can't find the container with id cce7680be530b9570fd71e24724c5bbfacdbeac8a3e8e3f5d2172e49d4e2e8dd Apr 18 02:47:08.175388 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:08.175286 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bjkvs" event={"ID":"02ff8c43-768b-49b2-90a6-30f839c12ea3","Type":"ContainerStarted","Data":"cce7680be530b9570fd71e24724c5bbfacdbeac8a3e8e3f5d2172e49d4e2e8dd"} Apr 18 02:47:11.186332 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:11.186278 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bjkvs" event={"ID":"02ff8c43-768b-49b2-90a6-30f839c12ea3","Type":"ContainerStarted","Data":"a19904e8be0824d626019f2363d92319a38171e481fd078dff9baf1461c97afb"} Apr 18 02:47:11.186713 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:11.186459 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bjkvs" Apr 18 02:47:11.201653 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:11.201603 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bjkvs" 
podStartSLOduration=66.643140796 podStartE2EDuration="1m9.201589284s" podCreationTimestamp="2026-04-18 02:46:02 +0000 UTC" firstStartedPulling="2026-04-18 02:47:07.8494919 +0000 UTC m=+65.573045777" lastFinishedPulling="2026-04-18 02:47:10.407940379 +0000 UTC m=+68.131494265" observedRunningTime="2026-04-18 02:47:11.201126951 +0000 UTC m=+68.924680846" watchObservedRunningTime="2026-04-18 02:47:11.201589284 +0000 UTC m=+68.925143180" Apr 18 02:47:37.945900 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:37.945763 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert\") pod \"ingress-canary-59c6n\" (UID: \"3be6e6c6-5134-4428-888c-4efe46336918\") " pod="openshift-ingress-canary/ingress-canary-59c6n" Apr 18 02:47:37.945900 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:37.945858 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g" Apr 18 02:47:37.946472 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:47:37.945905 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 18 02:47:37.946472 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:47:37.945985 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert podName:3be6e6c6-5134-4428-888c-4efe46336918 nodeName:}" failed. No retries permitted until 2026-04-18 02:48:41.945966751 +0000 UTC m=+159.669520624 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert") pod "ingress-canary-59c6n" (UID: "3be6e6c6-5134-4428-888c-4efe46336918") : secret "canary-serving-cert" not found Apr 18 02:47:37.946472 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:47:37.946042 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 18 02:47:37.946472 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:47:37.946120 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls podName:cf5032f8-0827-4cc5-8381-d39ca8db84ee nodeName:}" failed. No retries permitted until 2026-04-18 02:48:41.946100348 +0000 UTC m=+159.669654221 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls") pod "dns-default-jmr6g" (UID: "cf5032f8-0827-4cc5-8381-d39ca8db84ee") : secret "dns-default-metrics-tls" not found Apr 18 02:47:42.191872 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:47:42.191841 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bjkvs" Apr 18 02:48:11.581621 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:11.581568 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs\") pod \"network-metrics-daemon-c6w8h\" (UID: \"eaf422fa-fd33-491a-b182-991116468c18\") " pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:48:11.582114 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:48:11.581713 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 18 02:48:11.582114 ip-10-0-129-229 kubenswrapper[2574]: E0418 
02:48:11.581789 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs podName:eaf422fa-fd33-491a-b182-991116468c18 nodeName:}" failed. No retries permitted until 2026-04-18 02:50:13.581771696 +0000 UTC m=+251.305325569 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs") pod "network-metrics-daemon-c6w8h" (UID: "eaf422fa-fd33-491a-b182-991116468c18") : secret "metrics-daemon-secret" not found Apr 18 02:48:18.979823 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:18.979793 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jrk2r_8976a474-462c-4893-ac54-7572b4e92f46/dns-node-resolver/0.log" Apr 18 02:48:19.780431 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:19.780403 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4wx6b_df7f443e-28b4-49bc-ad27-c0360b16827c/node-ca/0.log" Apr 18 02:48:37.112487 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:48:37.112435 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-jmr6g" podUID="cf5032f8-0827-4cc5-8381-d39ca8db84ee" Apr 18 02:48:37.119543 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:48:37.119510 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-59c6n" podUID="3be6e6c6-5134-4428-888c-4efe46336918" Apr 18 02:48:37.378865 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:37.378772 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-59c6n" Apr 18 02:48:37.379037 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:37.378778 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jmr6g" Apr 18 02:48:38.889600 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:48:38.889555 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-c6w8h" podUID="eaf422fa-fd33-491a-b182-991116468c18" Apr 18 02:48:42.003051 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:42.002942 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert\") pod \"ingress-canary-59c6n\" (UID: \"3be6e6c6-5134-4428-888c-4efe46336918\") " pod="openshift-ingress-canary/ingress-canary-59c6n" Apr 18 02:48:42.003051 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:42.003018 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g" Apr 18 02:48:42.005386 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:42.005359 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf5032f8-0827-4cc5-8381-d39ca8db84ee-metrics-tls\") pod \"dns-default-jmr6g\" (UID: \"cf5032f8-0827-4cc5-8381-d39ca8db84ee\") " pod="openshift-dns/dns-default-jmr6g" Apr 18 02:48:42.005517 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:42.005464 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/3be6e6c6-5134-4428-888c-4efe46336918-cert\") pod \"ingress-canary-59c6n\" (UID: \"3be6e6c6-5134-4428-888c-4efe46336918\") " pod="openshift-ingress-canary/ingress-canary-59c6n" Apr 18 02:48:42.182751 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:42.182718 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dwvkt\"" Apr 18 02:48:42.183871 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:42.183853 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-djfrv\"" Apr 18 02:48:42.189940 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:42.189917 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jmr6g" Apr 18 02:48:42.189940 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:42.189920 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-59c6n" Apr 18 02:48:42.310601 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:42.310570 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jmr6g"] Apr 18 02:48:42.313886 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:48:42.313841 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf5032f8_0827_4cc5_8381_d39ca8db84ee.slice/crio-556d0cade21baa256c5ac8db359c6f261db1c3661ae6ae40c253c0a0bedc7daf WatchSource:0}: Error finding container 556d0cade21baa256c5ac8db359c6f261db1c3661ae6ae40c253c0a0bedc7daf: Status 404 returned error can't find the container with id 556d0cade21baa256c5ac8db359c6f261db1c3661ae6ae40c253c0a0bedc7daf Apr 18 02:48:42.325181 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:42.325154 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-59c6n"] Apr 18 02:48:42.328193 ip-10-0-129-229 
kubenswrapper[2574]: W0418 02:48:42.328169 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3be6e6c6_5134_4428_888c_4efe46336918.slice/crio-93c8ae14be3dbfc04353df89bd2dfb0767a967cd98a0a2dbf593326e775d971e WatchSource:0}: Error finding container 93c8ae14be3dbfc04353df89bd2dfb0767a967cd98a0a2dbf593326e775d971e: Status 404 returned error can't find the container with id 93c8ae14be3dbfc04353df89bd2dfb0767a967cd98a0a2dbf593326e775d971e Apr 18 02:48:42.392291 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:42.392255 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jmr6g" event={"ID":"cf5032f8-0827-4cc5-8381-d39ca8db84ee","Type":"ContainerStarted","Data":"556d0cade21baa256c5ac8db359c6f261db1c3661ae6ae40c253c0a0bedc7daf"} Apr 18 02:48:42.393135 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:42.393113 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-59c6n" event={"ID":"3be6e6c6-5134-4428-888c-4efe46336918","Type":"ContainerStarted","Data":"93c8ae14be3dbfc04353df89bd2dfb0767a967cd98a0a2dbf593326e775d971e"} Apr 18 02:48:44.399713 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:44.399616 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jmr6g" event={"ID":"cf5032f8-0827-4cc5-8381-d39ca8db84ee","Type":"ContainerStarted","Data":"e5d63e2ddb909152959921355cdf31e0ca52609f6cef92177bcc9baf6eba2acd"} Apr 18 02:48:44.399713 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:44.399659 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jmr6g" event={"ID":"cf5032f8-0827-4cc5-8381-d39ca8db84ee","Type":"ContainerStarted","Data":"1b8205ab08a2df5dca27c117c8e0d55a2a6b03b7fe3bbb1fb8b9f106c6c3e266"} Apr 18 02:48:44.400172 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:44.399731 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="openshift-dns/dns-default-jmr6g" Apr 18 02:48:44.400969 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:44.400946 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-59c6n" event={"ID":"3be6e6c6-5134-4428-888c-4efe46336918","Type":"ContainerStarted","Data":"4756485dc208949bfe61f5d9bdcbccc4892a7f9e7f0092b2e8c13dfbcfeb02cd"} Apr 18 02:48:44.416504 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:44.416463 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jmr6g" podStartSLOduration=128.674808472 podStartE2EDuration="2m10.416452073s" podCreationTimestamp="2026-04-18 02:46:34 +0000 UTC" firstStartedPulling="2026-04-18 02:48:42.317803723 +0000 UTC m=+160.041357596" lastFinishedPulling="2026-04-18 02:48:44.059447324 +0000 UTC m=+161.783001197" observedRunningTime="2026-04-18 02:48:44.415187687 +0000 UTC m=+162.138741581" watchObservedRunningTime="2026-04-18 02:48:44.416452073 +0000 UTC m=+162.140005967" Apr 18 02:48:44.428996 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:44.428963 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-59c6n" podStartSLOduration=128.695749074 podStartE2EDuration="2m10.428952414s" podCreationTimestamp="2026-04-18 02:46:34 +0000 UTC" firstStartedPulling="2026-04-18 02:48:42.329839433 +0000 UTC m=+160.053393321" lastFinishedPulling="2026-04-18 02:48:44.063042771 +0000 UTC m=+161.786596661" observedRunningTime="2026-04-18 02:48:44.428417025 +0000 UTC m=+162.151970922" watchObservedRunningTime="2026-04-18 02:48:44.428952414 +0000 UTC m=+162.152506309" Apr 18 02:48:52.377227 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.377189 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-dgzpp"] Apr 18 02:48:52.380395 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.380377 2574 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dgzpp" Apr 18 02:48:52.385783 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.385761 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 18 02:48:52.387020 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.386984 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 18 02:48:52.387130 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.387019 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-d27jw\"" Apr 18 02:48:52.387130 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.387085 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 18 02:48:52.387246 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.387147 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 18 02:48:52.392450 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.392425 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dgzpp"] Apr 18 02:48:52.478048 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.478014 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2hkt\" (UniqueName: \"kubernetes.io/projected/0881f3b0-2ddd-4bb1-8847-3636e2d0c097-kube-api-access-r2hkt\") pod \"insights-runtime-extractor-dgzpp\" (UID: \"0881f3b0-2ddd-4bb1-8847-3636e2d0c097\") " pod="openshift-insights/insights-runtime-extractor-dgzpp" Apr 18 02:48:52.478048 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.478053 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0881f3b0-2ddd-4bb1-8847-3636e2d0c097-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dgzpp\" (UID: \"0881f3b0-2ddd-4bb1-8847-3636e2d0c097\") " pod="openshift-insights/insights-runtime-extractor-dgzpp" Apr 18 02:48:52.478261 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.478081 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0881f3b0-2ddd-4bb1-8847-3636e2d0c097-crio-socket\") pod \"insights-runtime-extractor-dgzpp\" (UID: \"0881f3b0-2ddd-4bb1-8847-3636e2d0c097\") " pod="openshift-insights/insights-runtime-extractor-dgzpp" Apr 18 02:48:52.478261 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.478103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0881f3b0-2ddd-4bb1-8847-3636e2d0c097-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dgzpp\" (UID: \"0881f3b0-2ddd-4bb1-8847-3636e2d0c097\") " pod="openshift-insights/insights-runtime-extractor-dgzpp" Apr 18 02:48:52.478261 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.478171 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0881f3b0-2ddd-4bb1-8847-3636e2d0c097-data-volume\") pod \"insights-runtime-extractor-dgzpp\" (UID: \"0881f3b0-2ddd-4bb1-8847-3636e2d0c097\") " pod="openshift-insights/insights-runtime-extractor-dgzpp" Apr 18 02:48:52.579253 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.579223 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2hkt\" (UniqueName: \"kubernetes.io/projected/0881f3b0-2ddd-4bb1-8847-3636e2d0c097-kube-api-access-r2hkt\") pod \"insights-runtime-extractor-dgzpp\" (UID: 
\"0881f3b0-2ddd-4bb1-8847-3636e2d0c097\") " pod="openshift-insights/insights-runtime-extractor-dgzpp"
Apr 18 02:48:52.579523 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.579262 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0881f3b0-2ddd-4bb1-8847-3636e2d0c097-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dgzpp\" (UID: \"0881f3b0-2ddd-4bb1-8847-3636e2d0c097\") " pod="openshift-insights/insights-runtime-extractor-dgzpp"
Apr 18 02:48:52.579523 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.579286 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0881f3b0-2ddd-4bb1-8847-3636e2d0c097-crio-socket\") pod \"insights-runtime-extractor-dgzpp\" (UID: \"0881f3b0-2ddd-4bb1-8847-3636e2d0c097\") " pod="openshift-insights/insights-runtime-extractor-dgzpp"
Apr 18 02:48:52.579523 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.579323 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0881f3b0-2ddd-4bb1-8847-3636e2d0c097-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dgzpp\" (UID: \"0881f3b0-2ddd-4bb1-8847-3636e2d0c097\") " pod="openshift-insights/insights-runtime-extractor-dgzpp"
Apr 18 02:48:52.579523 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.579344 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0881f3b0-2ddd-4bb1-8847-3636e2d0c097-data-volume\") pod \"insights-runtime-extractor-dgzpp\" (UID: \"0881f3b0-2ddd-4bb1-8847-3636e2d0c097\") " pod="openshift-insights/insights-runtime-extractor-dgzpp"
Apr 18 02:48:52.579523 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.579413 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0881f3b0-2ddd-4bb1-8847-3636e2d0c097-crio-socket\") pod \"insights-runtime-extractor-dgzpp\" (UID: \"0881f3b0-2ddd-4bb1-8847-3636e2d0c097\") " pod="openshift-insights/insights-runtime-extractor-dgzpp"
Apr 18 02:48:52.579709 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.579657 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0881f3b0-2ddd-4bb1-8847-3636e2d0c097-data-volume\") pod \"insights-runtime-extractor-dgzpp\" (UID: \"0881f3b0-2ddd-4bb1-8847-3636e2d0c097\") " pod="openshift-insights/insights-runtime-extractor-dgzpp"
Apr 18 02:48:52.579897 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.579877 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0881f3b0-2ddd-4bb1-8847-3636e2d0c097-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dgzpp\" (UID: \"0881f3b0-2ddd-4bb1-8847-3636e2d0c097\") " pod="openshift-insights/insights-runtime-extractor-dgzpp"
Apr 18 02:48:52.581676 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.581651 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0881f3b0-2ddd-4bb1-8847-3636e2d0c097-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dgzpp\" (UID: \"0881f3b0-2ddd-4bb1-8847-3636e2d0c097\") " pod="openshift-insights/insights-runtime-extractor-dgzpp"
Apr 18 02:48:52.592256 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.592228 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2hkt\" (UniqueName: \"kubernetes.io/projected/0881f3b0-2ddd-4bb1-8847-3636e2d0c097-kube-api-access-r2hkt\") pod \"insights-runtime-extractor-dgzpp\" (UID: \"0881f3b0-2ddd-4bb1-8847-3636e2d0c097\") " pod="openshift-insights/insights-runtime-extractor-dgzpp"
Apr 18 02:48:52.688487 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.688402 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dgzpp"
Apr 18 02:48:52.803592 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.803558 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dgzpp"]
Apr 18 02:48:52.806566 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:48:52.806538 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0881f3b0_2ddd_4bb1_8847_3636e2d0c097.slice/crio-ec924510103f05d5c4f78c3e0cf9abcbc89a295e1bb774f6d1946a9c2168633b WatchSource:0}: Error finding container ec924510103f05d5c4f78c3e0cf9abcbc89a295e1bb774f6d1946a9c2168633b: Status 404 returned error can't find the container with id ec924510103f05d5c4f78c3e0cf9abcbc89a295e1bb774f6d1946a9c2168633b
Apr 18 02:48:52.884023 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:52.883994 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h"
Apr 18 02:48:53.422542 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:53.422516 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dgzpp" event={"ID":"0881f3b0-2ddd-4bb1-8847-3636e2d0c097","Type":"ContainerStarted","Data":"a8af9092a6da694ef26a58b1bd387e91a492f7ca7c34cb00b4f58a24679c4118"}
Apr 18 02:48:53.422849 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:53.422549 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dgzpp" event={"ID":"0881f3b0-2ddd-4bb1-8847-3636e2d0c097","Type":"ContainerStarted","Data":"ec924510103f05d5c4f78c3e0cf9abcbc89a295e1bb774f6d1946a9c2168633b"}
Apr 18 02:48:54.405295 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:54.405260 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jmr6g"
Apr 18 02:48:54.427188 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:54.427138 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dgzpp" event={"ID":"0881f3b0-2ddd-4bb1-8847-3636e2d0c097","Type":"ContainerStarted","Data":"9002ee37092d48a985dd76ca939bc6d765ab2b1e647dad8e6c21ce160accb0c2"}
Apr 18 02:48:54.428817 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:54.428791 2574 generic.go:358] "Generic (PLEG): container finished" podID="b4022571-563d-4009-8aa3-b53018b1f875" containerID="87eb5b256f36e362a81bdd1c7721c55c500581dc9f185d8c6c48150379ce51db" exitCode=255
Apr 18 02:48:54.428961 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:54.428823 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp" event={"ID":"b4022571-563d-4009-8aa3-b53018b1f875","Type":"ContainerDied","Data":"87eb5b256f36e362a81bdd1c7721c55c500581dc9f185d8c6c48150379ce51db"}
Apr 18 02:48:54.429238 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:54.429204 2574 scope.go:117] "RemoveContainer" containerID="87eb5b256f36e362a81bdd1c7721c55c500581dc9f185d8c6c48150379ce51db"
Apr 18 02:48:55.432865 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:55.432831 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dgzpp" event={"ID":"0881f3b0-2ddd-4bb1-8847-3636e2d0c097","Type":"ContainerStarted","Data":"cb4ed3b9173a3e35ecc3afc3071a7a34bba1cbbbff2cf1c356319395e0b69931"}
Apr 18 02:48:55.434453 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:55.434432 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb6d5f7fc-x2brp" event={"ID":"b4022571-563d-4009-8aa3-b53018b1f875","Type":"ContainerStarted","Data":"4c56787d2709d5b1ce7e373d8d5e33f2e12181afd0759b75409776f9c9bbfc6e"}
Apr 18 02:48:55.450658 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:55.450615 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-dgzpp" podStartSLOduration=1.682197703 podStartE2EDuration="3.450602324s" podCreationTimestamp="2026-04-18 02:48:52 +0000 UTC" firstStartedPulling="2026-04-18 02:48:52.862490073 +0000 UTC m=+170.586043950" lastFinishedPulling="2026-04-18 02:48:54.630894683 +0000 UTC m=+172.354448571" observedRunningTime="2026-04-18 02:48:55.44897187 +0000 UTC m=+173.172525764" watchObservedRunningTime="2026-04-18 02:48:55.450602324 +0000 UTC m=+173.174156254"
Apr 18 02:48:58.447580 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.447546 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh"]
Apr 18 02:48:58.450717 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.450700 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh"
Apr 18 02:48:58.453751 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.453726 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 18 02:48:58.453961 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.453946 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 18 02:48:58.454960 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.454941 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 18 02:48:58.455079 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.455060 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 18 02:48:58.455120 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.455060 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 18 02:48:58.455186 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.455169 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-m7tcz\""
Apr 18 02:48:58.462964 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.462943 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh"]
Apr 18 02:48:58.481280 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.481249 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-x8zpl"]
Apr 18 02:48:58.484349 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.484331 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.486766 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.486743 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 18 02:48:58.486900 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.486885 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 18 02:48:58.486956 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.486899 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pzdgw\""
Apr 18 02:48:58.486998 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.486987 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 18 02:48:58.520443 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.520416 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10c0ceae-dfce-406f-879e-37341f5509ae-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-n2swh\" (UID: \"10c0ceae-dfce-406f-879e-37341f5509ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh"
Apr 18 02:48:58.520443 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.520446 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-textfile\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.520613 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.520472 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czgkf\" (UniqueName: \"kubernetes.io/projected/10c0ceae-dfce-406f-879e-37341f5509ae-kube-api-access-czgkf\") pod \"openshift-state-metrics-9d44df66c-n2swh\" (UID: \"10c0ceae-dfce-406f-879e-37341f5509ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh"
Apr 18 02:48:58.520613 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.520489 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f444b9d4-dc64-4f2e-af4b-e10735803a6c-root\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.520613 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.520506 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/10c0ceae-dfce-406f-879e-37341f5509ae-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-n2swh\" (UID: \"10c0ceae-dfce-406f-879e-37341f5509ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh"
Apr 18 02:48:58.520613 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.520538 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f444b9d4-dc64-4f2e-af4b-e10735803a6c-sys\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.520613 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.520552 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-wtmp\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.520613 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.520569 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.520793 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.520624 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/10c0ceae-dfce-406f-879e-37341f5509ae-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-n2swh\" (UID: \"10c0ceae-dfce-406f-879e-37341f5509ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh"
Apr 18 02:48:58.520793 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.520656 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f444b9d4-dc64-4f2e-af4b-e10735803a6c-metrics-client-ca\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.520793 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.520678 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-tls\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.520793 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.520710 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-accelerators-collector-config\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.520793 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.520775 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nq4w\" (UniqueName: \"kubernetes.io/projected/f444b9d4-dc64-4f2e-af4b-e10735803a6c-kube-api-access-9nq4w\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.621917 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.621878 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czgkf\" (UniqueName: \"kubernetes.io/projected/10c0ceae-dfce-406f-879e-37341f5509ae-kube-api-access-czgkf\") pod \"openshift-state-metrics-9d44df66c-n2swh\" (UID: \"10c0ceae-dfce-406f-879e-37341f5509ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh"
Apr 18 02:48:58.621917 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.621919 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f444b9d4-dc64-4f2e-af4b-e10735803a6c-root\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.622114 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.621939 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/10c0ceae-dfce-406f-879e-37341f5509ae-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-n2swh\" (UID: \"10c0ceae-dfce-406f-879e-37341f5509ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh"
Apr 18 02:48:58.622114 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.621998 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f444b9d4-dc64-4f2e-af4b-e10735803a6c-root\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.622114 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.622040 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f444b9d4-dc64-4f2e-af4b-e10735803a6c-sys\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.622114 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.622073 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-wtmp\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.622114 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.622102 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.622114 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.622109 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f444b9d4-dc64-4f2e-af4b-e10735803a6c-sys\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.622418 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.622158 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/10c0ceae-dfce-406f-879e-37341f5509ae-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-n2swh\" (UID: \"10c0ceae-dfce-406f-879e-37341f5509ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh"
Apr 18 02:48:58.622418 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.622193 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f444b9d4-dc64-4f2e-af4b-e10735803a6c-metrics-client-ca\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.622418 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.622196 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-wtmp\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.622418 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.622238 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-tls\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.622418 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.622290 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-accelerators-collector-config\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.622418 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.622357 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nq4w\" (UniqueName: \"kubernetes.io/projected/f444b9d4-dc64-4f2e-af4b-e10735803a6c-kube-api-access-9nq4w\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.622418 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:48:58.622394 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 18 02:48:58.622758 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:48:58.622475 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-tls podName:f444b9d4-dc64-4f2e-af4b-e10735803a6c nodeName:}" failed. No retries permitted until 2026-04-18 02:48:59.122452459 +0000 UTC m=+176.846006348 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-tls") pod "node-exporter-x8zpl" (UID: "f444b9d4-dc64-4f2e-af4b-e10735803a6c") : secret "node-exporter-tls" not found
Apr 18 02:48:58.622758 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.622396 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10c0ceae-dfce-406f-879e-37341f5509ae-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-n2swh\" (UID: \"10c0ceae-dfce-406f-879e-37341f5509ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh"
Apr 18 02:48:58.622758 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.622529 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-textfile\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.622944 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.622925 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f444b9d4-dc64-4f2e-af4b-e10735803a6c-metrics-client-ca\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.623052 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.623030 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-textfile\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.623170 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.623141 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10c0ceae-dfce-406f-879e-37341f5509ae-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-n2swh\" (UID: \"10c0ceae-dfce-406f-879e-37341f5509ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh"
Apr 18 02:48:58.623293 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.623275 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-accelerators-collector-config\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.624626 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.624604 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/10c0ceae-dfce-406f-879e-37341f5509ae-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-n2swh\" (UID: \"10c0ceae-dfce-406f-879e-37341f5509ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh"
Apr 18 02:48:58.624713 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.624640 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/10c0ceae-dfce-406f-879e-37341f5509ae-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-n2swh\" (UID: \"10c0ceae-dfce-406f-879e-37341f5509ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh"
Apr 18 02:48:58.624754 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.624726 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.632645 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.632622 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czgkf\" (UniqueName: \"kubernetes.io/projected/10c0ceae-dfce-406f-879e-37341f5509ae-kube-api-access-czgkf\") pod \"openshift-state-metrics-9d44df66c-n2swh\" (UID: \"10c0ceae-dfce-406f-879e-37341f5509ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh"
Apr 18 02:48:58.632734 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.632720 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nq4w\" (UniqueName: \"kubernetes.io/projected/f444b9d4-dc64-4f2e-af4b-e10735803a6c-kube-api-access-9nq4w\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:58.759543 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.759506 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh"
Apr 18 02:48:58.870867 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:58.870835 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh"]
Apr 18 02:48:58.873751 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:48:58.873724 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c0ceae_dfce_406f_879e_37341f5509ae.slice/crio-0542b570c0e549bfdc480fb7a2cc7b18ca7e2d5916cfdf175ca4d91e6ed54b4e WatchSource:0}: Error finding container 0542b570c0e549bfdc480fb7a2cc7b18ca7e2d5916cfdf175ca4d91e6ed54b4e: Status 404 returned error can't find the container with id 0542b570c0e549bfdc480fb7a2cc7b18ca7e2d5916cfdf175ca4d91e6ed54b4e
Apr 18 02:48:59.127224 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.127136 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-tls\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:59.129369 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.129344 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f444b9d4-dc64-4f2e-af4b-e10735803a6c-node-exporter-tls\") pod \"node-exporter-x8zpl\" (UID: \"f444b9d4-dc64-4f2e-af4b-e10735803a6c\") " pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:59.392686 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.392587 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-x8zpl"
Apr 18 02:48:59.400939 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:48:59.400919 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf444b9d4_dc64_4f2e_af4b_e10735803a6c.slice/crio-bb3ab8e49ed60e9f7a7dac5f7fdb8d538b4f0465da07ec31b6369cce34595ef6 WatchSource:0}: Error finding container bb3ab8e49ed60e9f7a7dac5f7fdb8d538b4f0465da07ec31b6369cce34595ef6: Status 404 returned error can't find the container with id bb3ab8e49ed60e9f7a7dac5f7fdb8d538b4f0465da07ec31b6369cce34595ef6
Apr 18 02:48:59.444341 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.444284 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x8zpl" event={"ID":"f444b9d4-dc64-4f2e-af4b-e10735803a6c","Type":"ContainerStarted","Data":"bb3ab8e49ed60e9f7a7dac5f7fdb8d538b4f0465da07ec31b6369cce34595ef6"}
Apr 18 02:48:59.445665 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.445637 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh" event={"ID":"10c0ceae-dfce-406f-879e-37341f5509ae","Type":"ContainerStarted","Data":"58734591de30a49689d5f6b7cd0cf5840e6bb8b8f27c95592f171c75aa742709"}
Apr 18 02:48:59.445665 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.445669 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh" event={"ID":"10c0ceae-dfce-406f-879e-37341f5509ae","Type":"ContainerStarted","Data":"580da55699a1c3aef57086de7d0c3d443d37c4d490f313188e305518c780f189"}
Apr 18 02:48:59.445820 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.445678 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh" event={"ID":"10c0ceae-dfce-406f-879e-37341f5509ae","Type":"ContainerStarted","Data":"0542b570c0e549bfdc480fb7a2cc7b18ca7e2d5916cfdf175ca4d91e6ed54b4e"}
Apr 18 02:48:59.523632 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.523605 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 18 02:48:59.528020 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.528002 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:48:59.530587 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.530566 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 18 02:48:59.530793 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.530695 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 18 02:48:59.530793 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.530695 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 18 02:48:59.530793 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.530739 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 18 02:48:59.531010 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.530850 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 18 02:48:59.531127 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.531111 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 18 02:48:59.531216 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.531189 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 18 02:48:59.531270 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.531235 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 18 02:48:59.531384 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.531369 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-dl4dx\""
Apr 18 02:48:59.531458 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.531429 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 18 02:48:59.537624 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.537605 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 18 02:48:59.630828 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.630793 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:48:59.630994 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.630841 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:48:59.630994 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.630866 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:48:59.630994 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.630966 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:48:59.631182 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.631035 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-web-config\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:48:59.631182 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.631114 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:48:59.631182 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.631147 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:48:59.631182 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.631176 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:48:59.631410 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.631204 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:48:59.631410 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.631233 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:48:59.631410 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.631256 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27tjh\" (UniqueName: \"kubernetes.io/projected/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-kube-api-access-27tjh\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:48:59.631410 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.631316 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-config-volume\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.631410 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.631350 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-config-out\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.732021 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.731989 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-config-volume\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.732194 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.732035 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-config-out\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.732194 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.732087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.732194 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.732113 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.732194 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.732133 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.732194 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.732158 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.732466 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.732202 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-web-config\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.732466 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.732243 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.732466 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.732267 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.732466 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.732288 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.732466 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.732326 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.732466 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.732346 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.732466 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.732383 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27tjh\" (UniqueName: 
\"kubernetes.io/projected/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-kube-api-access-27tjh\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.733466 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.733189 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.733607 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.733503 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.733676 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.733656 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.735293 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.735258 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-config-volume\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.735498 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.735458 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.735779 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.735756 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-web-config\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.736874 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.736654 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.736874 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.736832 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.737907 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.737478 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-config-out\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.738204 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.738061 2574 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.738359 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.738338 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.738624 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.738605 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.745056 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.745032 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27tjh\" (UniqueName: \"kubernetes.io/projected/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-kube-api-access-27tjh\") pod \"alertmanager-main-0\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:48:59.836708 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:48:59.836679 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:49:00.362236 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.362213 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 18 02:49:00.365096 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:49:00.365068 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f5186b7_6ddb_49b0_82f9_e55360f1c0e2.slice/crio-b051c9f0ef23d90d1160e29f296831b522f72515e196aaeadeb8d45749fb64f7 WatchSource:0}: Error finding container b051c9f0ef23d90d1160e29f296831b522f72515e196aaeadeb8d45749fb64f7: Status 404 returned error can't find the container with id b051c9f0ef23d90d1160e29f296831b522f72515e196aaeadeb8d45749fb64f7 Apr 18 02:49:00.447811 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.447777 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-694d7b6dc8-42f9f"] Apr 18 02:49:00.451261 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.451233 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh" event={"ID":"10c0ceae-dfce-406f-879e-37341f5509ae","Type":"ContainerStarted","Data":"06e3c5363fb8a8b30ce601a7ed100cd05002005600488a99aaabba57deb9da59"} Apr 18 02:49:00.451261 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.451260 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2","Type":"ContainerStarted","Data":"b051c9f0ef23d90d1160e29f296831b522f72515e196aaeadeb8d45749fb64f7"} Apr 18 02:49:00.451467 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.451447 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.452135 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.452108 2574 generic.go:358] "Generic (PLEG): container finished" podID="f444b9d4-dc64-4f2e-af4b-e10735803a6c" containerID="d18096eed01de67ca9acb556db88461cfab662347d2df21d7ee490169d810f5d" exitCode=0 Apr 18 02:49:00.452219 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.452166 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x8zpl" event={"ID":"f444b9d4-dc64-4f2e-af4b-e10735803a6c","Type":"ContainerDied","Data":"d18096eed01de67ca9acb556db88461cfab662347d2df21d7ee490169d810f5d"} Apr 18 02:49:00.456339 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.456291 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 18 02:49:00.456431 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.456294 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 18 02:49:00.456574 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.456561 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 18 02:49:00.456760 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.456746 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 18 02:49:00.457380 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.457362 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-c85v0qgupl6cv\"" Apr 18 02:49:00.457464 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.457387 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 18 02:49:00.457838 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.457824 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-wznvz\"" Apr 18 02:49:00.482961 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.482901 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-694d7b6dc8-42f9f"] Apr 18 02:49:00.492484 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.492445 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n2swh" podStartSLOduration=1.2561951040000001 podStartE2EDuration="2.492432872s" podCreationTimestamp="2026-04-18 02:48:58 +0000 UTC" firstStartedPulling="2026-04-18 02:48:58.999381925 +0000 UTC m=+176.722935802" lastFinishedPulling="2026-04-18 02:49:00.23561968 +0000 UTC m=+177.959173570" observedRunningTime="2026-04-18 02:49:00.491571963 +0000 UTC m=+178.215125883" watchObservedRunningTime="2026-04-18 02:49:00.492432872 +0000 UTC m=+178.215986766" Apr 18 02:49:00.539355 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.539330 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.539693 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.539403 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.539693 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.539434 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ad73780-435a-4c42-a33f-a16127970a0b-metrics-client-ca\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.539693 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.539475 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-thanos-querier-tls\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.539693 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.539533 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-grpc-tls\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.539693 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.539574 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-694d7b6dc8-42f9f\" 
(UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.539693 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.539628 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.539693 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.539677 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj57n\" (UniqueName: \"kubernetes.io/projected/1ad73780-435a-4c42-a33f-a16127970a0b-kube-api-access-mj57n\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.640244 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.640213 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.640420 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.640259 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " 
pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.640420 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.640285 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ad73780-435a-4c42-a33f-a16127970a0b-metrics-client-ca\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.640420 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.640338 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-thanos-querier-tls\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.640420 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.640359 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-grpc-tls\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.640420 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.640380 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.640672 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.640467 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.640672 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.640525 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mj57n\" (UniqueName: \"kubernetes.io/projected/1ad73780-435a-4c42-a33f-a16127970a0b-kube-api-access-mj57n\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.641126 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.641077 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ad73780-435a-4c42-a33f-a16127970a0b-metrics-client-ca\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.642945 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.642907 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.643068 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.642973 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-thanos-querier-tls\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: 
\"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.643258 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.643236 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.643556 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.643536 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.643637 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.643616 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-grpc-tls\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.643696 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.643680 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1ad73780-435a-4c42-a33f-a16127970a0b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.647411 ip-10-0-129-229 kubenswrapper[2574]: I0418 
02:49:00.647388 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj57n\" (UniqueName: \"kubernetes.io/projected/1ad73780-435a-4c42-a33f-a16127970a0b-kube-api-access-mj57n\") pod \"thanos-querier-694d7b6dc8-42f9f\" (UID: \"1ad73780-435a-4c42-a33f-a16127970a0b\") " pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.803370 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.803261 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:00.961754 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:00.961720 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-694d7b6dc8-42f9f"] Apr 18 02:49:00.965722 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:49:00.965674 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ad73780_435a_4c42_a33f_a16127970a0b.slice/crio-ed9483ff6cdd8ca93228e28c0a5b0705131c3d48130b7c9d8b27ae5d458a7a2f WatchSource:0}: Error finding container ed9483ff6cdd8ca93228e28c0a5b0705131c3d48130b7c9d8b27ae5d458a7a2f: Status 404 returned error can't find the container with id ed9483ff6cdd8ca93228e28c0a5b0705131c3d48130b7c9d8b27ae5d458a7a2f Apr 18 02:49:01.457497 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:01.457444 2574 generic.go:358] "Generic (PLEG): container finished" podID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerID="73b6bab90cccb588a659f83bc209a06ce148f6b1b7e4fcc0b53937293b93cd96" exitCode=0 Apr 18 02:49:01.457685 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:01.457526 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2","Type":"ContainerDied","Data":"73b6bab90cccb588a659f83bc209a06ce148f6b1b7e4fcc0b53937293b93cd96"} Apr 18 02:49:01.459673 ip-10-0-129-229 
kubenswrapper[2574]: I0418 02:49:01.459646 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" event={"ID":"1ad73780-435a-4c42-a33f-a16127970a0b","Type":"ContainerStarted","Data":"ed9483ff6cdd8ca93228e28c0a5b0705131c3d48130b7c9d8b27ae5d458a7a2f"} Apr 18 02:49:01.461947 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:01.461913 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x8zpl" event={"ID":"f444b9d4-dc64-4f2e-af4b-e10735803a6c","Type":"ContainerStarted","Data":"f839a16a1d68854c95f251a1fe7229b21bb478786fc6669d5a20699e6d88d329"} Apr 18 02:49:01.461947 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:01.461947 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x8zpl" event={"ID":"f444b9d4-dc64-4f2e-af4b-e10735803a6c","Type":"ContainerStarted","Data":"e64b3de070ab933b4052d6ea773e983daf75d2ebad815503f1cbc1206b1306f9"} Apr 18 02:49:01.500720 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:01.500625 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-x8zpl" podStartSLOduration=2.66638272 podStartE2EDuration="3.500609607s" podCreationTimestamp="2026-04-18 02:48:58 +0000 UTC" firstStartedPulling="2026-04-18 02:48:59.402403932 +0000 UTC m=+177.125957805" lastFinishedPulling="2026-04-18 02:49:00.236630805 +0000 UTC m=+177.960184692" observedRunningTime="2026-04-18 02:49:01.500479288 +0000 UTC m=+179.224033185" watchObservedRunningTime="2026-04-18 02:49:01.500609607 +0000 UTC m=+179.224163503" Apr 18 02:49:03.470163 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:03.470126 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2","Type":"ContainerStarted","Data":"6b25f36718a6c203162e5f1c2fc2b3bf1dbfa8b30e62f4f9c8ac9e865032133c"} Apr 18 02:49:03.470163 ip-10-0-129-229 
kubenswrapper[2574]: I0418 02:49:03.470169 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2","Type":"ContainerStarted","Data":"37fcd24619d108339690948bf237d51dbf00fe90388a5771a136339a77fdb2d8"} Apr 18 02:49:03.470599 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:03.470182 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2","Type":"ContainerStarted","Data":"f717401a587eb78905092f6bdc17599cb619c96b425e462536efee438f64f801"} Apr 18 02:49:03.470599 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:03.470192 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2","Type":"ContainerStarted","Data":"8ea81b9d1c8a8adc2408878dff4f227b828258df8b467a6b48f4b699caac970d"} Apr 18 02:49:03.470599 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:03.470200 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2","Type":"ContainerStarted","Data":"a2b60f59924afb081d86d2286d22b77a01c0fe1fc321793cdca4e5e86d90cd6a"} Apr 18 02:49:03.471844 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:03.471819 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" event={"ID":"1ad73780-435a-4c42-a33f-a16127970a0b","Type":"ContainerStarted","Data":"6bd2ec8280b42f544b2e9fb8e683c774d26e3c43d762927e9dcf7b3e1ef3122f"} Apr 18 02:49:03.471911 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:03.471850 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" 
event={"ID":"1ad73780-435a-4c42-a33f-a16127970a0b","Type":"ContainerStarted","Data":"c8d662c9bfa46883899130d8919a4fd6697424b5c56864cb566c3e7f9c104b9e"} Apr 18 02:49:03.471911 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:03.471859 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" event={"ID":"1ad73780-435a-4c42-a33f-a16127970a0b","Type":"ContainerStarted","Data":"a0a2f7303654cb5f07d670e246c6fb2b1c2104635e1dc687b87a6c7f7a5a4313"} Apr 18 02:49:04.477076 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:04.477043 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2","Type":"ContainerStarted","Data":"3ba81b428eef68ef8a47eeba4757be60e6c621bdf4a2f908452acd311f47b1ef"} Apr 18 02:49:04.479404 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:04.479373 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" event={"ID":"1ad73780-435a-4c42-a33f-a16127970a0b","Type":"ContainerStarted","Data":"ba9eb0794d3e376d9d2a4cea9b4428d1d02f6fbfb5f97222600f45e5383cf51b"} Apr 18 02:49:04.479533 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:04.479409 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" event={"ID":"1ad73780-435a-4c42-a33f-a16127970a0b","Type":"ContainerStarted","Data":"fa9e8e544c3f6a9587b8711f11688ff715fb7945046e91c67f377508c65eb4a3"} Apr 18 02:49:04.479533 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:04.479423 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" event={"ID":"1ad73780-435a-4c42-a33f-a16127970a0b","Type":"ContainerStarted","Data":"7ea52d3115d7dd15f90765e71b46399c5919aefe4b228146a7536285ec67fab3"} Apr 18 02:49:04.479598 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:04.479533 2574 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:04.502854 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:04.502807 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.855219731 podStartE2EDuration="5.50279219s" podCreationTimestamp="2026-04-18 02:48:59 +0000 UTC" firstStartedPulling="2026-04-18 02:49:00.367044456 +0000 UTC m=+178.090598333" lastFinishedPulling="2026-04-18 02:49:04.014616919 +0000 UTC m=+181.738170792" observedRunningTime="2026-04-18 02:49:04.50082116 +0000 UTC m=+182.224375054" watchObservedRunningTime="2026-04-18 02:49:04.50279219 +0000 UTC m=+182.226346084" Apr 18 02:49:04.522038 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:04.521988 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" podStartSLOduration=1.476513453 podStartE2EDuration="4.521971626s" podCreationTimestamp="2026-04-18 02:49:00 +0000 UTC" firstStartedPulling="2026-04-18 02:49:00.967792484 +0000 UTC m=+178.691346357" lastFinishedPulling="2026-04-18 02:49:04.013250657 +0000 UTC m=+181.736804530" observedRunningTime="2026-04-18 02:49:04.520820696 +0000 UTC m=+182.244374592" watchObservedRunningTime="2026-04-18 02:49:04.521971626 +0000 UTC m=+182.245525522" Apr 18 02:49:07.227663 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:07.227630 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-768kk"] Apr 18 02:49:07.230937 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:07.230918 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-768kk" Apr 18 02:49:07.233423 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:07.233395 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 18 02:49:07.233556 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:07.233449 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 18 02:49:07.233556 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:07.233543 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-hhhpf\"" Apr 18 02:49:07.237566 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:07.237545 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-768kk"] Apr 18 02:49:07.295800 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:07.295767 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp7nw\" (UniqueName: \"kubernetes.io/projected/3ac514cd-745b-418c-99dc-a1392887694c-kube-api-access-dp7nw\") pod \"downloads-6bcc868b7-768kk\" (UID: \"3ac514cd-745b-418c-99dc-a1392887694c\") " pod="openshift-console/downloads-6bcc868b7-768kk" Apr 18 02:49:07.396938 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:07.396903 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dp7nw\" (UniqueName: \"kubernetes.io/projected/3ac514cd-745b-418c-99dc-a1392887694c-kube-api-access-dp7nw\") pod \"downloads-6bcc868b7-768kk\" (UID: \"3ac514cd-745b-418c-99dc-a1392887694c\") " pod="openshift-console/downloads-6bcc868b7-768kk" Apr 18 02:49:07.411212 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:07.411180 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp7nw\" (UniqueName: 
\"kubernetes.io/projected/3ac514cd-745b-418c-99dc-a1392887694c-kube-api-access-dp7nw\") pod \"downloads-6bcc868b7-768kk\" (UID: \"3ac514cd-745b-418c-99dc-a1392887694c\") " pod="openshift-console/downloads-6bcc868b7-768kk" Apr 18 02:49:07.540723 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:07.540637 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-768kk" Apr 18 02:49:07.662020 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:07.661995 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-768kk"] Apr 18 02:49:07.664271 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:49:07.664245 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ac514cd_745b_418c_99dc_a1392887694c.slice/crio-357f22e613b9badbaee73c32e0ed18bc4b19148eb4803684a0fb427a5504f179 WatchSource:0}: Error finding container 357f22e613b9badbaee73c32e0ed18bc4b19148eb4803684a0fb427a5504f179: Status 404 returned error can't find the container with id 357f22e613b9badbaee73c32e0ed18bc4b19148eb4803684a0fb427a5504f179 Apr 18 02:49:08.493242 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:08.493195 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-768kk" event={"ID":"3ac514cd-745b-418c-99dc-a1392887694c","Type":"ContainerStarted","Data":"357f22e613b9badbaee73c32e0ed18bc4b19148eb4803684a0fb427a5504f179"} Apr 18 02:49:10.489565 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:10.489536 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-694d7b6dc8-42f9f" Apr 18 02:49:23.539373 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:23.539332 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-768kk" 
event={"ID":"3ac514cd-745b-418c-99dc-a1392887694c","Type":"ContainerStarted","Data":"ebb25679a28350d19bedf86268e84f09edad4249174da1ee3281f97a740ebc3c"} Apr 18 02:49:23.539874 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:23.539613 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-768kk" Apr 18 02:49:23.553326 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:23.553282 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-768kk" Apr 18 02:49:23.571314 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:23.571240 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-768kk" podStartSLOduration=1.484024281 podStartE2EDuration="16.571223537s" podCreationTimestamp="2026-04-18 02:49:07 +0000 UTC" firstStartedPulling="2026-04-18 02:49:07.666206855 +0000 UTC m=+185.389760729" lastFinishedPulling="2026-04-18 02:49:22.753406099 +0000 UTC m=+200.476959985" observedRunningTime="2026-04-18 02:49:23.555831642 +0000 UTC m=+201.279385536" watchObservedRunningTime="2026-04-18 02:49:23.571223537 +0000 UTC m=+201.294777433" Apr 18 02:49:29.715273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:29.715204 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6" podUID="fa6d6c53-f01d-4cd5-9500-77ed9a842e54" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 18 02:49:39.715052 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:39.715012 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6" podUID="fa6d6c53-f01d-4cd5-9500-77ed9a842e54" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 18 02:49:49.715587 ip-10-0-129-229 
kubenswrapper[2574]: I0418 02:49:49.715539 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6" podUID="fa6d6c53-f01d-4cd5-9500-77ed9a842e54" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 18 02:49:49.716159 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:49.715631 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6" Apr 18 02:49:49.716270 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:49.716228 2574 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"a4e435830195848828b6ee246e8e7544cc266b9cf0be9df8857aaa486fea7fc0"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 18 02:49:49.716350 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:49.716296 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6" podUID="fa6d6c53-f01d-4cd5-9500-77ed9a842e54" containerName="service-proxy" containerID="cri-o://a4e435830195848828b6ee246e8e7544cc266b9cf0be9df8857aaa486fea7fc0" gracePeriod=30 Apr 18 02:49:50.617808 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:50.617771 2574 generic.go:358] "Generic (PLEG): container finished" podID="fa6d6c53-f01d-4cd5-9500-77ed9a842e54" containerID="a4e435830195848828b6ee246e8e7544cc266b9cf0be9df8857aaa486fea7fc0" exitCode=2 Apr 18 02:49:50.617978 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:50.617821 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6" 
event={"ID":"fa6d6c53-f01d-4cd5-9500-77ed9a842e54","Type":"ContainerDied","Data":"a4e435830195848828b6ee246e8e7544cc266b9cf0be9df8857aaa486fea7fc0"} Apr 18 02:49:50.617978 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:49:50.617855 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55477c477-tsqd6" event={"ID":"fa6d6c53-f01d-4cd5-9500-77ed9a842e54","Type":"ContainerStarted","Data":"867439f63c6df77801481e40204460431424f8c843eec31bb934e37651485ca1"} Apr 18 02:50:13.668907 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:13.668804 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs\") pod \"network-metrics-daemon-c6w8h\" (UID: \"eaf422fa-fd33-491a-b182-991116468c18\") " pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:50:13.671049 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:13.671026 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf422fa-fd33-491a-b182-991116468c18-metrics-certs\") pod \"network-metrics-daemon-c6w8h\" (UID: \"eaf422fa-fd33-491a-b182-991116468c18\") " pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:50:13.887004 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:13.886971 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h2hff\"" Apr 18 02:50:13.895008 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:13.894988 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6w8h" Apr 18 02:50:14.013070 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:14.012984 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c6w8h"] Apr 18 02:50:14.017706 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:50:14.017537 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaf422fa_fd33_491a_b182_991116468c18.slice/crio-e1ba883515399da4153b7a089698abc9630a8b12384f01a1ec24e8caf9314f8c WatchSource:0}: Error finding container e1ba883515399da4153b7a089698abc9630a8b12384f01a1ec24e8caf9314f8c: Status 404 returned error can't find the container with id e1ba883515399da4153b7a089698abc9630a8b12384f01a1ec24e8caf9314f8c Apr 18 02:50:14.680738 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:14.680708 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c6w8h" event={"ID":"eaf422fa-fd33-491a-b182-991116468c18","Type":"ContainerStarted","Data":"e1ba883515399da4153b7a089698abc9630a8b12384f01a1ec24e8caf9314f8c"} Apr 18 02:50:15.686343 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:15.686288 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c6w8h" event={"ID":"eaf422fa-fd33-491a-b182-991116468c18","Type":"ContainerStarted","Data":"eb11d63ef3398805387abca24f67b2dfda882af7c5bd1ed9db8a78b7e55c3dd5"} Apr 18 02:50:15.686343 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:15.686345 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c6w8h" event={"ID":"eaf422fa-fd33-491a-b182-991116468c18","Type":"ContainerStarted","Data":"e7ea922ad9726a7f99b93907939eb0f92e32c63bc08381e37133211929484487"} Apr 18 02:50:15.700985 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:15.700935 2574 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-c6w8h" podStartSLOduration=252.749625447 podStartE2EDuration="4m13.700920904s" podCreationTimestamp="2026-04-18 02:46:02 +0000 UTC" firstStartedPulling="2026-04-18 02:50:14.019067206 +0000 UTC m=+251.742621080" lastFinishedPulling="2026-04-18 02:50:14.970362662 +0000 UTC m=+252.693916537" observedRunningTime="2026-04-18 02:50:15.699805643 +0000 UTC m=+253.423359538" watchObservedRunningTime="2026-04-18 02:50:15.700920904 +0000 UTC m=+253.424474799" Apr 18 02:50:18.717578 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:18.717543 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 18 02:50:18.717989 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:18.717952 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="alertmanager" containerID="cri-o://a2b60f59924afb081d86d2286d22b77a01c0fe1fc321793cdca4e5e86d90cd6a" gracePeriod=120 Apr 18 02:50:18.717989 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:18.717964 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="kube-rbac-proxy-metric" containerID="cri-o://6b25f36718a6c203162e5f1c2fc2b3bf1dbfa8b30e62f4f9c8ac9e865032133c" gracePeriod=120 Apr 18 02:50:18.718108 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:18.717997 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="kube-rbac-proxy-web" containerID="cri-o://f717401a587eb78905092f6bdc17599cb619c96b425e462536efee438f64f801" gracePeriod=120 Apr 18 02:50:18.718108 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:18.718027 2574 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="config-reloader" containerID="cri-o://8ea81b9d1c8a8adc2408878dff4f227b828258df8b467a6b48f4b699caac970d" gracePeriod=120 Apr 18 02:50:18.718108 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:18.718069 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="prom-label-proxy" containerID="cri-o://3ba81b428eef68ef8a47eeba4757be60e6c621bdf4a2f908452acd311f47b1ef" gracePeriod=120 Apr 18 02:50:18.718249 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:18.718052 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="kube-rbac-proxy" containerID="cri-o://37fcd24619d108339690948bf237d51dbf00fe90388a5771a136339a77fdb2d8" gracePeriod=120 Apr 18 02:50:19.700623 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:19.700592 2574 generic.go:358] "Generic (PLEG): container finished" podID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerID="3ba81b428eef68ef8a47eeba4757be60e6c621bdf4a2f908452acd311f47b1ef" exitCode=0 Apr 18 02:50:19.700623 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:19.700618 2574 generic.go:358] "Generic (PLEG): container finished" podID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerID="6b25f36718a6c203162e5f1c2fc2b3bf1dbfa8b30e62f4f9c8ac9e865032133c" exitCode=0 Apr 18 02:50:19.700623 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:19.700624 2574 generic.go:358] "Generic (PLEG): container finished" podID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerID="37fcd24619d108339690948bf237d51dbf00fe90388a5771a136339a77fdb2d8" exitCode=0 Apr 18 02:50:19.700623 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:19.700630 2574 generic.go:358] "Generic (PLEG): container finished" 
podID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerID="8ea81b9d1c8a8adc2408878dff4f227b828258df8b467a6b48f4b699caac970d" exitCode=0 Apr 18 02:50:19.700623 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:19.700635 2574 generic.go:358] "Generic (PLEG): container finished" podID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerID="a2b60f59924afb081d86d2286d22b77a01c0fe1fc321793cdca4e5e86d90cd6a" exitCode=0 Apr 18 02:50:19.700986 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:19.700659 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2","Type":"ContainerDied","Data":"3ba81b428eef68ef8a47eeba4757be60e6c621bdf4a2f908452acd311f47b1ef"} Apr 18 02:50:19.700986 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:19.700693 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2","Type":"ContainerDied","Data":"6b25f36718a6c203162e5f1c2fc2b3bf1dbfa8b30e62f4f9c8ac9e865032133c"} Apr 18 02:50:19.700986 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:19.700703 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2","Type":"ContainerDied","Data":"37fcd24619d108339690948bf237d51dbf00fe90388a5771a136339a77fdb2d8"} Apr 18 02:50:19.700986 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:19.700712 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2","Type":"ContainerDied","Data":"8ea81b9d1c8a8adc2408878dff4f227b828258df8b467a6b48f4b699caac970d"} Apr 18 02:50:19.700986 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:19.700723 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2","Type":"ContainerDied","Data":"a2b60f59924afb081d86d2286d22b77a01c0fe1fc321793cdca4e5e86d90cd6a"} Apr 18 02:50:19.963535 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:19.963509 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:20.008096 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.008064 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-cluster-tls-config\") pod \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " Apr 18 02:50:20.008247 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.008102 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-main-tls\") pod \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " Apr 18 02:50:20.008247 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.008122 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " Apr 18 02:50:20.008247 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.008152 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-web-config\") pod \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " Apr 18 02:50:20.008247 ip-10-0-129-229 kubenswrapper[2574]: I0418 
02:50:20.008169 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-alertmanager-main-db\") pod \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " Apr 18 02:50:20.008247 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.008191 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy-web\") pod \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " Apr 18 02:50:20.008247 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.008206 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-config-volume\") pod \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " Apr 18 02:50:20.008247 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.008240 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-tls-assets\") pod \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " Apr 18 02:50:20.008658 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.008268 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-config-out\") pod \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") " Apr 18 02:50:20.008658 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.008293 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-metrics-client-ca\") pod \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") "
Apr 18 02:50:20.008658 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.008335 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27tjh\" (UniqueName: \"kubernetes.io/projected/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-kube-api-access-27tjh\") pod \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") "
Apr 18 02:50:20.008658 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.008369 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy\") pod \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") "
Apr 18 02:50:20.009445 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.009414 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" (UID: "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 18 02:50:20.010042 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.009991 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" (UID: "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:50:20.011209 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.011161 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" (UID: "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 18 02:50:20.011641 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.011601 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-config-volume" (OuterVolumeSpecName: "config-volume") pod "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" (UID: "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 18 02:50:20.011818 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.011782 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" (UID: "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 18 02:50:20.012374 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.012336 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" (UID: "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 18 02:50:20.013107 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.013071 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-kube-api-access-27tjh" (OuterVolumeSpecName: "kube-api-access-27tjh") pod "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" (UID: "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2"). InnerVolumeSpecName "kube-api-access-27tjh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 18 02:50:20.013421 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.013277 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" (UID: "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 18 02:50:20.013421 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.013382 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" (UID: "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 18 02:50:20.013581 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.013544 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-config-out" (OuterVolumeSpecName: "config-out") pod "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" (UID: "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:50:20.015795 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.015767 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" (UID: "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 18 02:50:20.021805 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.021665 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-web-config" (OuterVolumeSpecName: "web-config") pod "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" (UID: "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 18 02:50:20.109191 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.109161 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-alertmanager-trusted-ca-bundle\") pod \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\" (UID: \"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2\") "
Apr 18 02:50:20.109389 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.109358 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-web-config\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:50:20.109389 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.109377 2574 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-alertmanager-main-db\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:50:20.109508 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.109393 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:50:20.109508 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.109408 2574 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-config-volume\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:50:20.109508 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.109423 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-tls-assets\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:50:20.109508 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.109435 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-config-out\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:50:20.109508 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.109448 2574 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-metrics-client-ca\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:50:20.109508 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.109489 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-27tjh\" (UniqueName: \"kubernetes.io/projected/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-kube-api-access-27tjh\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:50:20.109508 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.109493 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" (UID: "3f5186b7-6ddb-49b0-82f9-e55360f1c0e2"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 18 02:50:20.109740 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.109504 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:50:20.109740 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.109536 2574 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-cluster-tls-config\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:50:20.109740 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.109546 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-main-tls\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:50:20.109740 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.109557 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:50:20.210282 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.210198 2574 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:50:20.705374 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.705342 2574 generic.go:358] "Generic (PLEG): container finished" podID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerID="f717401a587eb78905092f6bdc17599cb619c96b425e462536efee438f64f801" exitCode=0
Apr 18 02:50:20.705555 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.705388 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2","Type":"ContainerDied","Data":"f717401a587eb78905092f6bdc17599cb619c96b425e462536efee438f64f801"}
Apr 18 02:50:20.705555 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.705411 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3f5186b7-6ddb-49b0-82f9-e55360f1c0e2","Type":"ContainerDied","Data":"b051c9f0ef23d90d1160e29f296831b522f72515e196aaeadeb8d45749fb64f7"}
Apr 18 02:50:20.705555 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.705426 2574 scope.go:117] "RemoveContainer" containerID="3ba81b428eef68ef8a47eeba4757be60e6c621bdf4a2f908452acd311f47b1ef"
Apr 18 02:50:20.705555 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.705456 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.713101 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.713073 2574 scope.go:117] "RemoveContainer" containerID="6b25f36718a6c203162e5f1c2fc2b3bf1dbfa8b30e62f4f9c8ac9e865032133c"
Apr 18 02:50:20.719651 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.719632 2574 scope.go:117] "RemoveContainer" containerID="37fcd24619d108339690948bf237d51dbf00fe90388a5771a136339a77fdb2d8"
Apr 18 02:50:20.725864 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.725844 2574 scope.go:117] "RemoveContainer" containerID="f717401a587eb78905092f6bdc17599cb619c96b425e462536efee438f64f801"
Apr 18 02:50:20.728186 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.728166 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 18 02:50:20.732081 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.732067 2574 scope.go:117] "RemoveContainer" containerID="8ea81b9d1c8a8adc2408878dff4f227b828258df8b467a6b48f4b699caac970d"
Apr 18 02:50:20.736423 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.736403 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 18 02:50:20.738862 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.738845 2574 scope.go:117] "RemoveContainer" containerID="a2b60f59924afb081d86d2286d22b77a01c0fe1fc321793cdca4e5e86d90cd6a"
Apr 18 02:50:20.746551 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.746525 2574 scope.go:117] "RemoveContainer" containerID="73b6bab90cccb588a659f83bc209a06ce148f6b1b7e4fcc0b53937293b93cd96"
Apr 18 02:50:20.752491 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.752475 2574 scope.go:117] "RemoveContainer" containerID="3ba81b428eef68ef8a47eeba4757be60e6c621bdf4a2f908452acd311f47b1ef"
Apr 18 02:50:20.752724 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:50:20.752705 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ba81b428eef68ef8a47eeba4757be60e6c621bdf4a2f908452acd311f47b1ef\": container with ID starting with 3ba81b428eef68ef8a47eeba4757be60e6c621bdf4a2f908452acd311f47b1ef not found: ID does not exist" containerID="3ba81b428eef68ef8a47eeba4757be60e6c621bdf4a2f908452acd311f47b1ef"
Apr 18 02:50:20.752777 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.752732 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba81b428eef68ef8a47eeba4757be60e6c621bdf4a2f908452acd311f47b1ef"} err="failed to get container status \"3ba81b428eef68ef8a47eeba4757be60e6c621bdf4a2f908452acd311f47b1ef\": rpc error: code = NotFound desc = could not find container \"3ba81b428eef68ef8a47eeba4757be60e6c621bdf4a2f908452acd311f47b1ef\": container with ID starting with 3ba81b428eef68ef8a47eeba4757be60e6c621bdf4a2f908452acd311f47b1ef not found: ID does not exist"
Apr 18 02:50:20.752777 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.752750 2574 scope.go:117] "RemoveContainer" containerID="6b25f36718a6c203162e5f1c2fc2b3bf1dbfa8b30e62f4f9c8ac9e865032133c"
Apr 18 02:50:20.752953 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:50:20.752935 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b25f36718a6c203162e5f1c2fc2b3bf1dbfa8b30e62f4f9c8ac9e865032133c\": container with ID starting with 6b25f36718a6c203162e5f1c2fc2b3bf1dbfa8b30e62f4f9c8ac9e865032133c not found: ID does not exist" containerID="6b25f36718a6c203162e5f1c2fc2b3bf1dbfa8b30e62f4f9c8ac9e865032133c"
Apr 18 02:50:20.753008 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.752965 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b25f36718a6c203162e5f1c2fc2b3bf1dbfa8b30e62f4f9c8ac9e865032133c"} err="failed to get container status \"6b25f36718a6c203162e5f1c2fc2b3bf1dbfa8b30e62f4f9c8ac9e865032133c\": rpc error: code = NotFound desc = could not find container \"6b25f36718a6c203162e5f1c2fc2b3bf1dbfa8b30e62f4f9c8ac9e865032133c\": container with ID starting with 6b25f36718a6c203162e5f1c2fc2b3bf1dbfa8b30e62f4f9c8ac9e865032133c not found: ID does not exist"
Apr 18 02:50:20.753008 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.752992 2574 scope.go:117] "RemoveContainer" containerID="37fcd24619d108339690948bf237d51dbf00fe90388a5771a136339a77fdb2d8"
Apr 18 02:50:20.753170 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:50:20.753154 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37fcd24619d108339690948bf237d51dbf00fe90388a5771a136339a77fdb2d8\": container with ID starting with 37fcd24619d108339690948bf237d51dbf00fe90388a5771a136339a77fdb2d8 not found: ID does not exist" containerID="37fcd24619d108339690948bf237d51dbf00fe90388a5771a136339a77fdb2d8"
Apr 18 02:50:20.753227 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.753174 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37fcd24619d108339690948bf237d51dbf00fe90388a5771a136339a77fdb2d8"} err="failed to get container status \"37fcd24619d108339690948bf237d51dbf00fe90388a5771a136339a77fdb2d8\": rpc error: code = NotFound desc = could not find container \"37fcd24619d108339690948bf237d51dbf00fe90388a5771a136339a77fdb2d8\": container with ID starting with 37fcd24619d108339690948bf237d51dbf00fe90388a5771a136339a77fdb2d8 not found: ID does not exist"
Apr 18 02:50:20.753227 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.753188 2574 scope.go:117] "RemoveContainer" containerID="f717401a587eb78905092f6bdc17599cb619c96b425e462536efee438f64f801"
Apr 18 02:50:20.753484 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:50:20.753469 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f717401a587eb78905092f6bdc17599cb619c96b425e462536efee438f64f801\": container with ID starting with f717401a587eb78905092f6bdc17599cb619c96b425e462536efee438f64f801 not found: ID does not exist" containerID="f717401a587eb78905092f6bdc17599cb619c96b425e462536efee438f64f801"
Apr 18 02:50:20.753542 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.753487 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f717401a587eb78905092f6bdc17599cb619c96b425e462536efee438f64f801"} err="failed to get container status \"f717401a587eb78905092f6bdc17599cb619c96b425e462536efee438f64f801\": rpc error: code = NotFound desc = could not find container \"f717401a587eb78905092f6bdc17599cb619c96b425e462536efee438f64f801\": container with ID starting with f717401a587eb78905092f6bdc17599cb619c96b425e462536efee438f64f801 not found: ID does not exist"
Apr 18 02:50:20.753542 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.753499 2574 scope.go:117] "RemoveContainer" containerID="8ea81b9d1c8a8adc2408878dff4f227b828258df8b467a6b48f4b699caac970d"
Apr 18 02:50:20.753715 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:50:20.753699 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ea81b9d1c8a8adc2408878dff4f227b828258df8b467a6b48f4b699caac970d\": container with ID starting with 8ea81b9d1c8a8adc2408878dff4f227b828258df8b467a6b48f4b699caac970d not found: ID does not exist" containerID="8ea81b9d1c8a8adc2408878dff4f227b828258df8b467a6b48f4b699caac970d"
Apr 18 02:50:20.753762 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.753718 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea81b9d1c8a8adc2408878dff4f227b828258df8b467a6b48f4b699caac970d"} err="failed to get container status \"8ea81b9d1c8a8adc2408878dff4f227b828258df8b467a6b48f4b699caac970d\": rpc error: code = NotFound desc = could not find container \"8ea81b9d1c8a8adc2408878dff4f227b828258df8b467a6b48f4b699caac970d\": container with ID starting with 8ea81b9d1c8a8adc2408878dff4f227b828258df8b467a6b48f4b699caac970d not found: ID does not exist"
Apr 18 02:50:20.753762 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.753731 2574 scope.go:117] "RemoveContainer" containerID="a2b60f59924afb081d86d2286d22b77a01c0fe1fc321793cdca4e5e86d90cd6a"
Apr 18 02:50:20.753948 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:50:20.753932 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2b60f59924afb081d86d2286d22b77a01c0fe1fc321793cdca4e5e86d90cd6a\": container with ID starting with a2b60f59924afb081d86d2286d22b77a01c0fe1fc321793cdca4e5e86d90cd6a not found: ID does not exist" containerID="a2b60f59924afb081d86d2286d22b77a01c0fe1fc321793cdca4e5e86d90cd6a"
Apr 18 02:50:20.753985 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.753953 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b60f59924afb081d86d2286d22b77a01c0fe1fc321793cdca4e5e86d90cd6a"} err="failed to get container status \"a2b60f59924afb081d86d2286d22b77a01c0fe1fc321793cdca4e5e86d90cd6a\": rpc error: code = NotFound desc = could not find container \"a2b60f59924afb081d86d2286d22b77a01c0fe1fc321793cdca4e5e86d90cd6a\": container with ID starting with a2b60f59924afb081d86d2286d22b77a01c0fe1fc321793cdca4e5e86d90cd6a not found: ID does not exist"
Apr 18 02:50:20.753985 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.753968 2574 scope.go:117] "RemoveContainer" containerID="73b6bab90cccb588a659f83bc209a06ce148f6b1b7e4fcc0b53937293b93cd96"
Apr 18 02:50:20.754187 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:50:20.754171 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73b6bab90cccb588a659f83bc209a06ce148f6b1b7e4fcc0b53937293b93cd96\": container with ID starting with 73b6bab90cccb588a659f83bc209a06ce148f6b1b7e4fcc0b53937293b93cd96 not found: ID does not exist" containerID="73b6bab90cccb588a659f83bc209a06ce148f6b1b7e4fcc0b53937293b93cd96"
Apr 18 02:50:20.754232 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.754191 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73b6bab90cccb588a659f83bc209a06ce148f6b1b7e4fcc0b53937293b93cd96"} err="failed to get container status \"73b6bab90cccb588a659f83bc209a06ce148f6b1b7e4fcc0b53937293b93cd96\": rpc error: code = NotFound desc = could not find container \"73b6bab90cccb588a659f83bc209a06ce148f6b1b7e4fcc0b53937293b93cd96\": container with ID starting with 73b6bab90cccb588a659f83bc209a06ce148f6b1b7e4fcc0b53937293b93cd96 not found: ID does not exist"
Apr 18 02:50:20.776967 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.776940 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 18 02:50:20.777315 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777274 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="kube-rbac-proxy-metric"
Apr 18 02:50:20.777315 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777294 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="kube-rbac-proxy-metric"
Apr 18 02:50:20.777482 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777327 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="kube-rbac-proxy"
Apr 18 02:50:20.777482 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777335 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="kube-rbac-proxy"
Apr 18 02:50:20.777482 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777350 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="init-config-reloader"
Apr 18 02:50:20.777482 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777358 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="init-config-reloader"
Apr 18 02:50:20.777482 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777375 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="kube-rbac-proxy-web"
Apr 18 02:50:20.777482 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777383 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="kube-rbac-proxy-web"
Apr 18 02:50:20.777482 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777400 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="config-reloader"
Apr 18 02:50:20.777482 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777408 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="config-reloader"
Apr 18 02:50:20.777482 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777417 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="prom-label-proxy"
Apr 18 02:50:20.777482 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777425 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="prom-label-proxy"
Apr 18 02:50:20.777482 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777436 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="alertmanager"
Apr 18 02:50:20.777482 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777444 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="alertmanager"
Apr 18 02:50:20.778144 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777517 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="config-reloader"
Apr 18 02:50:20.778144 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777531 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="kube-rbac-proxy-web"
Apr 18 02:50:20.778144 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777542 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="alertmanager"
Apr 18 02:50:20.778144 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777552 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="prom-label-proxy"
Apr 18 02:50:20.778144 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777564 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="kube-rbac-proxy"
Apr 18 02:50:20.778144 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.777573 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" containerName="kube-rbac-proxy-metric"
Apr 18 02:50:20.782960 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.782942 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.785742 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.785726 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 18 02:50:20.785839 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.785737 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 18 02:50:20.785839 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.785728 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 18 02:50:20.785839 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.785725 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 18 02:50:20.785980 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.785854 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 18 02:50:20.785980 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.785923 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 18 02:50:20.785980 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.785924 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 18 02:50:20.785980 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.785975 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 18 02:50:20.786400 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.786384 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-dl4dx\""
Apr 18 02:50:20.791364 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.791321 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 18 02:50:20.793570 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.793550 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 18 02:50:20.814450 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.814420 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-config-out\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.814618 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.814459 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.814618 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.814484 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-web-config\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.814618 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.814507 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.814618 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.814553 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.814618 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.814579 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg5th\" (UniqueName: \"kubernetes.io/projected/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-kube-api-access-gg5th\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.814618 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.814607 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.814850 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.814632 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-config-volume\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.814850 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.814675 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.814850 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.814704 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.814850 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.814741 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.814850 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.814769 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.814850 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.814787 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.881759 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.881726 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f5186b7-6ddb-49b0-82f9-e55360f1c0e2" path="/var/lib/kubelet/pods/3f5186b7-6ddb-49b0-82f9-e55360f1c0e2/volumes"
Apr 18 02:50:20.915379 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.915353 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.915479 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.915385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gg5th\" (UniqueName: \"kubernetes.io/projected/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-kube-api-access-gg5th\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.915479 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.915407 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.915479 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.915427 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-config-volume\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.915479 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.915446 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.915479 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.915465 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.915706 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.915483 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.915706 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.915499 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 18 02:50:20.915706 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.915517 2574 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:20.915706 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.915567 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-config-out\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:20.915706 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.915592 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:20.915706 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.915616 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-web-config\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:20.915706 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.915642 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:20.916116 ip-10-0-129-229 
kubenswrapper[2574]: I0418 02:50:20.916092 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:20.916359 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.916332 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:20.916958 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.916885 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:20.918634 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.918553 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-config-volume\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:20.918634 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.918596 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
18 02:50:20.918766 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.918697 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:20.918980 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.918955 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:20.919083 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.919059 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:20.919168 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.919150 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:20.919226 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.919209 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-config-out\") pod \"alertmanager-main-0\" (UID: 
\"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:20.919873 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.919851 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:20.920202 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.920188 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-web-config\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:20.924115 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:20.924091 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg5th\" (UniqueName: \"kubernetes.io/projected/b0bb71d9-00ac-459d-ae0e-8903fc82fbac-kube-api-access-gg5th\") pod \"alertmanager-main-0\" (UID: \"b0bb71d9-00ac-459d-ae0e-8903fc82fbac\") " pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:21.099148 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:21.099113 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 18 02:50:21.220365 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:21.220335 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 18 02:50:21.223202 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:50:21.223176 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0bb71d9_00ac_459d_ae0e_8903fc82fbac.slice/crio-fa6afe9837b8db5055ab22318f3114e6610d9a160632531918bf51ad0ac59137 WatchSource:0}: Error finding container fa6afe9837b8db5055ab22318f3114e6610d9a160632531918bf51ad0ac59137: Status 404 returned error can't find the container with id fa6afe9837b8db5055ab22318f3114e6610d9a160632531918bf51ad0ac59137 Apr 18 02:50:21.709887 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:21.709849 2574 generic.go:358] "Generic (PLEG): container finished" podID="b0bb71d9-00ac-459d-ae0e-8903fc82fbac" containerID="04f2671a750d7c2f7be932b0682caece7200680fbbbd76aa7f74ea231be302fd" exitCode=0 Apr 18 02:50:21.710034 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:21.709915 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b0bb71d9-00ac-459d-ae0e-8903fc82fbac","Type":"ContainerDied","Data":"04f2671a750d7c2f7be932b0682caece7200680fbbbd76aa7f74ea231be302fd"} Apr 18 02:50:21.710034 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:21.709939 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b0bb71d9-00ac-459d-ae0e-8903fc82fbac","Type":"ContainerStarted","Data":"fa6afe9837b8db5055ab22318f3114e6610d9a160632531918bf51ad0ac59137"} Apr 18 02:50:22.715639 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.715603 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"b0bb71d9-00ac-459d-ae0e-8903fc82fbac","Type":"ContainerStarted","Data":"c53a3a81aa716dd9921bbd636cad4c067d1458e22f5419544414ac81bf03df4d"} Apr 18 02:50:22.715989 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.715647 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b0bb71d9-00ac-459d-ae0e-8903fc82fbac","Type":"ContainerStarted","Data":"43df2e43c553e7099bf518620b5409341f60079a087b934300de010ffdbeeb5b"} Apr 18 02:50:22.715989 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.715661 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b0bb71d9-00ac-459d-ae0e-8903fc82fbac","Type":"ContainerStarted","Data":"b1db8cd51838fdcaaa7da88f1f660d808d23fb6f4feed645d84406efb42c3215"} Apr 18 02:50:22.715989 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.715672 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b0bb71d9-00ac-459d-ae0e-8903fc82fbac","Type":"ContainerStarted","Data":"326890b240a2820662530b7a725bfb6d342426194a45ef617091a3085bee9903"} Apr 18 02:50:22.715989 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.715684 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b0bb71d9-00ac-459d-ae0e-8903fc82fbac","Type":"ContainerStarted","Data":"41c4644192b3739b818eb595c6e7fabf9a00157cf73e4b479364df99eb81fc4c"} Apr 18 02:50:22.715989 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.715696 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b0bb71d9-00ac-459d-ae0e-8903fc82fbac","Type":"ContainerStarted","Data":"a33de837c32be7e26cae33ea6f4016cf30b6ec9c2def5b0cf78ac5ff386fde92"} Apr 18 02:50:22.735154 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.735124 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/telemeter-client-5dfccf456c-x8f55"] Apr 18 02:50:22.738511 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.738495 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.741124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.741090 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-cf4v6\"" Apr 18 02:50:22.741124 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.741113 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 18 02:50:22.741295 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.741176 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 18 02:50:22.741495 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.741479 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 18 02:50:22.741556 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.741523 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 18 02:50:22.741556 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.741541 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 18 02:50:22.745734 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.745691 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.745680001 podStartE2EDuration="2.745680001s" podCreationTimestamp="2026-04-18 02:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:50:22.744987328 +0000 UTC m=+260.468541239" watchObservedRunningTime="2026-04-18 02:50:22.745680001 +0000 UTC m=+260.469233896" Apr 18 02:50:22.747582 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.747557 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 18 02:50:22.748635 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.748616 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5dfccf456c-x8f55"] Apr 18 02:50:22.830175 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.830142 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f58fbfc9-fb74-4fef-a934-0dc44a590232-federate-client-tls\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.830411 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.830189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f58fbfc9-fb74-4fef-a934-0dc44a590232-telemeter-client-tls\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.830411 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.830209 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f58fbfc9-fb74-4fef-a934-0dc44a590232-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: 
\"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.830411 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.830280 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pfgn\" (UniqueName: \"kubernetes.io/projected/f58fbfc9-fb74-4fef-a934-0dc44a590232-kube-api-access-4pfgn\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.830411 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.830345 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f58fbfc9-fb74-4fef-a934-0dc44a590232-secret-telemeter-client\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.830411 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.830378 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f58fbfc9-fb74-4fef-a934-0dc44a590232-metrics-client-ca\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.830411 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.830400 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f58fbfc9-fb74-4fef-a934-0dc44a590232-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.830651 ip-10-0-129-229 
kubenswrapper[2574]: I0418 02:50:22.830470 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f58fbfc9-fb74-4fef-a934-0dc44a590232-serving-certs-ca-bundle\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.930815 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.930765 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f58fbfc9-fb74-4fef-a934-0dc44a590232-telemeter-client-tls\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.930815 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.930819 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f58fbfc9-fb74-4fef-a934-0dc44a590232-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.931076 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.930865 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pfgn\" (UniqueName: \"kubernetes.io/projected/f58fbfc9-fb74-4fef-a934-0dc44a590232-kube-api-access-4pfgn\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.931076 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.930890 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" 
(UniqueName: \"kubernetes.io/secret/f58fbfc9-fb74-4fef-a934-0dc44a590232-secret-telemeter-client\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.931076 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.930916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f58fbfc9-fb74-4fef-a934-0dc44a590232-metrics-client-ca\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.931076 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.930938 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f58fbfc9-fb74-4fef-a934-0dc44a590232-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.931076 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.930991 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f58fbfc9-fb74-4fef-a934-0dc44a590232-serving-certs-ca-bundle\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.931076 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.931025 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f58fbfc9-fb74-4fef-a934-0dc44a590232-federate-client-tls\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " 
pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.931761 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.931732 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f58fbfc9-fb74-4fef-a934-0dc44a590232-metrics-client-ca\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.931890 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.931732 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f58fbfc9-fb74-4fef-a934-0dc44a590232-serving-certs-ca-bundle\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.931890 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.931828 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f58fbfc9-fb74-4fef-a934-0dc44a590232-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.933405 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.933383 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f58fbfc9-fb74-4fef-a934-0dc44a590232-telemeter-client-tls\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.933749 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.933729 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: 
\"kubernetes.io/secret/f58fbfc9-fb74-4fef-a934-0dc44a590232-secret-telemeter-client\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.933820 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.933734 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f58fbfc9-fb74-4fef-a934-0dc44a590232-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.933895 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.933878 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f58fbfc9-fb74-4fef-a934-0dc44a590232-federate-client-tls\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:22.938019 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:22.937997 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pfgn\" (UniqueName: \"kubernetes.io/projected/f58fbfc9-fb74-4fef-a934-0dc44a590232-kube-api-access-4pfgn\") pod \"telemeter-client-5dfccf456c-x8f55\" (UID: \"f58fbfc9-fb74-4fef-a934-0dc44a590232\") " pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" Apr 18 02:50:23.049252 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:23.049222 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55"
Apr 18 02:50:23.167172 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:23.167068 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5dfccf456c-x8f55"]
Apr 18 02:50:23.169313 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:50:23.169277 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf58fbfc9_fb74_4fef_a934_0dc44a590232.slice/crio-f0f7589c14430b2c4acd63a9029a8ec3eb8bbb5380102a96d99bc71ee5e45697 WatchSource:0}: Error finding container f0f7589c14430b2c4acd63a9029a8ec3eb8bbb5380102a96d99bc71ee5e45697: Status 404 returned error can't find the container with id f0f7589c14430b2c4acd63a9029a8ec3eb8bbb5380102a96d99bc71ee5e45697
Apr 18 02:50:23.719600 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:23.719558 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" event={"ID":"f58fbfc9-fb74-4fef-a934-0dc44a590232","Type":"ContainerStarted","Data":"f0f7589c14430b2c4acd63a9029a8ec3eb8bbb5380102a96d99bc71ee5e45697"}
Apr 18 02:50:25.728583 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:25.728548 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" event={"ID":"f58fbfc9-fb74-4fef-a934-0dc44a590232","Type":"ContainerStarted","Data":"7edea3892cf40c3c30341fa4ea4c60fa4ef84c49b0792dc5a706aaad8f21a1ab"}
Apr 18 02:50:25.728583 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:25.728587 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" event={"ID":"f58fbfc9-fb74-4fef-a934-0dc44a590232","Type":"ContainerStarted","Data":"adc1ffc10254362a779f524d9f190be16a7271858acac18745d9d55f15b77195"}
Apr 18 02:50:25.729057 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:25.728599 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" event={"ID":"f58fbfc9-fb74-4fef-a934-0dc44a590232","Type":"ContainerStarted","Data":"47ec8c663b1478825819c6b4afcf13e61de5d845255ca3b71e9e9ea4ea87ba4e"}
Apr 18 02:50:25.748274 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:50:25.748230 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5dfccf456c-x8f55" podStartSLOduration=2.113038421 podStartE2EDuration="3.74821733s" podCreationTimestamp="2026-04-18 02:50:22 +0000 UTC" firstStartedPulling="2026-04-18 02:50:23.171158263 +0000 UTC m=+260.894712141" lastFinishedPulling="2026-04-18 02:50:24.806337174 +0000 UTC m=+262.529891050" observedRunningTime="2026-04-18 02:50:25.74689128 +0000 UTC m=+263.470445174" watchObservedRunningTime="2026-04-18 02:50:25.74821733 +0000 UTC m=+263.471771224"
Apr 18 02:51:02.775335 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:51:02.775285 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log"
Apr 18 02:51:02.776532 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:51:02.776502 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log"
Apr 18 02:51:02.782420 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:51:02.782404 2574 kubelet.go:1628] "Image garbage collection succeeded"
Apr 18 02:52:26.622706 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:26.622655 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-mnwcq"]
Apr 18 02:52:26.625337 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:26.625318 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mnwcq"
Apr 18 02:52:26.627536 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:26.627516 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 18 02:52:26.631906 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:26.631885 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mnwcq"]
Apr 18 02:52:26.675233 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:26.675201 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b1fc92ab-29d4-4bbd-b21d-46be592b4ea0-dbus\") pod \"global-pull-secret-syncer-mnwcq\" (UID: \"b1fc92ab-29d4-4bbd-b21d-46be592b4ea0\") " pod="kube-system/global-pull-secret-syncer-mnwcq"
Apr 18 02:52:26.675392 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:26.675262 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b1fc92ab-29d4-4bbd-b21d-46be592b4ea0-kubelet-config\") pod \"global-pull-secret-syncer-mnwcq\" (UID: \"b1fc92ab-29d4-4bbd-b21d-46be592b4ea0\") " pod="kube-system/global-pull-secret-syncer-mnwcq"
Apr 18 02:52:26.675392 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:26.675351 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b1fc92ab-29d4-4bbd-b21d-46be592b4ea0-original-pull-secret\") pod \"global-pull-secret-syncer-mnwcq\" (UID: \"b1fc92ab-29d4-4bbd-b21d-46be592b4ea0\") " pod="kube-system/global-pull-secret-syncer-mnwcq"
Apr 18 02:52:26.776671 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:26.776636 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName:
\"kubernetes.io/host-path/b1fc92ab-29d4-4bbd-b21d-46be592b4ea0-kubelet-config\") pod \"global-pull-secret-syncer-mnwcq\" (UID: \"b1fc92ab-29d4-4bbd-b21d-46be592b4ea0\") " pod="kube-system/global-pull-secret-syncer-mnwcq"
Apr 18 02:52:26.776819 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:26.776688 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b1fc92ab-29d4-4bbd-b21d-46be592b4ea0-original-pull-secret\") pod \"global-pull-secret-syncer-mnwcq\" (UID: \"b1fc92ab-29d4-4bbd-b21d-46be592b4ea0\") " pod="kube-system/global-pull-secret-syncer-mnwcq"
Apr 18 02:52:26.776819 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:26.776738 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b1fc92ab-29d4-4bbd-b21d-46be592b4ea0-dbus\") pod \"global-pull-secret-syncer-mnwcq\" (UID: \"b1fc92ab-29d4-4bbd-b21d-46be592b4ea0\") " pod="kube-system/global-pull-secret-syncer-mnwcq"
Apr 18 02:52:26.776819 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:26.776767 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b1fc92ab-29d4-4bbd-b21d-46be592b4ea0-kubelet-config\") pod \"global-pull-secret-syncer-mnwcq\" (UID: \"b1fc92ab-29d4-4bbd-b21d-46be592b4ea0\") " pod="kube-system/global-pull-secret-syncer-mnwcq"
Apr 18 02:52:26.776932 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:26.776892 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b1fc92ab-29d4-4bbd-b21d-46be592b4ea0-dbus\") pod \"global-pull-secret-syncer-mnwcq\" (UID: \"b1fc92ab-29d4-4bbd-b21d-46be592b4ea0\") " pod="kube-system/global-pull-secret-syncer-mnwcq"
Apr 18 02:52:26.779070 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:26.779043 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b1fc92ab-29d4-4bbd-b21d-46be592b4ea0-original-pull-secret\") pod \"global-pull-secret-syncer-mnwcq\" (UID: \"b1fc92ab-29d4-4bbd-b21d-46be592b4ea0\") " pod="kube-system/global-pull-secret-syncer-mnwcq"
Apr 18 02:52:26.935002 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:26.934915 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mnwcq"
Apr 18 02:52:27.044437 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:27.044408 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mnwcq"]
Apr 18 02:52:27.048004 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:52:27.047978 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1fc92ab_29d4_4bbd_b21d_46be592b4ea0.slice/crio-e4e5886cd1dea378ddc9938172f708f014b96ad42530473ba2ef932f58887308 WatchSource:0}: Error finding container e4e5886cd1dea378ddc9938172f708f014b96ad42530473ba2ef932f58887308: Status 404 returned error can't find the container with id e4e5886cd1dea378ddc9938172f708f014b96ad42530473ba2ef932f58887308
Apr 18 02:52:27.049597 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:27.049581 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 18 02:52:27.059107 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:27.059082 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mnwcq" event={"ID":"b1fc92ab-29d4-4bbd-b21d-46be592b4ea0","Type":"ContainerStarted","Data":"e4e5886cd1dea378ddc9938172f708f014b96ad42530473ba2ef932f58887308"}
Apr 18 02:52:31.072616 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:31.072570 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mnwcq" event={"ID":"b1fc92ab-29d4-4bbd-b21d-46be592b4ea0","Type":"ContainerStarted","Data":"b01148c711d174463c2898e9018100aca82e58852a64961a0f21f076f741eaba"}
Apr 18 02:52:31.086478 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:52:31.086432 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-mnwcq" podStartSLOduration=1.277914614 podStartE2EDuration="5.086417949s" podCreationTimestamp="2026-04-18 02:52:26 +0000 UTC" firstStartedPulling="2026-04-18 02:52:27.049735745 +0000 UTC m=+384.773289619" lastFinishedPulling="2026-04-18 02:52:30.85823908 +0000 UTC m=+388.581792954" observedRunningTime="2026-04-18 02:52:31.085158167 +0000 UTC m=+388.808712062" watchObservedRunningTime="2026-04-18 02:52:31.086417949 +0000 UTC m=+388.809971840"
Apr 18 02:53:12.551397 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:12.551319 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-jwxps"]
Apr 18 02:53:12.554795 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:12.554776 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-jwxps"
Apr 18 02:53:12.557253 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:12.557230 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 18 02:53:12.558338 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:12.558321 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 18 02:53:12.558395 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:12.558326 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-gbpbc\""
Apr 18 02:53:12.563511 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:12.563485 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-jwxps"]
Apr 18 02:53:12.641973 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:12.641938 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39a50d53-48b7-4d7c-8c8e-3bfd7dc7126c-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-jwxps\" (UID: \"39a50d53-48b7-4d7c-8c8e-3bfd7dc7126c\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jwxps"
Apr 18 02:53:12.642148 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:12.642007 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbmjh\" (UniqueName: \"kubernetes.io/projected/39a50d53-48b7-4d7c-8c8e-3bfd7dc7126c-kube-api-access-xbmjh\") pod \"cert-manager-webhook-597b96b99b-jwxps\" (UID: \"39a50d53-48b7-4d7c-8c8e-3bfd7dc7126c\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jwxps"
Apr 18 02:53:12.742804 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:12.742753 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39a50d53-48b7-4d7c-8c8e-3bfd7dc7126c-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-jwxps\" (UID: \"39a50d53-48b7-4d7c-8c8e-3bfd7dc7126c\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jwxps"
Apr 18 02:53:12.742972 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:12.742834 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbmjh\" (UniqueName: \"kubernetes.io/projected/39a50d53-48b7-4d7c-8c8e-3bfd7dc7126c-kube-api-access-xbmjh\") pod \"cert-manager-webhook-597b96b99b-jwxps\" (UID: \"39a50d53-48b7-4d7c-8c8e-3bfd7dc7126c\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jwxps"
Apr 18 02:53:12.751772 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:12.751738 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39a50d53-48b7-4d7c-8c8e-3bfd7dc7126c-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-jwxps\" (UID: \"39a50d53-48b7-4d7c-8c8e-3bfd7dc7126c\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jwxps"
Apr 18 02:53:12.751885 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:12.751856 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbmjh\" (UniqueName: \"kubernetes.io/projected/39a50d53-48b7-4d7c-8c8e-3bfd7dc7126c-kube-api-access-xbmjh\") pod \"cert-manager-webhook-597b96b99b-jwxps\" (UID: \"39a50d53-48b7-4d7c-8c8e-3bfd7dc7126c\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jwxps"
Apr 18 02:53:12.864701 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:12.864607 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-jwxps"
Apr 18 02:53:12.983490 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:12.983462 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-jwxps"]
Apr 18 02:53:12.986087 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:53:12.986064 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39a50d53_48b7_4d7c_8c8e_3bfd7dc7126c.slice/crio-9a15610899994eff22b2939c9afa7285a9687d3af67a0d9e6848c951bc38780b WatchSource:0}: Error finding container 9a15610899994eff22b2939c9afa7285a9687d3af67a0d9e6848c951bc38780b: Status 404 returned error can't find the container with id 9a15610899994eff22b2939c9afa7285a9687d3af67a0d9e6848c951bc38780b
Apr 18 02:53:13.190638 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:13.190555 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-jwxps" event={"ID":"39a50d53-48b7-4d7c-8c8e-3bfd7dc7126c","Type":"ContainerStarted","Data":"9a15610899994eff22b2939c9afa7285a9687d3af67a0d9e6848c951bc38780b"}
Apr 18 02:53:16.201468 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:16.201430 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-jwxps" event={"ID":"39a50d53-48b7-4d7c-8c8e-3bfd7dc7126c","Type":"ContainerStarted","Data":"64f9de690fa44a557ab8f456e5884fa5b3406d129129bac20c86dcf479d75730"}
Apr 18 02:53:16.201837 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:16.201492 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-jwxps"
Apr 18 02:53:16.218465 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:16.218409 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-jwxps" podStartSLOduration=1.091000314
podStartE2EDuration="4.218390614s" podCreationTimestamp="2026-04-18 02:53:12 +0000 UTC" firstStartedPulling="2026-04-18 02:53:12.988246863 +0000 UTC m=+430.711800737" lastFinishedPulling="2026-04-18 02:53:16.115637161 +0000 UTC m=+433.839191037" observedRunningTime="2026-04-18 02:53:16.217816472 +0000 UTC m=+433.941370379" watchObservedRunningTime="2026-04-18 02:53:16.218390614 +0000 UTC m=+433.941944511"
Apr 18 02:53:22.206944 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:22.206916 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-jwxps"
Apr 18 02:53:25.830901 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:25.830866 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-nmrqg"]
Apr 18 02:53:25.834185 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:25.834166 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-nmrqg"
Apr 18 02:53:25.836686 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:25.836663 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-8l85t\""
Apr 18 02:53:25.843695 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:25.843670 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-nmrqg"]
Apr 18 02:53:25.954199 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:25.954164 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d75a1810-b5f7-4a6e-a06b-3780a5293a7c-bound-sa-token\") pod \"cert-manager-759f64656b-nmrqg\" (UID: \"d75a1810-b5f7-4a6e-a06b-3780a5293a7c\") " pod="cert-manager/cert-manager-759f64656b-nmrqg"
Apr 18 02:53:25.954371 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:25.954226 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db4zh\" (UniqueName: \"kubernetes.io/projected/d75a1810-b5f7-4a6e-a06b-3780a5293a7c-kube-api-access-db4zh\") pod \"cert-manager-759f64656b-nmrqg\" (UID: \"d75a1810-b5f7-4a6e-a06b-3780a5293a7c\") " pod="cert-manager/cert-manager-759f64656b-nmrqg"
Apr 18 02:53:26.055281 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:26.055250 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-db4zh\" (UniqueName: \"kubernetes.io/projected/d75a1810-b5f7-4a6e-a06b-3780a5293a7c-kube-api-access-db4zh\") pod \"cert-manager-759f64656b-nmrqg\" (UID: \"d75a1810-b5f7-4a6e-a06b-3780a5293a7c\") " pod="cert-manager/cert-manager-759f64656b-nmrqg"
Apr 18 02:53:26.055447 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:26.055312 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d75a1810-b5f7-4a6e-a06b-3780a5293a7c-bound-sa-token\") pod \"cert-manager-759f64656b-nmrqg\" (UID: \"d75a1810-b5f7-4a6e-a06b-3780a5293a7c\") " pod="cert-manager/cert-manager-759f64656b-nmrqg"
Apr 18 02:53:26.063942 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:26.063910 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d75a1810-b5f7-4a6e-a06b-3780a5293a7c-bound-sa-token\") pod \"cert-manager-759f64656b-nmrqg\" (UID: \"d75a1810-b5f7-4a6e-a06b-3780a5293a7c\") " pod="cert-manager/cert-manager-759f64656b-nmrqg"
Apr 18 02:53:26.064037 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:26.064023 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-db4zh\" (UniqueName: \"kubernetes.io/projected/d75a1810-b5f7-4a6e-a06b-3780a5293a7c-kube-api-access-db4zh\") pod \"cert-manager-759f64656b-nmrqg\" (UID: \"d75a1810-b5f7-4a6e-a06b-3780a5293a7c\") " pod="cert-manager/cert-manager-759f64656b-nmrqg"
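The `pod_startup_latency_tracker` entries in this log report derived durations. As a cross-check (a minimal sketch, not part of the log itself): `podStartE2EDuration` is `watchObservedRunningTime - podCreationTimestamp`, and `podStartSLOduration` appears to be that figure minus the image-pull window (`lastFinishedPulling - firstStartedPulling`, most precisely taken from the monotonic `m=+` offsets). Reproducing this with the telemeter-client figures logged earlier:

```python
from datetime import datetime

# Timestamps copied from the telemeter-client pod_startup_latency_tracker entry.
fmt = "%Y-%m-%d %H:%M:%S.%f %z"
created  = datetime.strptime("2026-04-18 02:50:22.000000 +0000", fmt)  # podCreationTimestamp
observed = datetime.strptime("2026-04-18 02:50:25.748217 +0000", fmt)  # watchObservedRunningTime
# Monotonic (m=+...) offsets bracketing the image pull:
first_pull_mono, last_pull_mono = 260.894712141, 262.529891050

e2e = (observed - created).total_seconds()       # ≈ podStartE2EDuration ("3.74821733s")
slo = e2e - (last_pull_mono - first_pull_mono)   # ≈ podStartSLOduration (2.113038421)
```

The small residual versus the logged values comes from the sub-microsecond precision of the original timestamps; the relationship between the three fields is what matters here.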
Apr 18 02:53:26.142981 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:26.142894 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-nmrqg"
Apr 18 02:53:26.258526 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:26.258500 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-nmrqg"]
Apr 18 02:53:26.260963 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:53:26.260930 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd75a1810_b5f7_4a6e_a06b_3780a5293a7c.slice/crio-78f361b15ff973163d9722cdb6d0717a49516a2314db171a80e5a4f8a3df84c9 WatchSource:0}: Error finding container 78f361b15ff973163d9722cdb6d0717a49516a2314db171a80e5a4f8a3df84c9: Status 404 returned error can't find the container with id 78f361b15ff973163d9722cdb6d0717a49516a2314db171a80e5a4f8a3df84c9
Apr 18 02:53:27.235417 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:27.235381 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-nmrqg" event={"ID":"d75a1810-b5f7-4a6e-a06b-3780a5293a7c","Type":"ContainerStarted","Data":"54366a7b50d579ce2f4381d70e5d83d64825950923e7aafb4741d2dea6da5a2d"}
Apr 18 02:53:27.235785 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:27.235426 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-nmrqg" event={"ID":"d75a1810-b5f7-4a6e-a06b-3780a5293a7c","Type":"ContainerStarted","Data":"78f361b15ff973163d9722cdb6d0717a49516a2314db171a80e5a4f8a3df84c9"}
Apr 18 02:53:27.252481 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:27.252433 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-nmrqg" podStartSLOduration=2.252415991 podStartE2EDuration="2.252415991s" podCreationTimestamp="2026-04-18 02:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:53:27.250112641 +0000 UTC m=+444.973666547" watchObservedRunningTime="2026-04-18 02:53:27.252415991 +0000 UTC m=+444.975969886"
Apr 18 02:53:34.658012 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.657977 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb"]
Apr 18 02:53:34.661724 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.661704 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb"
Apr 18 02:53:34.664106 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.664081 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 18 02:53:34.664262 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.664245 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 18 02:53:34.664422 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.664368 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-9x27k\""
Apr 18 02:53:34.664616 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.664597 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 18 02:53:34.664707 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.664620 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 18 02:53:34.671705 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.671682 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb"]
Apr 18 02:53:34.723921 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.723880 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5tlw\" (UniqueName: \"kubernetes.io/projected/43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc-kube-api-access-m5tlw\") pod \"opendatahub-operator-controller-manager-b6bf46549-wxnvb\" (UID: \"43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb"
Apr 18 02:53:34.723921 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.723925 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-wxnvb\" (UID: \"43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb"
Apr 18 02:53:34.724128 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.724025 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc-webhook-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-wxnvb\" (UID: \"43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb"
Apr 18 02:53:34.824727 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.824689 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc-webhook-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-wxnvb\" (UID: \"43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb"
Apr 18 02:53:34.824898 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.824753 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5tlw\" (UniqueName: \"kubernetes.io/projected/43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc-kube-api-access-m5tlw\") pod \"opendatahub-operator-controller-manager-b6bf46549-wxnvb\" (UID: \"43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb"
Apr 18 02:53:34.824898 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.824779 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-wxnvb\" (UID: \"43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb"
Apr 18 02:53:34.827191 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.827160 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc-webhook-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-wxnvb\" (UID: \"43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb"
Apr 18 02:53:34.827349 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.827190 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b6bf46549-wxnvb\" (UID: \"43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb"
Apr 18 02:53:34.833107 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.833085 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5tlw\" (UniqueName:
\"kubernetes.io/projected/43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc-kube-api-access-m5tlw\") pod \"opendatahub-operator-controller-manager-b6bf46549-wxnvb\" (UID: \"43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc\") " pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb"
Apr 18 02:53:34.972339 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:34.972220 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb"
Apr 18 02:53:35.102545 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:35.102511 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb"]
Apr 18 02:53:35.105810 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:53:35.105775 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43a43cbc_36cb_4386_a2a6_c1d3b65dc1cc.slice/crio-ac776c73fdda872780342c53da6e508ace393515e237a1892bd1c3c8efd34c9c WatchSource:0}: Error finding container ac776c73fdda872780342c53da6e508ace393515e237a1892bd1c3c8efd34c9c: Status 404 returned error can't find the container with id ac776c73fdda872780342c53da6e508ace393515e237a1892bd1c3c8efd34c9c
Apr 18 02:53:35.259290 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:35.259260 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb" event={"ID":"43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc","Type":"ContainerStarted","Data":"ac776c73fdda872780342c53da6e508ace393515e237a1892bd1c3c8efd34c9c"}
Apr 18 02:53:38.270736 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:38.270685 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb" event={"ID":"43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc","Type":"ContainerStarted","Data":"a77f74145dabd092de49c0631f5adadcd2b4d117669f63c33fcb37373def0cf7"}
Apr 18 02:53:38.271121 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:38.270831 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb"
Apr 18 02:53:38.292743 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:38.292684 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb" podStartSLOduration=1.758576519 podStartE2EDuration="4.292666333s" podCreationTimestamp="2026-04-18 02:53:34 +0000 UTC" firstStartedPulling="2026-04-18 02:53:35.107370959 +0000 UTC m=+452.830924833" lastFinishedPulling="2026-04-18 02:53:37.641460775 +0000 UTC m=+455.365014647" observedRunningTime="2026-04-18 02:53:38.29089601 +0000 UTC m=+456.014449906" watchObservedRunningTime="2026-04-18 02:53:38.292666333 +0000 UTC m=+456.016220229"
Apr 18 02:53:49.275923 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:49.275893 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-b6bf46549-wxnvb"
Apr 18 02:53:56.727549 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.727512 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w"]
Apr 18 02:53:56.735311 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.735275 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w"
Apr 18 02:53:56.738158 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.738130 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 18 02:53:56.738158 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.738156 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 18 02:53:56.739248 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.739231 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-pnqpv\""
Apr 18 02:53:56.739542 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.739517 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 18 02:53:56.739642 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.739618 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 18 02:53:56.739761 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.739666 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 18 02:53:56.749421 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.749397 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w"]
Apr 18 02:53:56.810202 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.810174 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e939e98f-077c-4bc7-867b-395379862646-cert\") pod \"lws-controller-manager-5dd789dc9-m6q4w\" (UID: \"e939e98f-077c-4bc7-867b-395379862646\") "
pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w"
Apr 18 02:53:56.810380 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.810256 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbmtf\" (UniqueName: \"kubernetes.io/projected/e939e98f-077c-4bc7-867b-395379862646-kube-api-access-sbmtf\") pod \"lws-controller-manager-5dd789dc9-m6q4w\" (UID: \"e939e98f-077c-4bc7-867b-395379862646\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w"
Apr 18 02:53:56.810380 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.810325 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e939e98f-077c-4bc7-867b-395379862646-metrics-cert\") pod \"lws-controller-manager-5dd789dc9-m6q4w\" (UID: \"e939e98f-077c-4bc7-867b-395379862646\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w"
Apr 18 02:53:56.810380 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.810355 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e939e98f-077c-4bc7-867b-395379862646-manager-config\") pod \"lws-controller-manager-5dd789dc9-m6q4w\" (UID: \"e939e98f-077c-4bc7-867b-395379862646\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w"
Apr 18 02:53:56.910829 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.910786 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbmtf\" (UniqueName: \"kubernetes.io/projected/e939e98f-077c-4bc7-867b-395379862646-kube-api-access-sbmtf\") pod \"lws-controller-manager-5dd789dc9-m6q4w\" (UID: \"e939e98f-077c-4bc7-867b-395379862646\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w"
Apr 18 02:53:56.911027 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.910842 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e939e98f-077c-4bc7-867b-395379862646-metrics-cert\") pod \"lws-controller-manager-5dd789dc9-m6q4w\" (UID: \"e939e98f-077c-4bc7-867b-395379862646\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w"
Apr 18 02:53:56.911027 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.910870 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e939e98f-077c-4bc7-867b-395379862646-manager-config\") pod \"lws-controller-manager-5dd789dc9-m6q4w\" (UID: \"e939e98f-077c-4bc7-867b-395379862646\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w"
Apr 18 02:53:56.911027 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.910952 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e939e98f-077c-4bc7-867b-395379862646-cert\") pod \"lws-controller-manager-5dd789dc9-m6q4w\" (UID: \"e939e98f-077c-4bc7-867b-395379862646\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w"
Apr 18 02:53:56.911601 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.911571 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e939e98f-077c-4bc7-867b-395379862646-manager-config\") pod \"lws-controller-manager-5dd789dc9-m6q4w\" (UID: \"e939e98f-077c-4bc7-867b-395379862646\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w"
Apr 18 02:53:56.913415 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.913387 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e939e98f-077c-4bc7-867b-395379862646-cert\") pod \"lws-controller-manager-5dd789dc9-m6q4w\" (UID: \"e939e98f-077c-4bc7-867b-395379862646\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w"
Apr 18 02:53:56.913541 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.913507 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e939e98f-077c-4bc7-867b-395379862646-metrics-cert\") pod \"lws-controller-manager-5dd789dc9-m6q4w\" (UID: \"e939e98f-077c-4bc7-867b-395379862646\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w"
Apr 18 02:53:56.918887 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:56.918866 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbmtf\" (UniqueName: \"kubernetes.io/projected/e939e98f-077c-4bc7-867b-395379862646-kube-api-access-sbmtf\") pod \"lws-controller-manager-5dd789dc9-m6q4w\" (UID: \"e939e98f-077c-4bc7-867b-395379862646\") " pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w"
Apr 18 02:53:57.045091 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:57.045003 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w" Apr 18 02:53:57.184761 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:57.184730 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w"] Apr 18 02:53:57.188319 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:53:57.188271 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode939e98f_077c_4bc7_867b_395379862646.slice/crio-8d7f244e77c103d7698567cca72462ad30529921e54ed313d9e9170125a1d6dc WatchSource:0}: Error finding container 8d7f244e77c103d7698567cca72462ad30529921e54ed313d9e9170125a1d6dc: Status 404 returned error can't find the container with id 8d7f244e77c103d7698567cca72462ad30529921e54ed313d9e9170125a1d6dc Apr 18 02:53:57.330288 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:53:57.330208 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w" event={"ID":"e939e98f-077c-4bc7-867b-395379862646","Type":"ContainerStarted","Data":"8d7f244e77c103d7698567cca72462ad30529921e54ed313d9e9170125a1d6dc"} Apr 18 02:54:00.340913 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:00.340876 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w" event={"ID":"e939e98f-077c-4bc7-867b-395379862646","Type":"ContainerStarted","Data":"db339a55573ba14214cef00840f5945bad20b5ec1f1e57552abacf36468543df"} Apr 18 02:54:00.341289 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:00.340925 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w" Apr 18 02:54:00.357989 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:00.357933 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w" podStartSLOduration=2.007535179 podStartE2EDuration="4.357914853s" podCreationTimestamp="2026-04-18 02:53:56 +0000 UTC" firstStartedPulling="2026-04-18 02:53:57.190071003 +0000 UTC m=+474.913624877" lastFinishedPulling="2026-04-18 02:53:59.540450658 +0000 UTC m=+477.264004551" observedRunningTime="2026-04-18 02:54:00.356841118 +0000 UTC m=+478.080395012" watchObservedRunningTime="2026-04-18 02:54:00.357914853 +0000 UTC m=+478.081468749" Apr 18 02:54:11.346431 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:11.346400 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5dd789dc9-m6q4w" Apr 18 02:54:16.638841 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.638801 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds"] Apr 18 02:54:16.641289 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.641269 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.643785 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.643763 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 18 02:54:16.643918 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.643901 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-br2rn\"" Apr 18 02:54:16.643992 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.643931 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 18 02:54:16.644344 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.644321 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 18 02:54:16.653524 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.653500 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds"] Apr 18 02:54:16.657851 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.657824 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.657944 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.657867 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.657944 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.657897 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.657944 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.657928 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.658103 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.657969 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.658103 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.658069 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.658189 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.658112 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.658189 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.658136 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs22w\" (UniqueName: \"kubernetes.io/projected/09f6f38f-15fa-44ca-956c-1a8a59a37f30-kube-api-access-zs22w\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.658189 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.658165 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.759403 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.759366 2574 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.759567 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.759410 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.759567 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.759437 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zs22w\" (UniqueName: \"kubernetes.io/projected/09f6f38f-15fa-44ca-956c-1a8a59a37f30-kube-api-access-zs22w\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.759567 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.759458 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.759567 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.759480 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.759567 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.759498 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.759567 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.759522 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.759567 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.759553 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.759954 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.759591 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: 
\"kubernetes.io/downward-api/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.759954 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.759801 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.759954 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.759853 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.759954 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.759944 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.760116 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.759993 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-workload-socket\") pod 
\"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.760234 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.760214 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.762025 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.762000 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.762136 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.762092 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.767244 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.767225 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: 
\"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.767477 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.767457 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs22w\" (UniqueName: \"kubernetes.io/projected/09f6f38f-15fa-44ca-956c-1a8a59a37f30-kube-api-access-zs22w\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:16.953809 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:16.953718 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:17.080067 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:17.080034 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds"] Apr 18 02:54:17.082960 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:54:17.082934 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09f6f38f_15fa_44ca_956c_1a8a59a37f30.slice/crio-2d1366c4c3954f0b2637d61a332c77954fd029ed94f37b96a9c517a41c255893 WatchSource:0}: Error finding container 2d1366c4c3954f0b2637d61a332c77954fd029ed94f37b96a9c517a41c255893: Status 404 returned error can't find the container with id 2d1366c4c3954f0b2637d61a332c77954fd029ed94f37b96a9c517a41c255893 Apr 18 02:54:17.392389 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:17.392354 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" 
event={"ID":"09f6f38f-15fa-44ca-956c-1a8a59a37f30","Type":"ContainerStarted","Data":"2d1366c4c3954f0b2637d61a332c77954fd029ed94f37b96a9c517a41c255893"} Apr 18 02:54:19.484643 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:19.484607 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 18 02:54:19.484890 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:19.484679 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 18 02:54:19.484890 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:19.484706 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 18 02:54:20.403265 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:20.403228 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" event={"ID":"09f6f38f-15fa-44ca-956c-1a8a59a37f30","Type":"ContainerStarted","Data":"2ae63b52c94a9d48eea96aa0e0f3452b93649b5d089cc6a447be9762303b5d6e"} Apr 18 02:54:20.422227 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:20.422176 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" podStartSLOduration=2.022703866 podStartE2EDuration="4.422154145s" podCreationTimestamp="2026-04-18 02:54:16 +0000 UTC" firstStartedPulling="2026-04-18 02:54:17.084816568 +0000 UTC m=+494.808370440" lastFinishedPulling="2026-04-18 02:54:19.484266846 +0000 UTC m=+497.207820719" observedRunningTime="2026-04-18 02:54:20.420441623 +0000 UTC m=+498.143995519" watchObservedRunningTime="2026-04-18 02:54:20.422154145 +0000 UTC 
m=+498.145708043" Apr 18 02:54:20.954753 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:20.954719 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:20.956115 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:20.956087 2574 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.21:15021/healthz/ready\": dial tcp 10.134.0.21:15021: connect: connection refused" start-of-body= Apr 18 02:54:20.956241 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:20.956148 2574 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" podUID="09f6f38f-15fa-44ca-956c-1a8a59a37f30" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.21:15021/healthz/ready\": dial tcp 10.134.0.21:15021: connect: connection refused" Apr 18 02:54:21.955005 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:21.954959 2574 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.21:15021/healthz/ready\": dial tcp 10.134.0.21:15021: connect: connection refused" start-of-body= Apr 18 02:54:21.955495 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:21.955025 2574 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" podUID="09f6f38f-15fa-44ca-956c-1a8a59a37f30" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.21:15021/healthz/ready\": dial tcp 10.134.0.21:15021: connect: connection refused" Apr 18 02:54:21.996854 ip-10-0-129-229 kubenswrapper[2574]: I0418 
02:54:21.996821 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds"] Apr 18 02:54:22.954090 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:22.954054 2574 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.21:15021/healthz/ready\": dial tcp 10.134.0.21:15021: connect: connection refused" start-of-body= Apr 18 02:54:22.954277 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:22.954115 2574 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" podUID="09f6f38f-15fa-44ca-956c-1a8a59a37f30" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.21:15021/healthz/ready\": dial tcp 10.134.0.21:15021: connect: connection refused" Apr 18 02:54:23.412075 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:23.412005 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" podUID="09f6f38f-15fa-44ca-956c-1a8a59a37f30" containerName="istio-proxy" containerID="cri-o://2ae63b52c94a9d48eea96aa0e0f3452b93649b5d089cc6a447be9762303b5d6e" gracePeriod=30 Apr 18 02:54:28.647878 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.647853 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" Apr 18 02:54:28.763724 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.763674 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-credential-socket\") pod \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " Apr 18 02:54:28.763724 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.763730 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs22w\" (UniqueName: \"kubernetes.io/projected/09f6f38f-15fa-44ca-956c-1a8a59a37f30-kube-api-access-zs22w\") pod \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " Apr 18 02:54:28.763986 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.763765 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-data\") pod \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " Apr 18 02:54:28.763986 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.763787 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istiod-ca-cert\") pod \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " Apr 18 02:54:28.763986 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.763817 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-podinfo\") pod \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") " Apr 18 
02:54:28.763986 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.763846 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-token\") pod \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") "
Apr 18 02:54:28.763986 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.763876 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-workload-certs\") pod \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") "
Apr 18 02:54:28.763986 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.763942 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-workload-socket\") pod \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") "
Apr 18 02:54:28.763986 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.763980 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-envoy\") pod \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\" (UID: \"09f6f38f-15fa-44ca-956c-1a8a59a37f30\") "
Apr 18 02:54:28.764358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.763989 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-credential-socket" (OuterVolumeSpecName: "credential-socket") pod "09f6f38f-15fa-44ca-956c-1a8a59a37f30" (UID: "09f6f38f-15fa-44ca-956c-1a8a59a37f30"). InnerVolumeSpecName "credential-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:54:28.764358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.764089 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-data" (OuterVolumeSpecName: "istio-data") pod "09f6f38f-15fa-44ca-956c-1a8a59a37f30" (UID: "09f6f38f-15fa-44ca-956c-1a8a59a37f30"). InnerVolumeSpecName "istio-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:54:28.764358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.764205 2574 reconciler_common.go:299] "Volume detached for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-credential-socket\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:54:28.764358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.764225 2574 reconciler_common.go:299] "Volume detached for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-data\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:54:28.764358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.764241 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-workload-certs" (OuterVolumeSpecName: "workload-certs") pod "09f6f38f-15fa-44ca-956c-1a8a59a37f30" (UID: "09f6f38f-15fa-44ca-956c-1a8a59a37f30"). InnerVolumeSpecName "workload-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:54:28.764358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.764267 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istiod-ca-cert" (OuterVolumeSpecName: "istiod-ca-cert") pod "09f6f38f-15fa-44ca-956c-1a8a59a37f30" (UID: "09f6f38f-15fa-44ca-956c-1a8a59a37f30"). InnerVolumeSpecName "istiod-ca-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 18 02:54:28.764630 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.764372 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-workload-socket" (OuterVolumeSpecName: "workload-socket") pod "09f6f38f-15fa-44ca-956c-1a8a59a37f30" (UID: "09f6f38f-15fa-44ca-956c-1a8a59a37f30"). InnerVolumeSpecName "workload-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:54:28.766292 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.766264 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09f6f38f-15fa-44ca-956c-1a8a59a37f30-kube-api-access-zs22w" (OuterVolumeSpecName: "kube-api-access-zs22w") pod "09f6f38f-15fa-44ca-956c-1a8a59a37f30" (UID: "09f6f38f-15fa-44ca-956c-1a8a59a37f30"). InnerVolumeSpecName "kube-api-access-zs22w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 18 02:54:28.766415 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.766316 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-podinfo" (OuterVolumeSpecName: "istio-podinfo") pod "09f6f38f-15fa-44ca-956c-1a8a59a37f30" (UID: "09f6f38f-15fa-44ca-956c-1a8a59a37f30"). InnerVolumeSpecName "istio-podinfo". PluginName "kubernetes.io/downward-api", VolumeGIDValue ""
Apr 18 02:54:28.766415 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.766363 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-token" (OuterVolumeSpecName: "istio-token") pod "09f6f38f-15fa-44ca-956c-1a8a59a37f30" (UID: "09f6f38f-15fa-44ca-956c-1a8a59a37f30"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 18 02:54:28.766415 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.766366 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-envoy" (OuterVolumeSpecName: "istio-envoy") pod "09f6f38f-15fa-44ca-956c-1a8a59a37f30" (UID: "09f6f38f-15fa-44ca-956c-1a8a59a37f30"). InnerVolumeSpecName "istio-envoy". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 02:54:28.864948 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.864896 2574 reconciler_common.go:299] "Volume detached for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-envoy\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:54:28.864948 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.864945 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zs22w\" (UniqueName: \"kubernetes.io/projected/09f6f38f-15fa-44ca-956c-1a8a59a37f30-kube-api-access-zs22w\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:54:28.865164 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.864962 2574 reconciler_common.go:299] "Volume detached for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istiod-ca-cert\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:54:28.865164 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.864979 2574 reconciler_common.go:299] "Volume detached for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-podinfo\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:54:28.865164 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.864990 2574 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/09f6f38f-15fa-44ca-956c-1a8a59a37f30-istio-token\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:54:28.865164 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.865004 2574 reconciler_common.go:299] "Volume detached for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-workload-certs\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:54:28.865164 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:28.865015 2574 reconciler_common.go:299] "Volume detached for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/09f6f38f-15fa-44ca-956c-1a8a59a37f30-workload-socket\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:54:29.431536 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:29.431504 2574 generic.go:358] "Generic (PLEG): container finished" podID="09f6f38f-15fa-44ca-956c-1a8a59a37f30" containerID="2ae63b52c94a9d48eea96aa0e0f3452b93649b5d089cc6a447be9762303b5d6e" exitCode=0
Apr 18 02:54:29.431691 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:29.431553 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" event={"ID":"09f6f38f-15fa-44ca-956c-1a8a59a37f30","Type":"ContainerDied","Data":"2ae63b52c94a9d48eea96aa0e0f3452b93649b5d089cc6a447be9762303b5d6e"}
Apr 18 02:54:29.431691 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:29.431567 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds"
Apr 18 02:54:29.431691 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:29.431577 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds" event={"ID":"09f6f38f-15fa-44ca-956c-1a8a59a37f30","Type":"ContainerDied","Data":"2d1366c4c3954f0b2637d61a332c77954fd029ed94f37b96a9c517a41c255893"}
Apr 18 02:54:29.431691 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:29.431595 2574 scope.go:117] "RemoveContainer" containerID="2ae63b52c94a9d48eea96aa0e0f3452b93649b5d089cc6a447be9762303b5d6e"
Apr 18 02:54:29.440293 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:29.440271 2574 scope.go:117] "RemoveContainer" containerID="2ae63b52c94a9d48eea96aa0e0f3452b93649b5d089cc6a447be9762303b5d6e"
Apr 18 02:54:29.440549 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:54:29.440530 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ae63b52c94a9d48eea96aa0e0f3452b93649b5d089cc6a447be9762303b5d6e\": container with ID starting with 2ae63b52c94a9d48eea96aa0e0f3452b93649b5d089cc6a447be9762303b5d6e not found: ID does not exist" containerID="2ae63b52c94a9d48eea96aa0e0f3452b93649b5d089cc6a447be9762303b5d6e"
Apr 18 02:54:29.440611 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:29.440555 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae63b52c94a9d48eea96aa0e0f3452b93649b5d089cc6a447be9762303b5d6e"} err="failed to get container status \"2ae63b52c94a9d48eea96aa0e0f3452b93649b5d089cc6a447be9762303b5d6e\": rpc error: code = NotFound desc = could not find container \"2ae63b52c94a9d48eea96aa0e0f3452b93649b5d089cc6a447be9762303b5d6e\": container with ID starting with 2ae63b52c94a9d48eea96aa0e0f3452b93649b5d089cc6a447be9762303b5d6e not found: ID does not exist"
Apr 18 02:54:29.448456 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:29.448432 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds"]
Apr 18 02:54:29.455341 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:29.452459 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77zb8ds"]
Apr 18 02:54:30.881556 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:30.881521 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09f6f38f-15fa-44ca-956c-1a8a59a37f30" path="/var/lib/kubelet/pods/09f6f38f-15fa-44ca-956c-1a8a59a37f30/volumes"
Apr 18 02:54:52.503900 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:52.503825 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bttkd"]
Apr 18 02:54:52.504358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:52.504145 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09f6f38f-15fa-44ca-956c-1a8a59a37f30" containerName="istio-proxy"
Apr 18 02:54:52.504358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:52.504156 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f6f38f-15fa-44ca-956c-1a8a59a37f30" containerName="istio-proxy"
Apr 18 02:54:52.504358 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:52.504214 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="09f6f38f-15fa-44ca-956c-1a8a59a37f30" containerName="istio-proxy"
Apr 18 02:54:52.508723 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:52.508699 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bttkd"
Apr 18 02:54:52.511139 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:52.511108 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 18 02:54:52.511283 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:52.511181 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 18 02:54:52.512020 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:52.512004 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-txzkk\""
Apr 18 02:54:52.515440 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:52.515421 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bttkd"]
Apr 18 02:54:52.663015 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:52.662957 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t64b4\" (UniqueName: \"kubernetes.io/projected/6eb607e8-8d41-4bad-b84d-7de0da7c83c3-kube-api-access-t64b4\") pod \"kuadrant-operator-catalog-bttkd\" (UID: \"6eb607e8-8d41-4bad-b84d-7de0da7c83c3\") " pod="kuadrant-system/kuadrant-operator-catalog-bttkd"
Apr 18 02:54:52.763754 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:52.763645 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t64b4\" (UniqueName: \"kubernetes.io/projected/6eb607e8-8d41-4bad-b84d-7de0da7c83c3-kube-api-access-t64b4\") pod \"kuadrant-operator-catalog-bttkd\" (UID: \"6eb607e8-8d41-4bad-b84d-7de0da7c83c3\") " pod="kuadrant-system/kuadrant-operator-catalog-bttkd"
Apr 18 02:54:52.771263 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:52.771241 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t64b4\" (UniqueName: \"kubernetes.io/projected/6eb607e8-8d41-4bad-b84d-7de0da7c83c3-kube-api-access-t64b4\") pod \"kuadrant-operator-catalog-bttkd\" (UID: \"6eb607e8-8d41-4bad-b84d-7de0da7c83c3\") " pod="kuadrant-system/kuadrant-operator-catalog-bttkd"
Apr 18 02:54:52.821369 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:52.821332 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bttkd"
Apr 18 02:54:52.876220 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:52.876163 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bttkd"]
Apr 18 02:54:52.947418 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:52.947374 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bttkd"]
Apr 18 02:54:52.949987 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:54:52.949959 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eb607e8_8d41_4bad_b84d_7de0da7c83c3.slice/crio-1a2b689459942dc2257b7c8b7a46c9fe4585b3216a357be6d9e7de803c0f1904 WatchSource:0}: Error finding container 1a2b689459942dc2257b7c8b7a46c9fe4585b3216a357be6d9e7de803c0f1904: Status 404 returned error can't find the container with id 1a2b689459942dc2257b7c8b7a46c9fe4585b3216a357be6d9e7de803c0f1904
Apr 18 02:54:53.083187 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:53.083109 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pmz6q"]
Apr 18 02:54:53.087687 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:53.087670 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-pmz6q"
Apr 18 02:54:53.091985 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:53.091957 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pmz6q"]
Apr 18 02:54:53.167496 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:53.167457 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vdjr\" (UniqueName: \"kubernetes.io/projected/cfc76574-8ffc-4f6d-8e67-70e5f1404566-kube-api-access-2vdjr\") pod \"kuadrant-operator-catalog-pmz6q\" (UID: \"cfc76574-8ffc-4f6d-8e67-70e5f1404566\") " pod="kuadrant-system/kuadrant-operator-catalog-pmz6q"
Apr 18 02:54:53.268637 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:53.268599 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vdjr\" (UniqueName: \"kubernetes.io/projected/cfc76574-8ffc-4f6d-8e67-70e5f1404566-kube-api-access-2vdjr\") pod \"kuadrant-operator-catalog-pmz6q\" (UID: \"cfc76574-8ffc-4f6d-8e67-70e5f1404566\") " pod="kuadrant-system/kuadrant-operator-catalog-pmz6q"
Apr 18 02:54:53.276459 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:53.276431 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vdjr\" (UniqueName: \"kubernetes.io/projected/cfc76574-8ffc-4f6d-8e67-70e5f1404566-kube-api-access-2vdjr\") pod \"kuadrant-operator-catalog-pmz6q\" (UID: \"cfc76574-8ffc-4f6d-8e67-70e5f1404566\") " pod="kuadrant-system/kuadrant-operator-catalog-pmz6q"
Apr 18 02:54:53.398564 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:53.398465 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-pmz6q"
Apr 18 02:54:53.509512 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:53.509478 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bttkd" event={"ID":"6eb607e8-8d41-4bad-b84d-7de0da7c83c3","Type":"ContainerStarted","Data":"1a2b689459942dc2257b7c8b7a46c9fe4585b3216a357be6d9e7de803c0f1904"}
Apr 18 02:54:53.518079 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:53.518051 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pmz6q"]
Apr 18 02:54:53.566010 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:54:53.565976 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfc76574_8ffc_4f6d_8e67_70e5f1404566.slice/crio-28b870de6786cdc3c6e242b67bf9461828edb81d9e2b9a4321deb3651ca95fd1 WatchSource:0}: Error finding container 28b870de6786cdc3c6e242b67bf9461828edb81d9e2b9a4321deb3651ca95fd1: Status 404 returned error can't find the container with id 28b870de6786cdc3c6e242b67bf9461828edb81d9e2b9a4321deb3651ca95fd1
Apr 18 02:54:54.513552 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:54.513513 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-pmz6q" event={"ID":"cfc76574-8ffc-4f6d-8e67-70e5f1404566","Type":"ContainerStarted","Data":"28b870de6786cdc3c6e242b67bf9461828edb81d9e2b9a4321deb3651ca95fd1"}
Apr 18 02:54:55.523757 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:55.523722 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-pmz6q" event={"ID":"cfc76574-8ffc-4f6d-8e67-70e5f1404566","Type":"ContainerStarted","Data":"ddf221857b82ee39184559a9871575db1bdc8171082a1e0c78dd3678d8b96a83"}
Apr 18 02:54:55.524927 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:55.524906 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bttkd" event={"ID":"6eb607e8-8d41-4bad-b84d-7de0da7c83c3","Type":"ContainerStarted","Data":"83cf5afc5ad0ee3278c13167aef9fb461adf8d0c5c6ebe0725a828b8736f2fe5"}
Apr 18 02:54:55.525034 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:55.525002 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-bttkd" podUID="6eb607e8-8d41-4bad-b84d-7de0da7c83c3" containerName="registry-server" containerID="cri-o://83cf5afc5ad0ee3278c13167aef9fb461adf8d0c5c6ebe0725a828b8736f2fe5" gracePeriod=2
Apr 18 02:54:55.539801 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:55.539750 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-pmz6q" podStartSLOduration=1.175497237 podStartE2EDuration="2.539732723s" podCreationTimestamp="2026-04-18 02:54:53 +0000 UTC" firstStartedPulling="2026-04-18 02:54:53.567276553 +0000 UTC m=+531.290830426" lastFinishedPulling="2026-04-18 02:54:54.931512036 +0000 UTC m=+532.655065912" observedRunningTime="2026-04-18 02:54:55.537181645 +0000 UTC m=+533.260735545" watchObservedRunningTime="2026-04-18 02:54:55.539732723 +0000 UTC m=+533.263286622"
Apr 18 02:54:55.550681 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:55.550636 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-bttkd" podStartSLOduration=1.571246688 podStartE2EDuration="3.55062291s" podCreationTimestamp="2026-04-18 02:54:52 +0000 UTC" firstStartedPulling="2026-04-18 02:54:52.951600096 +0000 UTC m=+530.675153984" lastFinishedPulling="2026-04-18 02:54:54.930976333 +0000 UTC m=+532.654530206" observedRunningTime="2026-04-18 02:54:55.549683967 +0000 UTC m=+533.273237887" watchObservedRunningTime="2026-04-18 02:54:55.55062291 +0000 UTC m=+533.274176814"
Apr 18 02:54:55.760273 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:55.760246 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bttkd"
Apr 18 02:54:55.890253 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:55.890164 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t64b4\" (UniqueName: \"kubernetes.io/projected/6eb607e8-8d41-4bad-b84d-7de0da7c83c3-kube-api-access-t64b4\") pod \"6eb607e8-8d41-4bad-b84d-7de0da7c83c3\" (UID: \"6eb607e8-8d41-4bad-b84d-7de0da7c83c3\") "
Apr 18 02:54:55.892312 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:55.892282 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eb607e8-8d41-4bad-b84d-7de0da7c83c3-kube-api-access-t64b4" (OuterVolumeSpecName: "kube-api-access-t64b4") pod "6eb607e8-8d41-4bad-b84d-7de0da7c83c3" (UID: "6eb607e8-8d41-4bad-b84d-7de0da7c83c3"). InnerVolumeSpecName "kube-api-access-t64b4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 18 02:54:55.991436 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:55.991403 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t64b4\" (UniqueName: \"kubernetes.io/projected/6eb607e8-8d41-4bad-b84d-7de0da7c83c3-kube-api-access-t64b4\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 02:54:56.528675 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:56.528638 2574 generic.go:358] "Generic (PLEG): container finished" podID="6eb607e8-8d41-4bad-b84d-7de0da7c83c3" containerID="83cf5afc5ad0ee3278c13167aef9fb461adf8d0c5c6ebe0725a828b8736f2fe5" exitCode=0
Apr 18 02:54:56.529144 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:56.528701 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bttkd"
Apr 18 02:54:56.529144 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:56.528723 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bttkd" event={"ID":"6eb607e8-8d41-4bad-b84d-7de0da7c83c3","Type":"ContainerDied","Data":"83cf5afc5ad0ee3278c13167aef9fb461adf8d0c5c6ebe0725a828b8736f2fe5"}
Apr 18 02:54:56.529144 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:56.528758 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bttkd" event={"ID":"6eb607e8-8d41-4bad-b84d-7de0da7c83c3","Type":"ContainerDied","Data":"1a2b689459942dc2257b7c8b7a46c9fe4585b3216a357be6d9e7de803c0f1904"}
Apr 18 02:54:56.529144 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:56.528775 2574 scope.go:117] "RemoveContainer" containerID="83cf5afc5ad0ee3278c13167aef9fb461adf8d0c5c6ebe0725a828b8736f2fe5"
Apr 18 02:54:56.537236 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:56.537222 2574 scope.go:117] "RemoveContainer" containerID="83cf5afc5ad0ee3278c13167aef9fb461adf8d0c5c6ebe0725a828b8736f2fe5"
Apr 18 02:54:56.537517 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:54:56.537499 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83cf5afc5ad0ee3278c13167aef9fb461adf8d0c5c6ebe0725a828b8736f2fe5\": container with ID starting with 83cf5afc5ad0ee3278c13167aef9fb461adf8d0c5c6ebe0725a828b8736f2fe5 not found: ID does not exist" containerID="83cf5afc5ad0ee3278c13167aef9fb461adf8d0c5c6ebe0725a828b8736f2fe5"
Apr 18 02:54:56.537583 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:56.537526 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83cf5afc5ad0ee3278c13167aef9fb461adf8d0c5c6ebe0725a828b8736f2fe5"} err="failed to get container status \"83cf5afc5ad0ee3278c13167aef9fb461adf8d0c5c6ebe0725a828b8736f2fe5\": rpc error: code = NotFound desc = could not find container \"83cf5afc5ad0ee3278c13167aef9fb461adf8d0c5c6ebe0725a828b8736f2fe5\": container with ID starting with 83cf5afc5ad0ee3278c13167aef9fb461adf8d0c5c6ebe0725a828b8736f2fe5 not found: ID does not exist"
Apr 18 02:54:56.547904 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:56.547883 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bttkd"]
Apr 18 02:54:56.551793 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:56.551771 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bttkd"]
Apr 18 02:54:56.882932 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:54:56.882855 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eb607e8-8d41-4bad-b84d-7de0da7c83c3" path="/var/lib/kubelet/pods/6eb607e8-8d41-4bad-b84d-7de0da7c83c3/volumes"
Apr 18 02:55:03.398966 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:03.398927 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-pmz6q"
Apr 18 02:55:03.398966 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:03.398975 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-pmz6q"
Apr 18 02:55:03.420617 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:03.420583 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-pmz6q"
Apr 18 02:55:03.573874 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:03.573847 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-pmz6q"
Apr 18 02:55:25.382279 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:25.382243 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk"]
Apr 18 02:55:25.382760 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:25.382607 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6eb607e8-8d41-4bad-b84d-7de0da7c83c3" containerName="registry-server"
Apr 18 02:55:25.382760 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:25.382620 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb607e8-8d41-4bad-b84d-7de0da7c83c3" containerName="registry-server"
Apr 18 02:55:25.382760 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:25.382683 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="6eb607e8-8d41-4bad-b84d-7de0da7c83c3" containerName="registry-server"
Apr 18 02:55:25.386234 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:25.386219 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk"
Apr 18 02:55:25.388778 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:25.388757 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-9rlsg\""
Apr 18 02:55:25.395718 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:25.395693 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk"]
Apr 18 02:55:25.429077 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:25.429047 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c681ed5c-c5f8-409d-9767-ae84b79c707f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8g2rk\" (UID: \"c681ed5c-c5f8-409d-9767-ae84b79c707f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk"
Apr 18 02:55:25.429237 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:25.429104 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkbv7\" (UniqueName: \"kubernetes.io/projected/c681ed5c-c5f8-409d-9767-ae84b79c707f-kube-api-access-mkbv7\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8g2rk\" (UID: \"c681ed5c-c5f8-409d-9767-ae84b79c707f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk"
Apr 18 02:55:25.529739 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:25.529686 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c681ed5c-c5f8-409d-9767-ae84b79c707f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8g2rk\" (UID: \"c681ed5c-c5f8-409d-9767-ae84b79c707f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk"
Apr 18 02:55:25.529922 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:25.529773 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkbv7\" (UniqueName: \"kubernetes.io/projected/c681ed5c-c5f8-409d-9767-ae84b79c707f-kube-api-access-mkbv7\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8g2rk\" (UID: \"c681ed5c-c5f8-409d-9767-ae84b79c707f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk"
Apr 18 02:55:25.530082 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:25.530060 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c681ed5c-c5f8-409d-9767-ae84b79c707f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8g2rk\" (UID: \"c681ed5c-c5f8-409d-9767-ae84b79c707f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk"
Apr 18 02:55:25.542502 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:25.542467 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkbv7\" (UniqueName: \"kubernetes.io/projected/c681ed5c-c5f8-409d-9767-ae84b79c707f-kube-api-access-mkbv7\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8g2rk\" (UID: \"c681ed5c-c5f8-409d-9767-ae84b79c707f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk"
Apr 18 02:55:25.697331 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:25.697224 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk"
Apr 18 02:55:25.816497 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:25.816467 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk"]
Apr 18 02:55:25.819153 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:55:25.819119 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc681ed5c_c5f8_409d_9767_ae84b79c707f.slice/crio-8a631764962793eb4290d6046b87cd4a364904d1d237bca1eaa4dbe8c8ea8d7d WatchSource:0}: Error finding container 8a631764962793eb4290d6046b87cd4a364904d1d237bca1eaa4dbe8c8ea8d7d: Status 404 returned error can't find the container with id 8a631764962793eb4290d6046b87cd4a364904d1d237bca1eaa4dbe8c8ea8d7d
Apr 18 02:55:26.629431 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:26.629390 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk" event={"ID":"c681ed5c-c5f8-409d-9767-ae84b79c707f","Type":"ContainerStarted","Data":"8a631764962793eb4290d6046b87cd4a364904d1d237bca1eaa4dbe8c8ea8d7d"}
Apr 18 02:55:31.648575 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:31.648533 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk" event={"ID":"c681ed5c-c5f8-409d-9767-ae84b79c707f","Type":"ContainerStarted","Data":"f0d32c6e14013ae35420718791597d0302b8042ad977187b39b076f4062361a6"}
Apr 18 02:55:31.648985 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:31.648662 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk"
Apr 18 02:55:31.669562 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:31.669492 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk" podStartSLOduration=1.319689872 podStartE2EDuration="6.669473028s" podCreationTimestamp="2026-04-18 02:55:25 +0000 UTC" firstStartedPulling="2026-04-18 02:55:25.82137294 +0000 UTC m=+563.544926813" lastFinishedPulling="2026-04-18 02:55:31.171156096 +0000 UTC m=+568.894709969" observedRunningTime="2026-04-18 02:55:31.667661877 +0000 UTC m=+569.391215772" watchObservedRunningTime="2026-04-18 02:55:31.669473028 +0000 UTC m=+569.393026925"
Apr 18 02:55:33.003092 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:33.003035 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx"]
Apr 18 02:55:33.005850 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:33.005828 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx"
Apr 18 02:55:33.008375 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:33.008348 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-dtcdq\""
Apr 18 02:55:33.017058 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:33.017035 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx"]
Apr 18 02:55:33.094269 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:33.094231 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctdhb\" (UniqueName: \"kubernetes.io/projected/17489be0-da18-4953-939a-b1b2628b8433-kube-api-access-ctdhb\") pod \"limitador-operator-controller-manager-85c4996f8c-99qwx\" (UID: \"17489be0-da18-4953-939a-b1b2628b8433\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx"
Apr 18 02:55:33.195032 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:33.194990 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctdhb\" (UniqueName: \"kubernetes.io/projected/17489be0-da18-4953-939a-b1b2628b8433-kube-api-access-ctdhb\") pod \"limitador-operator-controller-manager-85c4996f8c-99qwx\" (UID: \"17489be0-da18-4953-939a-b1b2628b8433\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx"
Apr 18 02:55:33.204434 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:33.204406 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctdhb\" (UniqueName: \"kubernetes.io/projected/17489be0-da18-4953-939a-b1b2628b8433-kube-api-access-ctdhb\") pod \"limitador-operator-controller-manager-85c4996f8c-99qwx\" (UID: \"17489be0-da18-4953-939a-b1b2628b8433\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx"
Apr 18 02:55:33.315428 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:33.315332 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx"
Apr 18 02:55:33.440563 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:33.440537 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx"]
Apr 18 02:55:33.443035 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:55:33.443004 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17489be0_da18_4953_939a_b1b2628b8433.slice/crio-2826c8fa1b08b6133b04b90bf0fd9775093e2294a6161ac371ef85ba960f827b WatchSource:0}: Error finding container 2826c8fa1b08b6133b04b90bf0fd9775093e2294a6161ac371ef85ba960f827b: Status 404 returned error can't find the container with id 2826c8fa1b08b6133b04b90bf0fd9775093e2294a6161ac371ef85ba960f827b
Apr 18 02:55:33.657272 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:33.657182 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx" event={"ID":"17489be0-da18-4953-939a-b1b2628b8433","Type":"ContainerStarted","Data":"2826c8fa1b08b6133b04b90bf0fd9775093e2294a6161ac371ef85ba960f827b"}
Apr 18 02:55:35.665251 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:35.665213 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx" event={"ID":"17489be0-da18-4953-939a-b1b2628b8433","Type":"ContainerStarted","Data":"2fb8af438f393787c7c121f6567b1322fb4fa48bdda71f117ba3ec47a3154c54"}
Apr 18 02:55:35.665667 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:35.665335 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx"
Apr 18 02:55:35.681733 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:35.681687 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx" podStartSLOduration=2.109331938 podStartE2EDuration="3.68167318s" podCreationTimestamp="2026-04-18 02:55:32 +0000 UTC" firstStartedPulling="2026-04-18 02:55:33.444975157 +0000 UTC m=+571.168529034" lastFinishedPulling="2026-04-18 02:55:35.017316399 +0000 UTC m=+572.740870276" observedRunningTime="2026-04-18 02:55:35.680191789 +0000 UTC m=+573.403745705" watchObservedRunningTime="2026-04-18 02:55:35.68167318 +0000 UTC m=+573.405227074"
Apr 18 02:55:42.655392 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:42.655354 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk"
Apr 18 02:55:44.345831 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.345796 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk"]
Apr 18 02:55:44.346219 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.346016 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk" podUID="c681ed5c-c5f8-409d-9767-ae84b79c707f" containerName="manager" containerID="cri-o://f0d32c6e14013ae35420718791597d0302b8042ad977187b39b076f4062361a6" gracePeriod=2
Apr 18 02:55:44.350843 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.350818 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk"]
Apr 18 02:55:44.369757 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.369731 2574 kubelet.go:2553] "SyncLoop DELETE" source="api"
pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx"] Apr 18 02:55:44.370138 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.370036 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx" podUID="17489be0-da18-4953-939a-b1b2628b8433" containerName="manager" containerID="cri-o://2fb8af438f393787c7c121f6567b1322fb4fa48bdda71f117ba3ec47a3154c54" gracePeriod=2 Apr 18 02:55:44.371719 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.371674 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg"] Apr 18 02:55:44.372179 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.372129 2574 status_manager.go:895] "Failed to get status for pod" podUID="17489be0-da18-4953-939a-b1b2628b8433" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx" err="pods \"limitador-operator-controller-manager-85c4996f8c-99qwx\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.372315 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.372161 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c681ed5c-c5f8-409d-9767-ae84b79c707f" containerName="manager" Apr 18 02:55:44.372395 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.372335 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c681ed5c-c5f8-409d-9767-ae84b79c707f" containerName="manager" Apr 18 02:55:44.372503 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.372479 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="c681ed5c-c5f8-409d-9767-ae84b79c707f" containerName="manager" Apr 18 02:55:44.374210 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.374178 2574 
status_manager.go:895] "Failed to get status for pod" podUID="c681ed5c-c5f8-409d-9767-ae84b79c707f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8g2rk\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.375992 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.375967 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx" Apr 18 02:55:44.375992 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.375992 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx"] Apr 18 02:55:44.376131 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.376112 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" Apr 18 02:55:44.377823 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.377796 2574 status_manager.go:895] "Failed to get status for pod" podUID="c681ed5c-c5f8-409d-9767-ae84b79c707f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8g2rk\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.380080 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.380051 2574 status_manager.go:895] "Failed to get status for pod" podUID="17489be0-da18-4953-939a-b1b2628b8433" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx" err="pods \"limitador-operator-controller-manager-85c4996f8c-99qwx\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.382211 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.382184 2574 status_manager.go:895] "Failed to get status for pod" podUID="17489be0-da18-4953-939a-b1b2628b8433" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx" err="pods \"limitador-operator-controller-manager-85c4996f8c-99qwx\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.384393 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.384369 2574 status_manager.go:895] "Failed to get status for pod" podUID="c681ed5c-c5f8-409d-9767-ae84b79c707f" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8g2rk\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.384872 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.384848 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg"] Apr 18 02:55:44.391457 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.391437 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j56fz"] Apr 18 02:55:44.391821 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.391802 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17489be0-da18-4953-939a-b1b2628b8433" containerName="manager" Apr 18 02:55:44.391821 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.391821 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="17489be0-da18-4953-939a-b1b2628b8433" containerName="manager" Apr 18 02:55:44.391921 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.391893 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="17489be0-da18-4953-939a-b1b2628b8433" containerName="manager" Apr 18 02:55:44.394683 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.394668 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j56fz" Apr 18 02:55:44.404535 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.404495 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j56fz"] Apr 18 02:55:44.407642 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.407595 2574 status_manager.go:895] "Failed to get status for pod" podUID="c681ed5c-c5f8-409d-9767-ae84b79c707f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8g2rk\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.430793 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.430764 2574 status_manager.go:895] "Failed to get status for pod" podUID="17489be0-da18-4953-939a-b1b2628b8433" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx" err="pods \"limitador-operator-controller-manager-85c4996f8c-99qwx\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.493196 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.493166 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3281ebac-ec5f-4871-9460-5904d82b446c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-s6gxg\" (UID: \"3281ebac-ec5f-4871-9460-5904d82b446c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" Apr 18 02:55:44.493363 ip-10-0-129-229 
kubenswrapper[2574]: I0418 02:55:44.493216 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b5gs\" (UniqueName: \"kubernetes.io/projected/aa80f70b-bc36-4170-ad0f-e28dd15fbe25-kube-api-access-2b5gs\") pod \"limitador-operator-controller-manager-85c4996f8c-j56fz\" (UID: \"aa80f70b-bc36-4170-ad0f-e28dd15fbe25\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j56fz" Apr 18 02:55:44.493443 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.493366 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mxns\" (UniqueName: \"kubernetes.io/projected/3281ebac-ec5f-4871-9460-5904d82b446c-kube-api-access-8mxns\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-s6gxg\" (UID: \"3281ebac-ec5f-4871-9460-5904d82b446c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" Apr 18 02:55:44.578782 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.578758 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9"] Apr 18 02:55:44.581953 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.581936 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9" Apr 18 02:55:44.591312 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.591280 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9"] Apr 18 02:55:44.594674 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.594648 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mxns\" (UniqueName: \"kubernetes.io/projected/3281ebac-ec5f-4871-9460-5904d82b446c-kube-api-access-8mxns\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-s6gxg\" (UID: \"3281ebac-ec5f-4871-9460-5904d82b446c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" Apr 18 02:55:44.594807 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.594731 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3281ebac-ec5f-4871-9460-5904d82b446c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-s6gxg\" (UID: \"3281ebac-ec5f-4871-9460-5904d82b446c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" Apr 18 02:55:44.594807 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.594776 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2b5gs\" (UniqueName: \"kubernetes.io/projected/aa80f70b-bc36-4170-ad0f-e28dd15fbe25-kube-api-access-2b5gs\") pod \"limitador-operator-controller-manager-85c4996f8c-j56fz\" (UID: \"aa80f70b-bc36-4170-ad0f-e28dd15fbe25\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j56fz" Apr 18 02:55:44.595106 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.595086 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/3281ebac-ec5f-4871-9460-5904d82b446c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-s6gxg\" (UID: \"3281ebac-ec5f-4871-9460-5904d82b446c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" Apr 18 02:55:44.603130 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.603056 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mxns\" (UniqueName: \"kubernetes.io/projected/3281ebac-ec5f-4871-9460-5904d82b446c-kube-api-access-8mxns\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-s6gxg\" (UID: \"3281ebac-ec5f-4871-9460-5904d82b446c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" Apr 18 02:55:44.603601 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.603581 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b5gs\" (UniqueName: \"kubernetes.io/projected/aa80f70b-bc36-4170-ad0f-e28dd15fbe25-kube-api-access-2b5gs\") pod \"limitador-operator-controller-manager-85c4996f8c-j56fz\" (UID: \"aa80f70b-bc36-4170-ad0f-e28dd15fbe25\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j56fz" Apr 18 02:55:44.604111 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.604096 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk" Apr 18 02:55:44.608578 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.608550 2574 status_manager.go:895] "Failed to get status for pod" podUID="17489be0-da18-4953-939a-b1b2628b8433" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx" err="pods \"limitador-operator-controller-manager-85c4996f8c-99qwx\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.610440 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.610414 2574 status_manager.go:895] "Failed to get status for pod" podUID="c681ed5c-c5f8-409d-9767-ae84b79c707f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8g2rk\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.612255 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.612237 2574 status_manager.go:895] "Failed to get status for pod" podUID="17489be0-da18-4953-939a-b1b2628b8433" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx" err="pods \"limitador-operator-controller-manager-85c4996f8c-99qwx\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.614258 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.614234 2574 status_manager.go:895] "Failed to get status for pod" podUID="c681ed5c-c5f8-409d-9767-ae84b79c707f" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8g2rk\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.615700 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.615686 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx" Apr 18 02:55:44.617955 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.617935 2574 status_manager.go:895] "Failed to get status for pod" podUID="c681ed5c-c5f8-409d-9767-ae84b79c707f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8g2rk\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.619904 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.619884 2574 status_manager.go:895] "Failed to get status for pod" podUID="17489be0-da18-4953-939a-b1b2628b8433" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx" err="pods \"limitador-operator-controller-manager-85c4996f8c-99qwx\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.695328 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.695284 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkbv7\" (UniqueName: 
\"kubernetes.io/projected/c681ed5c-c5f8-409d-9767-ae84b79c707f-kube-api-access-mkbv7\") pod \"c681ed5c-c5f8-409d-9767-ae84b79c707f\" (UID: \"c681ed5c-c5f8-409d-9767-ae84b79c707f\") " Apr 18 02:55:44.695527 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.695340 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctdhb\" (UniqueName: \"kubernetes.io/projected/17489be0-da18-4953-939a-b1b2628b8433-kube-api-access-ctdhb\") pod \"17489be0-da18-4953-939a-b1b2628b8433\" (UID: \"17489be0-da18-4953-939a-b1b2628b8433\") " Apr 18 02:55:44.695527 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.695458 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c681ed5c-c5f8-409d-9767-ae84b79c707f-extensions-socket-volume\") pod \"c681ed5c-c5f8-409d-9767-ae84b79c707f\" (UID: \"c681ed5c-c5f8-409d-9767-ae84b79c707f\") " Apr 18 02:55:44.695648 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.695625 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9190b471-453a-42f6-a49d-75f83d496d3f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-x6dz9\" (UID: \"9190b471-453a-42f6-a49d-75f83d496d3f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9" Apr 18 02:55:44.695701 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.695685 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgkrp\" (UniqueName: \"kubernetes.io/projected/9190b471-453a-42f6-a49d-75f83d496d3f-kube-api-access-mgkrp\") pod \"kuadrant-operator-controller-manager-55c7f4c975-x6dz9\" (UID: \"9190b471-453a-42f6-a49d-75f83d496d3f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9" Apr 18 02:55:44.696008 
ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.695975 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c681ed5c-c5f8-409d-9767-ae84b79c707f-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "c681ed5c-c5f8-409d-9767-ae84b79c707f" (UID: "c681ed5c-c5f8-409d-9767-ae84b79c707f"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 18 02:55:44.696585 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.696557 2574 generic.go:358] "Generic (PLEG): container finished" podID="c681ed5c-c5f8-409d-9767-ae84b79c707f" containerID="f0d32c6e14013ae35420718791597d0302b8042ad977187b39b076f4062361a6" exitCode=0 Apr 18 02:55:44.696685 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.696610 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk" Apr 18 02:55:44.696685 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.696639 2574 scope.go:117] "RemoveContainer" containerID="f0d32c6e14013ae35420718791597d0302b8042ad977187b39b076f4062361a6" Apr 18 02:55:44.697818 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.697796 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c681ed5c-c5f8-409d-9767-ae84b79c707f-kube-api-access-mkbv7" (OuterVolumeSpecName: "kube-api-access-mkbv7") pod "c681ed5c-c5f8-409d-9767-ae84b79c707f" (UID: "c681ed5c-c5f8-409d-9767-ae84b79c707f"). InnerVolumeSpecName "kube-api-access-mkbv7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:55:44.697890 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.697825 2574 generic.go:358] "Generic (PLEG): container finished" podID="17489be0-da18-4953-939a-b1b2628b8433" containerID="2fb8af438f393787c7c121f6567b1322fb4fa48bdda71f117ba3ec47a3154c54" exitCode=0 Apr 18 02:55:44.697933 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.697825 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17489be0-da18-4953-939a-b1b2628b8433-kube-api-access-ctdhb" (OuterVolumeSpecName: "kube-api-access-ctdhb") pod "17489be0-da18-4953-939a-b1b2628b8433" (UID: "17489be0-da18-4953-939a-b1b2628b8433"). InnerVolumeSpecName "kube-api-access-ctdhb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:55:44.697933 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.697895 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx" Apr 18 02:55:44.698972 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.698949 2574 status_manager.go:895] "Failed to get status for pod" podUID="c681ed5c-c5f8-409d-9767-ae84b79c707f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8g2rk\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.703828 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.703796 2574 status_manager.go:895] "Failed to get status for pod" podUID="17489be0-da18-4953-939a-b1b2628b8433" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx" err="pods \"limitador-operator-controller-manager-85c4996f8c-99qwx\" is forbidden: User 
\"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.705451 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.705435 2574 scope.go:117] "RemoveContainer" containerID="f0d32c6e14013ae35420718791597d0302b8042ad977187b39b076f4062361a6" Apr 18 02:55:44.705766 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:55:44.705737 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0d32c6e14013ae35420718791597d0302b8042ad977187b39b076f4062361a6\": container with ID starting with f0d32c6e14013ae35420718791597d0302b8042ad977187b39b076f4062361a6 not found: ID does not exist" containerID="f0d32c6e14013ae35420718791597d0302b8042ad977187b39b076f4062361a6" Apr 18 02:55:44.705766 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.705759 2574 status_manager.go:895] "Failed to get status for pod" podUID="17489be0-da18-4953-939a-b1b2628b8433" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx" err="pods \"limitador-operator-controller-manager-85c4996f8c-99qwx\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.705901 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.705776 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0d32c6e14013ae35420718791597d0302b8042ad977187b39b076f4062361a6"} err="failed to get container status \"f0d32c6e14013ae35420718791597d0302b8042ad977187b39b076f4062361a6\": rpc error: code = NotFound desc = could not find container \"f0d32c6e14013ae35420718791597d0302b8042ad977187b39b076f4062361a6\": container with ID starting with 
f0d32c6e14013ae35420718791597d0302b8042ad977187b39b076f4062361a6 not found: ID does not exist" Apr 18 02:55:44.705901 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.705798 2574 scope.go:117] "RemoveContainer" containerID="2fb8af438f393787c7c121f6567b1322fb4fa48bdda71f117ba3ec47a3154c54" Apr 18 02:55:44.707670 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.707649 2574 status_manager.go:895] "Failed to get status for pod" podUID="c681ed5c-c5f8-409d-9767-ae84b79c707f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8g2rk\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.709522 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.709502 2574 status_manager.go:895] "Failed to get status for pod" podUID="17489be0-da18-4953-939a-b1b2628b8433" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-99qwx" err="pods \"limitador-operator-controller-manager-85c4996f8c-99qwx\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 02:55:44.711371 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.711343 2574 status_manager.go:895] "Failed to get status for pod" podUID="c681ed5c-c5f8-409d-9767-ae84b79c707f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8g2rk" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8g2rk\" is forbidden: User \"system:node:ip-10-0-129-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-229.ec2.internal' and this object" Apr 18 
02:55:44.712933 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.712914 2574 scope.go:117] "RemoveContainer" containerID="2fb8af438f393787c7c121f6567b1322fb4fa48bdda71f117ba3ec47a3154c54" Apr 18 02:55:44.713188 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:55:44.713172 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fb8af438f393787c7c121f6567b1322fb4fa48bdda71f117ba3ec47a3154c54\": container with ID starting with 2fb8af438f393787c7c121f6567b1322fb4fa48bdda71f117ba3ec47a3154c54 not found: ID does not exist" containerID="2fb8af438f393787c7c121f6567b1322fb4fa48bdda71f117ba3ec47a3154c54" Apr 18 02:55:44.713251 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.713191 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fb8af438f393787c7c121f6567b1322fb4fa48bdda71f117ba3ec47a3154c54"} err="failed to get container status \"2fb8af438f393787c7c121f6567b1322fb4fa48bdda71f117ba3ec47a3154c54\": rpc error: code = NotFound desc = could not find container \"2fb8af438f393787c7c121f6567b1322fb4fa48bdda71f117ba3ec47a3154c54\": container with ID starting with 2fb8af438f393787c7c121f6567b1322fb4fa48bdda71f117ba3ec47a3154c54 not found: ID does not exist" Apr 18 02:55:44.767930 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.767889 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" Apr 18 02:55:44.773704 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.773685 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j56fz" Apr 18 02:55:44.796619 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.796587 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9190b471-453a-42f6-a49d-75f83d496d3f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-x6dz9\" (UID: \"9190b471-453a-42f6-a49d-75f83d496d3f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9" Apr 18 02:55:44.796770 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.796642 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgkrp\" (UniqueName: \"kubernetes.io/projected/9190b471-453a-42f6-a49d-75f83d496d3f-kube-api-access-mgkrp\") pod \"kuadrant-operator-controller-manager-55c7f4c975-x6dz9\" (UID: \"9190b471-453a-42f6-a49d-75f83d496d3f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9" Apr 18 02:55:44.796770 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.796722 2574 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c681ed5c-c5f8-409d-9767-ae84b79c707f-extensions-socket-volume\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\"" Apr 18 02:55:44.796770 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.796733 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mkbv7\" (UniqueName: \"kubernetes.io/projected/c681ed5c-c5f8-409d-9767-ae84b79c707f-kube-api-access-mkbv7\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\"" Apr 18 02:55:44.796770 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.796743 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ctdhb\" (UniqueName: \"kubernetes.io/projected/17489be0-da18-4953-939a-b1b2628b8433-kube-api-access-ctdhb\") on node 
\"ip-10-0-129-229.ec2.internal\" DevicePath \"\"" Apr 18 02:55:44.797048 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.797024 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9190b471-453a-42f6-a49d-75f83d496d3f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-x6dz9\" (UID: \"9190b471-453a-42f6-a49d-75f83d496d3f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9" Apr 18 02:55:44.817167 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.817126 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgkrp\" (UniqueName: \"kubernetes.io/projected/9190b471-453a-42f6-a49d-75f83d496d3f-kube-api-access-mgkrp\") pod \"kuadrant-operator-controller-manager-55c7f4c975-x6dz9\" (UID: \"9190b471-453a-42f6-a49d-75f83d496d3f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9" Apr 18 02:55:44.887086 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.884653 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17489be0-da18-4953-939a-b1b2628b8433" path="/var/lib/kubelet/pods/17489be0-da18-4953-939a-b1b2628b8433/volumes" Apr 18 02:55:44.887086 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.885198 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c681ed5c-c5f8-409d-9767-ae84b79c707f" path="/var/lib/kubelet/pods/c681ed5c-c5f8-409d-9767-ae84b79c707f/volumes" Apr 18 02:55:44.914665 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.914636 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9" Apr 18 02:55:44.939781 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.939758 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg"] Apr 18 02:55:44.943159 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:55:44.943128 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3281ebac_ec5f_4871_9460_5904d82b446c.slice/crio-fc03a79d97f1fc720a5ef75f494447b53e6a2d49bfe24f75baf749bc43fa8704 WatchSource:0}: Error finding container fc03a79d97f1fc720a5ef75f494447b53e6a2d49bfe24f75baf749bc43fa8704: Status 404 returned error can't find the container with id fc03a79d97f1fc720a5ef75f494447b53e6a2d49bfe24f75baf749bc43fa8704 Apr 18 02:55:44.952859 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:44.952836 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j56fz"] Apr 18 02:55:44.957982 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:55:44.957954 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa80f70b_bc36_4170_ad0f_e28dd15fbe25.slice/crio-41f32527a13334bd811fad1d5fcb1d8b81554b68696b81bff4b8a4f49f859cfa WatchSource:0}: Error finding container 41f32527a13334bd811fad1d5fcb1d8b81554b68696b81bff4b8a4f49f859cfa: Status 404 returned error can't find the container with id 41f32527a13334bd811fad1d5fcb1d8b81554b68696b81bff4b8a4f49f859cfa Apr 18 02:55:45.058045 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:45.058020 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9"] Apr 18 02:55:45.060542 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:55:45.060515 2574 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9190b471_453a_42f6_a49d_75f83d496d3f.slice/crio-8542cb80dce4a56a615c76fc15c41e6063e85e446f831a42969192e26e808df5 WatchSource:0}: Error finding container 8542cb80dce4a56a615c76fc15c41e6063e85e446f831a42969192e26e808df5: Status 404 returned error can't find the container with id 8542cb80dce4a56a615c76fc15c41e6063e85e446f831a42969192e26e808df5 Apr 18 02:55:45.702155 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:45.702058 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j56fz" event={"ID":"aa80f70b-bc36-4170-ad0f-e28dd15fbe25","Type":"ContainerStarted","Data":"a1ab55e77f3c7cd5a8e9a8482f218879f1225237538c5647de92249c40f2dd71"} Apr 18 02:55:45.702155 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:45.702103 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j56fz" event={"ID":"aa80f70b-bc36-4170-ad0f-e28dd15fbe25","Type":"ContainerStarted","Data":"41f32527a13334bd811fad1d5fcb1d8b81554b68696b81bff4b8a4f49f859cfa"} Apr 18 02:55:45.702155 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:45.702149 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j56fz" Apr 18 02:55:45.703455 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:45.703429 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" event={"ID":"3281ebac-ec5f-4871-9460-5904d82b446c","Type":"ContainerStarted","Data":"efd66f8a5d707defc63ff0a7f0871742c14d7bd160ba9e3eb133ebdd425cfc6d"} Apr 18 02:55:45.703455 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:45.703457 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" 
event={"ID":"3281ebac-ec5f-4871-9460-5904d82b446c","Type":"ContainerStarted","Data":"fc03a79d97f1fc720a5ef75f494447b53e6a2d49bfe24f75baf749bc43fa8704"} Apr 18 02:55:45.703628 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:45.703486 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" Apr 18 02:55:45.706264 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:45.706244 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9" event={"ID":"9190b471-453a-42f6-a49d-75f83d496d3f","Type":"ContainerStarted","Data":"da9016e47858ed993c14a05f548cb0c6168c6d30ec600de6ea26d0cd359e0203"} Apr 18 02:55:45.706400 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:45.706267 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9" event={"ID":"9190b471-453a-42f6-a49d-75f83d496d3f","Type":"ContainerStarted","Data":"8542cb80dce4a56a615c76fc15c41e6063e85e446f831a42969192e26e808df5"} Apr 18 02:55:45.735819 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:45.735776 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" podStartSLOduration=1.735759789 podStartE2EDuration="1.735759789s" podCreationTimestamp="2026-04-18 02:55:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:55:45.734629449 +0000 UTC m=+583.458183340" watchObservedRunningTime="2026-04-18 02:55:45.735759789 +0000 UTC m=+583.459313685" Apr 18 02:55:45.736140 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:45.736118 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j56fz" podStartSLOduration=1.736112235 
podStartE2EDuration="1.736112235s" podCreationTimestamp="2026-04-18 02:55:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:55:45.717220156 +0000 UTC m=+583.440774054" watchObservedRunningTime="2026-04-18 02:55:45.736112235 +0000 UTC m=+583.459666127" Apr 18 02:55:46.710658 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:46.710615 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9" Apr 18 02:55:56.712875 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:56.712843 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j56fz" Apr 18 02:55:56.713250 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:56.713021 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" Apr 18 02:55:56.729408 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:56.729362 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9" podStartSLOduration=12.729347063 podStartE2EDuration="12.729347063s" podCreationTimestamp="2026-04-18 02:55:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 02:55:45.75628324 +0000 UTC m=+583.479837135" watchObservedRunningTime="2026-04-18 02:55:56.729347063 +0000 UTC m=+594.452900958" Apr 18 02:55:57.716671 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:57.716634 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9" Apr 18 02:55:57.766480 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:57.766447 2574 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg"] Apr 18 02:55:57.766672 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:57.766644 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" podUID="3281ebac-ec5f-4871-9460-5904d82b446c" containerName="manager" containerID="cri-o://efd66f8a5d707defc63ff0a7f0871742c14d7bd160ba9e3eb133ebdd425cfc6d" gracePeriod=10 Apr 18 02:55:58.011069 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:58.011047 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" Apr 18 02:55:58.110781 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:58.110744 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3281ebac-ec5f-4871-9460-5904d82b446c-extensions-socket-volume\") pod \"3281ebac-ec5f-4871-9460-5904d82b446c\" (UID: \"3281ebac-ec5f-4871-9460-5904d82b446c\") " Apr 18 02:55:58.110977 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:58.110846 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mxns\" (UniqueName: \"kubernetes.io/projected/3281ebac-ec5f-4871-9460-5904d82b446c-kube-api-access-8mxns\") pod \"3281ebac-ec5f-4871-9460-5904d82b446c\" (UID: \"3281ebac-ec5f-4871-9460-5904d82b446c\") " Apr 18 02:55:58.111191 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:58.111166 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3281ebac-ec5f-4871-9460-5904d82b446c-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "3281ebac-ec5f-4871-9460-5904d82b446c" (UID: "3281ebac-ec5f-4871-9460-5904d82b446c"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 18 02:55:58.112861 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:58.112839 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3281ebac-ec5f-4871-9460-5904d82b446c-kube-api-access-8mxns" (OuterVolumeSpecName: "kube-api-access-8mxns") pod "3281ebac-ec5f-4871-9460-5904d82b446c" (UID: "3281ebac-ec5f-4871-9460-5904d82b446c"). InnerVolumeSpecName "kube-api-access-8mxns". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 02:55:58.211445 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:58.211405 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8mxns\" (UniqueName: \"kubernetes.io/projected/3281ebac-ec5f-4871-9460-5904d82b446c-kube-api-access-8mxns\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\"" Apr 18 02:55:58.211445 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:58.211438 2574 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3281ebac-ec5f-4871-9460-5904d82b446c-extensions-socket-volume\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\"" Apr 18 02:55:58.752394 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:58.752357 2574 generic.go:358] "Generic (PLEG): container finished" podID="3281ebac-ec5f-4871-9460-5904d82b446c" containerID="efd66f8a5d707defc63ff0a7f0871742c14d7bd160ba9e3eb133ebdd425cfc6d" exitCode=0 Apr 18 02:55:58.752770 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:58.752424 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" event={"ID":"3281ebac-ec5f-4871-9460-5904d82b446c","Type":"ContainerDied","Data":"efd66f8a5d707defc63ff0a7f0871742c14d7bd160ba9e3eb133ebdd425cfc6d"} Apr 18 02:55:58.752770 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:58.752457 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" event={"ID":"3281ebac-ec5f-4871-9460-5904d82b446c","Type":"ContainerDied","Data":"fc03a79d97f1fc720a5ef75f494447b53e6a2d49bfe24f75baf749bc43fa8704"} Apr 18 02:55:58.752770 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:58.752458 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg" Apr 18 02:55:58.752770 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:58.752472 2574 scope.go:117] "RemoveContainer" containerID="efd66f8a5d707defc63ff0a7f0871742c14d7bd160ba9e3eb133ebdd425cfc6d" Apr 18 02:55:58.760865 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:58.760847 2574 scope.go:117] "RemoveContainer" containerID="efd66f8a5d707defc63ff0a7f0871742c14d7bd160ba9e3eb133ebdd425cfc6d" Apr 18 02:55:58.761110 ip-10-0-129-229 kubenswrapper[2574]: E0418 02:55:58.761090 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efd66f8a5d707defc63ff0a7f0871742c14d7bd160ba9e3eb133ebdd425cfc6d\": container with ID starting with efd66f8a5d707defc63ff0a7f0871742c14d7bd160ba9e3eb133ebdd425cfc6d not found: ID does not exist" containerID="efd66f8a5d707defc63ff0a7f0871742c14d7bd160ba9e3eb133ebdd425cfc6d" Apr 18 02:55:58.761156 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:58.761120 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd66f8a5d707defc63ff0a7f0871742c14d7bd160ba9e3eb133ebdd425cfc6d"} err="failed to get container status \"efd66f8a5d707defc63ff0a7f0871742c14d7bd160ba9e3eb133ebdd425cfc6d\": rpc error: code = NotFound desc = could not find container \"efd66f8a5d707defc63ff0a7f0871742c14d7bd160ba9e3eb133ebdd425cfc6d\": container with ID starting with efd66f8a5d707defc63ff0a7f0871742c14d7bd160ba9e3eb133ebdd425cfc6d not found: ID does not exist" Apr 18 02:55:58.772865 ip-10-0-129-229 
kubenswrapper[2574]: I0418 02:55:58.772840 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg"] Apr 18 02:55:58.776073 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:58.776053 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-s6gxg"] Apr 18 02:55:58.882635 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:55:58.882603 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3281ebac-ec5f-4871-9460-5904d82b446c" path="/var/lib/kubelet/pods/3281ebac-ec5f-4871-9460-5904d82b446c/volumes" Apr 18 02:56:02.803489 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:02.803456 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log" Apr 18 02:56:02.805186 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:02.805157 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log" Apr 18 02:56:18.462813 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:18.462777 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 02:56:18.463206 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:18.463104 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3281ebac-ec5f-4871-9460-5904d82b446c" containerName="manager" Apr 18 02:56:18.463206 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:18.463114 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3281ebac-ec5f-4871-9460-5904d82b446c" containerName="manager" Apr 18 02:56:18.463367 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:18.463229 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3281ebac-ec5f-4871-9460-5904d82b446c" containerName="manager" 
Apr 18 02:56:18.465947 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:18.465930 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-mssnc" Apr 18 02:56:18.468387 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:18.468369 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 18 02:56:18.468493 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:18.468422 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-b57hm\"" Apr 18 02:56:18.473584 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:18.473563 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 02:56:18.498163 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:18.498129 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 02:56:18.588728 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:18.588691 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/3a93b67f-aafb-45b2-a0d9-c65ab405c6e4-config-file\") pod \"limitador-limitador-78c99df468-mssnc\" (UID: \"3a93b67f-aafb-45b2-a0d9-c65ab405c6e4\") " pod="kuadrant-system/limitador-limitador-78c99df468-mssnc" Apr 18 02:56:18.588898 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:18.588752 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fwgr\" (UniqueName: \"kubernetes.io/projected/3a93b67f-aafb-45b2-a0d9-c65ab405c6e4-kube-api-access-6fwgr\") pod \"limitador-limitador-78c99df468-mssnc\" (UID: \"3a93b67f-aafb-45b2-a0d9-c65ab405c6e4\") " pod="kuadrant-system/limitador-limitador-78c99df468-mssnc" Apr 18 02:56:18.689916 ip-10-0-129-229 
kubenswrapper[2574]: I0418 02:56:18.689882 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fwgr\" (UniqueName: \"kubernetes.io/projected/3a93b67f-aafb-45b2-a0d9-c65ab405c6e4-kube-api-access-6fwgr\") pod \"limitador-limitador-78c99df468-mssnc\" (UID: \"3a93b67f-aafb-45b2-a0d9-c65ab405c6e4\") " pod="kuadrant-system/limitador-limitador-78c99df468-mssnc" Apr 18 02:56:18.690096 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:18.689948 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/3a93b67f-aafb-45b2-a0d9-c65ab405c6e4-config-file\") pod \"limitador-limitador-78c99df468-mssnc\" (UID: \"3a93b67f-aafb-45b2-a0d9-c65ab405c6e4\") " pod="kuadrant-system/limitador-limitador-78c99df468-mssnc" Apr 18 02:56:18.690555 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:18.690534 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/3a93b67f-aafb-45b2-a0d9-c65ab405c6e4-config-file\") pod \"limitador-limitador-78c99df468-mssnc\" (UID: \"3a93b67f-aafb-45b2-a0d9-c65ab405c6e4\") " pod="kuadrant-system/limitador-limitador-78c99df468-mssnc" Apr 18 02:56:18.698066 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:18.698036 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fwgr\" (UniqueName: \"kubernetes.io/projected/3a93b67f-aafb-45b2-a0d9-c65ab405c6e4-kube-api-access-6fwgr\") pod \"limitador-limitador-78c99df468-mssnc\" (UID: \"3a93b67f-aafb-45b2-a0d9-c65ab405c6e4\") " pod="kuadrant-system/limitador-limitador-78c99df468-mssnc" Apr 18 02:56:18.776849 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:18.776813 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-mssnc" Apr 18 02:56:18.894145 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:18.894070 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 02:56:18.897036 ip-10-0-129-229 kubenswrapper[2574]: W0418 02:56:18.897010 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a93b67f_aafb_45b2_a0d9_c65ab405c6e4.slice/crio-93ff835f45af217a5ea268f34886da8cc2fdcc73eec64df835ffd76e868b08eb WatchSource:0}: Error finding container 93ff835f45af217a5ea268f34886da8cc2fdcc73eec64df835ffd76e868b08eb: Status 404 returned error can't find the container with id 93ff835f45af217a5ea268f34886da8cc2fdcc73eec64df835ffd76e868b08eb Apr 18 02:56:19.821764 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:19.821726 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-mssnc" event={"ID":"3a93b67f-aafb-45b2-a0d9-c65ab405c6e4","Type":"ContainerStarted","Data":"93ff835f45af217a5ea268f34886da8cc2fdcc73eec64df835ffd76e868b08eb"} Apr 18 02:56:21.829842 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:21.829810 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-mssnc" event={"ID":"3a93b67f-aafb-45b2-a0d9-c65ab405c6e4","Type":"ContainerStarted","Data":"e3799e1fa02d89a27de8cae95c256d3011c1890ae76a8417b1b3e7fb6edc4a58"} Apr 18 02:56:21.830225 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:21.829929 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-mssnc" Apr 18 02:56:21.849372 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:21.849318 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-mssnc" podStartSLOduration=1.397563954 
podStartE2EDuration="3.849287889s" podCreationTimestamp="2026-04-18 02:56:18 +0000 UTC" firstStartedPulling="2026-04-18 02:56:18.898654571 +0000 UTC m=+616.622208444" lastFinishedPulling="2026-04-18 02:56:21.350378502 +0000 UTC m=+619.073932379" observedRunningTime="2026-04-18 02:56:21.847931751 +0000 UTC m=+619.571485646" watchObservedRunningTime="2026-04-18 02:56:21.849287889 +0000 UTC m=+619.572841783" Apr 18 02:56:32.836472 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:32.836436 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-mssnc" Apr 18 02:56:55.295947 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:56:55.295907 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 02:57:39.891990 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:57:39.891899 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 02:57:48.394399 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:57:48.394314 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 02:57:52.686272 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:57:52.686236 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 02:57:56.085867 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:57:56.085829 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 02:58:17.388430 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:58:17.388390 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 02:58:44.399501 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:58:44.399469 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 02:59:25.388850 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:59:25.388754 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 02:59:29.588931 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:59:29.588897 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 02:59:37.287055 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:59:37.287016 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 02:59:47.392525 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:59:47.392484 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 02:59:55.991460 ip-10-0-129-229 kubenswrapper[2574]: I0418 02:59:55.991424 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:00:00.141362 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:00.141322 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29608020-66vtv"] Apr 18 03:00:00.144269 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:00.144251 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29608020-66vtv"
Apr 18 03:00:00.146559 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:00.146538 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-j9kkg\""
Apr 18 03:00:00.149529 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:00.149505 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29608020-66vtv"]
Apr 18 03:00:00.241780 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:00.241741 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc7b5\" (UniqueName: \"kubernetes.io/projected/d9b1c5a7-bbe5-4671-9878-5e76e01166d5-kube-api-access-dc7b5\") pod \"maas-api-key-cleanup-29608020-66vtv\" (UID: \"d9b1c5a7-bbe5-4671-9878-5e76e01166d5\") " pod="opendatahub/maas-api-key-cleanup-29608020-66vtv"
Apr 18 03:00:00.342398 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:00.342364 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dc7b5\" (UniqueName: \"kubernetes.io/projected/d9b1c5a7-bbe5-4671-9878-5e76e01166d5-kube-api-access-dc7b5\") pod \"maas-api-key-cleanup-29608020-66vtv\" (UID: \"d9b1c5a7-bbe5-4671-9878-5e76e01166d5\") " pod="opendatahub/maas-api-key-cleanup-29608020-66vtv"
Apr 18 03:00:00.350663 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:00.350628 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc7b5\" (UniqueName: \"kubernetes.io/projected/d9b1c5a7-bbe5-4671-9878-5e76e01166d5-kube-api-access-dc7b5\") pod \"maas-api-key-cleanup-29608020-66vtv\" (UID: \"d9b1c5a7-bbe5-4671-9878-5e76e01166d5\") " pod="opendatahub/maas-api-key-cleanup-29608020-66vtv"
Apr 18 03:00:00.456089 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:00.455989 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29608020-66vtv"
Apr 18 03:00:00.781617 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:00.781584 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29608020-66vtv"]
Apr 18 03:00:00.784688 ip-10-0-129-229 kubenswrapper[2574]: W0418 03:00:00.784654 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9b1c5a7_bbe5_4671_9878_5e76e01166d5.slice/crio-0a801d481720bc69e9035446150bf932e3b102514131b28f0cc71ae38b356667 WatchSource:0}: Error finding container 0a801d481720bc69e9035446150bf932e3b102514131b28f0cc71ae38b356667: Status 404 returned error can't find the container with id 0a801d481720bc69e9035446150bf932e3b102514131b28f0cc71ae38b356667
Apr 18 03:00:00.786335 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:00.786295 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 18 03:00:01.549393 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:01.549355 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29608020-66vtv" event={"ID":"d9b1c5a7-bbe5-4671-9878-5e76e01166d5","Type":"ContainerStarted","Data":"0a801d481720bc69e9035446150bf932e3b102514131b28f0cc71ae38b356667"}
Apr 18 03:00:04.563636 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:04.563586 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29608020-66vtv" event={"ID":"d9b1c5a7-bbe5-4671-9878-5e76e01166d5","Type":"ContainerStarted","Data":"478f1fbf7bcbe43435f9b3b1058dfd70d7cecd3b3fb1ccc4f9cd9d092fec529a"}
Apr 18 03:00:04.577745 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:04.577701 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29608020-66vtv" podStartSLOduration=1.6084741679999999 podStartE2EDuration="4.577686492s" podCreationTimestamp="2026-04-18 03:00:00 +0000 UTC" firstStartedPulling="2026-04-18 03:00:00.786445121 +0000 UTC m=+838.509998994" lastFinishedPulling="2026-04-18 03:00:03.755657446 +0000 UTC m=+841.479211318" observedRunningTime="2026-04-18 03:00:04.576645555 +0000 UTC m=+842.300199451" watchObservedRunningTime="2026-04-18 03:00:04.577686492 +0000 UTC m=+842.301240386"
Apr 18 03:00:06.591100 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:06.591060 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:00:15.785717 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:15.785677 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:00:24.629095 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:24.629011 2574 generic.go:358] "Generic (PLEG): container finished" podID="d9b1c5a7-bbe5-4671-9878-5e76e01166d5" containerID="478f1fbf7bcbe43435f9b3b1058dfd70d7cecd3b3fb1ccc4f9cd9d092fec529a" exitCode=6
Apr 18 03:00:24.629095 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:24.629076 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29608020-66vtv" event={"ID":"d9b1c5a7-bbe5-4671-9878-5e76e01166d5","Type":"ContainerDied","Data":"478f1fbf7bcbe43435f9b3b1058dfd70d7cecd3b3fb1ccc4f9cd9d092fec529a"}
Apr 18 03:00:24.629546 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:24.629396 2574 scope.go:117] "RemoveContainer" containerID="478f1fbf7bcbe43435f9b3b1058dfd70d7cecd3b3fb1ccc4f9cd9d092fec529a"
Apr 18 03:00:25.633545 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:25.633512 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29608020-66vtv" event={"ID":"d9b1c5a7-bbe5-4671-9878-5e76e01166d5","Type":"ContainerStarted","Data":"9095292332af85f1472de9d1740474d6ec88ff33d21b36de10f5828208910175"}
Apr 18 03:00:26.285105 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:26.285066 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:00:45.705112 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:45.705021 2574 generic.go:358] "Generic (PLEG): container finished" podID="d9b1c5a7-bbe5-4671-9878-5e76e01166d5" containerID="9095292332af85f1472de9d1740474d6ec88ff33d21b36de10f5828208910175" exitCode=6
Apr 18 03:00:45.705112 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:45.705091 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29608020-66vtv" event={"ID":"d9b1c5a7-bbe5-4671-9878-5e76e01166d5","Type":"ContainerDied","Data":"9095292332af85f1472de9d1740474d6ec88ff33d21b36de10f5828208910175"}
Apr 18 03:00:45.705635 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:45.705139 2574 scope.go:117] "RemoveContainer" containerID="478f1fbf7bcbe43435f9b3b1058dfd70d7cecd3b3fb1ccc4f9cd9d092fec529a"
Apr 18 03:00:45.705635 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:45.705468 2574 scope.go:117] "RemoveContainer" containerID="9095292332af85f1472de9d1740474d6ec88ff33d21b36de10f5828208910175"
Apr 18 03:00:45.705944 ip-10-0-129-229 kubenswrapper[2574]: E0418 03:00:45.705684 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29608020-66vtv_opendatahub(d9b1c5a7-bbe5-4671-9878-5e76e01166d5)\"" pod="opendatahub/maas-api-key-cleanup-29608020-66vtv" podUID="d9b1c5a7-bbe5-4671-9878-5e76e01166d5"
Apr 18 03:00:58.877665 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:58.877629 2574 scope.go:117] "RemoveContainer" containerID="9095292332af85f1472de9d1740474d6ec88ff33d21b36de10f5828208910175"
Apr 18 03:00:59.754426 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:59.754391 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29608020-66vtv" event={"ID":"d9b1c5a7-bbe5-4671-9878-5e76e01166d5","Type":"ContainerStarted","Data":"94a0270c66a58d1c4f536cfb6e0f30c64ec99e6089400c4c0b818f85ad1faa39"}
Apr 18 03:00:59.901902 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:00:59.901866 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29608020-66vtv"]
Apr 18 03:01:00.757736 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:00.757698 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29608020-66vtv" podUID="d9b1c5a7-bbe5-4671-9878-5e76e01166d5" containerName="cleanup" containerID="cri-o://94a0270c66a58d1c4f536cfb6e0f30c64ec99e6089400c4c0b818f85ad1faa39" gracePeriod=30
Apr 18 03:01:02.827512 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:02.827477 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log"
Apr 18 03:01:02.830205 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:02.830181 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log"
Apr 18 03:01:19.589363 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:19.589334 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29608020-66vtv"
Apr 18 03:01:19.704162 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:19.704128 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc7b5\" (UniqueName: \"kubernetes.io/projected/d9b1c5a7-bbe5-4671-9878-5e76e01166d5-kube-api-access-dc7b5\") pod \"d9b1c5a7-bbe5-4671-9878-5e76e01166d5\" (UID: \"d9b1c5a7-bbe5-4671-9878-5e76e01166d5\") "
Apr 18 03:01:19.706287 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:19.706256 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b1c5a7-bbe5-4671-9878-5e76e01166d5-kube-api-access-dc7b5" (OuterVolumeSpecName: "kube-api-access-dc7b5") pod "d9b1c5a7-bbe5-4671-9878-5e76e01166d5" (UID: "d9b1c5a7-bbe5-4671-9878-5e76e01166d5"). InnerVolumeSpecName "kube-api-access-dc7b5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 18 03:01:19.805273 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:19.805236 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dc7b5\" (UniqueName: \"kubernetes.io/projected/d9b1c5a7-bbe5-4671-9878-5e76e01166d5-kube-api-access-dc7b5\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 03:01:19.821158 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:19.821126 2574 generic.go:358] "Generic (PLEG): container finished" podID="d9b1c5a7-bbe5-4671-9878-5e76e01166d5" containerID="94a0270c66a58d1c4f536cfb6e0f30c64ec99e6089400c4c0b818f85ad1faa39" exitCode=6
Apr 18 03:01:19.821333 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:19.821186 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29608020-66vtv"
Apr 18 03:01:19.821333 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:19.821212 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29608020-66vtv" event={"ID":"d9b1c5a7-bbe5-4671-9878-5e76e01166d5","Type":"ContainerDied","Data":"94a0270c66a58d1c4f536cfb6e0f30c64ec99e6089400c4c0b818f85ad1faa39"}
Apr 18 03:01:19.821333 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:19.821245 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29608020-66vtv" event={"ID":"d9b1c5a7-bbe5-4671-9878-5e76e01166d5","Type":"ContainerDied","Data":"0a801d481720bc69e9035446150bf932e3b102514131b28f0cc71ae38b356667"}
Apr 18 03:01:19.821333 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:19.821261 2574 scope.go:117] "RemoveContainer" containerID="94a0270c66a58d1c4f536cfb6e0f30c64ec99e6089400c4c0b818f85ad1faa39"
Apr 18 03:01:19.829601 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:19.829585 2574 scope.go:117] "RemoveContainer" containerID="9095292332af85f1472de9d1740474d6ec88ff33d21b36de10f5828208910175"
Apr 18 03:01:19.839234 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:19.839195 2574 scope.go:117] "RemoveContainer" containerID="94a0270c66a58d1c4f536cfb6e0f30c64ec99e6089400c4c0b818f85ad1faa39"
Apr 18 03:01:19.839718 ip-10-0-129-229 kubenswrapper[2574]: E0418 03:01:19.839567 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94a0270c66a58d1c4f536cfb6e0f30c64ec99e6089400c4c0b818f85ad1faa39\": container with ID starting with 94a0270c66a58d1c4f536cfb6e0f30c64ec99e6089400c4c0b818f85ad1faa39 not found: ID does not exist" containerID="94a0270c66a58d1c4f536cfb6e0f30c64ec99e6089400c4c0b818f85ad1faa39"
Apr 18 03:01:19.839718 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:19.839609 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94a0270c66a58d1c4f536cfb6e0f30c64ec99e6089400c4c0b818f85ad1faa39"} err="failed to get container status \"94a0270c66a58d1c4f536cfb6e0f30c64ec99e6089400c4c0b818f85ad1faa39\": rpc error: code = NotFound desc = could not find container \"94a0270c66a58d1c4f536cfb6e0f30c64ec99e6089400c4c0b818f85ad1faa39\": container with ID starting with 94a0270c66a58d1c4f536cfb6e0f30c64ec99e6089400c4c0b818f85ad1faa39 not found: ID does not exist"
Apr 18 03:01:19.839718 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:19.839634 2574 scope.go:117] "RemoveContainer" containerID="9095292332af85f1472de9d1740474d6ec88ff33d21b36de10f5828208910175"
Apr 18 03:01:19.840266 ip-10-0-129-229 kubenswrapper[2574]: E0418 03:01:19.840201 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9095292332af85f1472de9d1740474d6ec88ff33d21b36de10f5828208910175\": container with ID starting with 9095292332af85f1472de9d1740474d6ec88ff33d21b36de10f5828208910175 not found: ID does not exist" containerID="9095292332af85f1472de9d1740474d6ec88ff33d21b36de10f5828208910175"
Apr 18 03:01:19.840266 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:19.840234 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9095292332af85f1472de9d1740474d6ec88ff33d21b36de10f5828208910175"} err="failed to get container status \"9095292332af85f1472de9d1740474d6ec88ff33d21b36de10f5828208910175\": rpc error: code = NotFound desc = could not find container \"9095292332af85f1472de9d1740474d6ec88ff33d21b36de10f5828208910175\": container with ID starting with 9095292332af85f1472de9d1740474d6ec88ff33d21b36de10f5828208910175 not found: ID does not exist"
Apr 18 03:01:19.842166 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:19.842147 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29608020-66vtv"]
Apr 18 03:01:19.845360 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:19.845339 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29608020-66vtv"]
Apr 18 03:01:20.886685 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:20.886654 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b1c5a7-bbe5-4671-9878-5e76e01166d5" path="/var/lib/kubelet/pods/d9b1c5a7-bbe5-4671-9878-5e76e01166d5/volumes"
Apr 18 03:01:26.694451 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:26.694411 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:01:42.189341 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:01:42.189288 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:02:20.491167 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:02:20.491084 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:02:37.689922 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:02:37.689883 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:02:41.889200 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:02:41.889160 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:02:51.188768 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:02:51.188734 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:03:07.689233 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:03:07.689196 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:03:32.992542 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:03:32.992499 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:03:37.084568 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:03:37.084531 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:03:59.790158 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:03:59.790078 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:04:09.288978 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:04:09.288944 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:04:25.386431 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:04:25.386388 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:04:33.989102 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:04:33.989064 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:04:50.389019 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:04:50.388984 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:04:58.989824 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:04:58.989786 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:05:32.191131 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:05:32.191036 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:05:39.686366 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:05:39.686331 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:05:49.088132 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:05:49.088096 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:05:56.987520 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:05:56.987481 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:06:02.853781 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:06:02.853747 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log"
Apr 18 03:06:02.858284 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:06:02.858265 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log"
Apr 18 03:06:05.192151 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:06:05.192109 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:06:22.195266 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:06:22.195232 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:06:33.189056 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:06:33.189020 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:07:20.598748 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:07:20.598715 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:07:28.587012 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:07:28.586974 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:07:37.781484 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:07:37.781446 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:07:45.992010 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:07:45.991968 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:07:54.585508 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:07:54.585468 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:08:03.987218 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:08:03.987181 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:08:12.487337 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:08:12.487244 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:08:20.391903 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:08:20.391863 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:08:29.291324 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:08:29.291282 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:08:38.786362 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:08:38.786325 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:08:47.788011 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:08:47.787977 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:08:55.894547 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:08:55.894510 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:09:04.791407 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:09:04.791371 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:09:12.586532 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:09:12.586497 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:09:22.579944 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:09:22.579904 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:09:30.391752 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:09:30.391717 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:09:40.192881 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:09:40.192844 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:09:47.691604 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:09:47.691518 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:10:39.444853 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:10:39.444813 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9"]
Apr 18 03:10:39.445442 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:10:39.445094 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9" podUID="9190b471-453a-42f6-a49d-75f83d496d3f" containerName="manager" containerID="cri-o://da9016e47858ed993c14a05f548cb0c6168c6d30ec600de6ea26d0cd359e0203" gracePeriod=10
Apr 18 03:10:39.665569 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:10:39.665536 2574 generic.go:358] "Generic (PLEG): container finished" podID="9190b471-453a-42f6-a49d-75f83d496d3f" containerID="da9016e47858ed993c14a05f548cb0c6168c6d30ec600de6ea26d0cd359e0203" exitCode=0
Apr 18 03:10:39.665737 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:10:39.665581 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9" event={"ID":"9190b471-453a-42f6-a49d-75f83d496d3f","Type":"ContainerDied","Data":"da9016e47858ed993c14a05f548cb0c6168c6d30ec600de6ea26d0cd359e0203"}
Apr 18 03:10:39.684604 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:10:39.684582 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9"
Apr 18 03:10:39.794647 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:10:39.794597 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9190b471-453a-42f6-a49d-75f83d496d3f-extensions-socket-volume\") pod \"9190b471-453a-42f6-a49d-75f83d496d3f\" (UID: \"9190b471-453a-42f6-a49d-75f83d496d3f\") "
Apr 18 03:10:39.794859 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:10:39.794688 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgkrp\" (UniqueName: \"kubernetes.io/projected/9190b471-453a-42f6-a49d-75f83d496d3f-kube-api-access-mgkrp\") pod \"9190b471-453a-42f6-a49d-75f83d496d3f\" (UID: \"9190b471-453a-42f6-a49d-75f83d496d3f\") "
Apr 18 03:10:39.795053 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:10:39.795025 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9190b471-453a-42f6-a49d-75f83d496d3f-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "9190b471-453a-42f6-a49d-75f83d496d3f" (UID: "9190b471-453a-42f6-a49d-75f83d496d3f"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 18 03:10:39.796707 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:10:39.796687 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9190b471-453a-42f6-a49d-75f83d496d3f-kube-api-access-mgkrp" (OuterVolumeSpecName: "kube-api-access-mgkrp") pod "9190b471-453a-42f6-a49d-75f83d496d3f" (UID: "9190b471-453a-42f6-a49d-75f83d496d3f"). InnerVolumeSpecName "kube-api-access-mgkrp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 18 03:10:39.896081 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:10:39.896037 2574 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9190b471-453a-42f6-a49d-75f83d496d3f-extensions-socket-volume\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 03:10:39.896081 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:10:39.896073 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mgkrp\" (UniqueName: \"kubernetes.io/projected/9190b471-453a-42f6-a49d-75f83d496d3f-kube-api-access-mgkrp\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\""
Apr 18 03:10:40.669528 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:10:40.669496 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9" event={"ID":"9190b471-453a-42f6-a49d-75f83d496d3f","Type":"ContainerDied","Data":"8542cb80dce4a56a615c76fc15c41e6063e85e446f831a42969192e26e808df5"}
Apr 18 03:10:40.669528 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:10:40.669514 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9"
Apr 18 03:10:40.670001 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:10:40.669538 2574 scope.go:117] "RemoveContainer" containerID="da9016e47858ed993c14a05f548cb0c6168c6d30ec600de6ea26d0cd359e0203"
Apr 18 03:10:40.689680 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:10:40.689653 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9"]
Apr 18 03:10:40.695125 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:10:40.695105 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-x6dz9"]
Apr 18 03:10:40.882622 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:10:40.882591 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9190b471-453a-42f6-a49d-75f83d496d3f" path="/var/lib/kubelet/pods/9190b471-453a-42f6-a49d-75f83d496d3f/volumes"
Apr 18 03:11:02.875732 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:02.875699 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log"
Apr 18 03:11:02.889186 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:02.889159 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log"
Apr 18 03:11:45.545608 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.545520 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-65bbp"]
Apr 18 03:11:45.546084 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.545962 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9b1c5a7-bbe5-4671-9878-5e76e01166d5" containerName="cleanup"
Apr 18 03:11:45.546084 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.545978 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b1c5a7-bbe5-4671-9878-5e76e01166d5" containerName="cleanup"
Apr 18 03:11:45.546084 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.545987 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9b1c5a7-bbe5-4671-9878-5e76e01166d5" containerName="cleanup"
Apr 18 03:11:45.546084 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.545994 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b1c5a7-bbe5-4671-9878-5e76e01166d5" containerName="cleanup"
Apr 18 03:11:45.546084 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.546018 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9190b471-453a-42f6-a49d-75f83d496d3f" containerName="manager"
Apr 18 03:11:45.546084 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.546024 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9190b471-453a-42f6-a49d-75f83d496d3f" containerName="manager"
Apr 18 03:11:45.546084 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.546072 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9b1c5a7-bbe5-4671-9878-5e76e01166d5" containerName="cleanup"
Apr 18 03:11:45.546084 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.546080 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9190b471-453a-42f6-a49d-75f83d496d3f" containerName="manager"
Apr 18 03:11:45.546084 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.546086 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9b1c5a7-bbe5-4671-9878-5e76e01166d5" containerName="cleanup"
Apr 18 03:11:45.546417 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.546095 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9b1c5a7-bbe5-4671-9878-5e76e01166d5" containerName="cleanup"
Apr 18 03:11:45.549055 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.549034 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-65bbp"
Apr 18 03:11:45.551951 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.551933 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-9rlsg\""
Apr 18 03:11:45.560398 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.560375 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-65bbp"]
Apr 18 03:11:45.648241 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.648203 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eadc18c4-d360-4730-ab18-0320e8c326a5-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-65bbp\" (UID: \"eadc18c4-d360-4730-ab18-0320e8c326a5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-65bbp"
Apr 18 03:11:45.648443 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.648254 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snth4\" (UniqueName: \"kubernetes.io/projected/eadc18c4-d360-4730-ab18-0320e8c326a5-kube-api-access-snth4\") pod \"kuadrant-operator-controller-manager-55c7f4c975-65bbp\" (UID: \"eadc18c4-d360-4730-ab18-0320e8c326a5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-65bbp"
Apr 18 03:11:45.749433 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.749376 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eadc18c4-d360-4730-ab18-0320e8c326a5-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-65bbp\" (UID: \"eadc18c4-d360-4730-ab18-0320e8c326a5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-65bbp"
Apr 18 03:11:45.749433 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.749451 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snth4\" (UniqueName: \"kubernetes.io/projected/eadc18c4-d360-4730-ab18-0320e8c326a5-kube-api-access-snth4\") pod \"kuadrant-operator-controller-manager-55c7f4c975-65bbp\" (UID: \"eadc18c4-d360-4730-ab18-0320e8c326a5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-65bbp"
Apr 18 03:11:45.749824 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.749802 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eadc18c4-d360-4730-ab18-0320e8c326a5-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-65bbp\" (UID: \"eadc18c4-d360-4730-ab18-0320e8c326a5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-65bbp"
Apr 18 03:11:45.765731 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.765707 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snth4\" (UniqueName: \"kubernetes.io/projected/eadc18c4-d360-4730-ab18-0320e8c326a5-kube-api-access-snth4\") pod \"kuadrant-operator-controller-manager-55c7f4c975-65bbp\" (UID: \"eadc18c4-d360-4730-ab18-0320e8c326a5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-65bbp"
Apr 18 03:11:45.859150 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.859048 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-65bbp"
Apr 18 03:11:45.980434 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.980408 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-65bbp"]
Apr 18 03:11:45.983030 ip-10-0-129-229 kubenswrapper[2574]: W0418 03:11:45.983001 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeadc18c4_d360_4730_ab18_0320e8c326a5.slice/crio-4b18e815c35f20af2b9e7ca53d39746f161f979ef1083d82235f0d331d92d9c9 WatchSource:0}: Error finding container 4b18e815c35f20af2b9e7ca53d39746f161f979ef1083d82235f0d331d92d9c9: Status 404 returned error can't find the container with id 4b18e815c35f20af2b9e7ca53d39746f161f979ef1083d82235f0d331d92d9c9
Apr 18 03:11:45.985235 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:45.985212 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 18 03:11:46.889885 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:46.889854 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-65bbp" event={"ID":"eadc18c4-d360-4730-ab18-0320e8c326a5","Type":"ContainerStarted","Data":"e6584b66f294b1e636594cc55e55792357d04c2f71d2c6b560bdb1d38e9cf14b"}
Apr 18 03:11:46.889885 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:46.889885 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-65bbp" event={"ID":"eadc18c4-d360-4730-ab18-0320e8c326a5","Type":"ContainerStarted","Data":"4b18e815c35f20af2b9e7ca53d39746f161f979ef1083d82235f0d331d92d9c9"}
Apr 18 03:11:46.890320 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:46.889910 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-65bbp"
Apr 18 03:11:46.909645 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:46.909606 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-65bbp" podStartSLOduration=1.909593656 podStartE2EDuration="1.909593656s" podCreationTimestamp="2026-04-18 03:11:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 03:11:46.906961578 +0000 UTC m=+1544.630515473" watchObservedRunningTime="2026-04-18 03:11:46.909593656 +0000 UTC m=+1544.633147555"
Apr 18 03:11:57.895198 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:11:57.895164 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-65bbp"
Apr 18 03:12:05.497068 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:12:05.497029 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:12:10.688192 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:12:10.688155 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:12:37.387125 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:12:37.387083 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:12:43.897667 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:12:43.897585 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:12:53.087638 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:12:53.087600 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"]
Apr 18 03:13:03.588742 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:13:03.588709 2574 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:13:12.292904 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:13:12.288446 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:13:22.790763 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:13:22.790726 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:13:31.890590 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:13:31.890556 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:13:42.386768 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:13:42.386736 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:13:51.296857 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:13:51.296821 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:14:00.791160 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:14:00.791120 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:14:10.093166 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:14:10.093132 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:14:15.687272 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:14:15.687189 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:14:43.191627 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:14:43.191584 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:15:00.131461 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:00.131429 2574 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29608035-zcsjc"] Apr 18 03:15:00.131867 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:00.131807 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9b1c5a7-bbe5-4671-9878-5e76e01166d5" containerName="cleanup" Apr 18 03:15:00.131867 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:00.131819 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b1c5a7-bbe5-4671-9878-5e76e01166d5" containerName="cleanup" Apr 18 03:15:00.134622 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:00.134606 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" Apr 18 03:15:00.137025 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:00.137007 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-j9kkg\"" Apr 18 03:15:00.151284 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:00.151260 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29608035-zcsjc"] Apr 18 03:15:00.164982 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:00.164953 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6hzp\" (UniqueName: \"kubernetes.io/projected/19e1b8ab-4fce-4213-8514-9638cc879b46-kube-api-access-z6hzp\") pod \"maas-api-key-cleanup-29608035-zcsjc\" (UID: \"19e1b8ab-4fce-4213-8514-9638cc879b46\") " pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" Apr 18 03:15:00.265695 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:00.265658 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6hzp\" (UniqueName: \"kubernetes.io/projected/19e1b8ab-4fce-4213-8514-9638cc879b46-kube-api-access-z6hzp\") pod \"maas-api-key-cleanup-29608035-zcsjc\" (UID: \"19e1b8ab-4fce-4213-8514-9638cc879b46\") " 
pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" Apr 18 03:15:00.273379 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:00.273352 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6hzp\" (UniqueName: \"kubernetes.io/projected/19e1b8ab-4fce-4213-8514-9638cc879b46-kube-api-access-z6hzp\") pod \"maas-api-key-cleanup-29608035-zcsjc\" (UID: \"19e1b8ab-4fce-4213-8514-9638cc879b46\") " pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" Apr 18 03:15:00.444206 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:00.444125 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" Apr 18 03:15:00.770691 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:00.770661 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29608035-zcsjc"] Apr 18 03:15:00.774878 ip-10-0-129-229 kubenswrapper[2574]: W0418 03:15:00.774838 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19e1b8ab_4fce_4213_8514_9638cc879b46.slice/crio-521aa1017141cae668320402d546d428d0b75c5ce2174a070f1de4548b71d59e WatchSource:0}: Error finding container 521aa1017141cae668320402d546d428d0b75c5ce2174a070f1de4548b71d59e: Status 404 returned error can't find the container with id 521aa1017141cae668320402d546d428d0b75c5ce2174a070f1de4548b71d59e Apr 18 03:15:01.533072 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:01.533031 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" event={"ID":"19e1b8ab-4fce-4213-8514-9638cc879b46","Type":"ContainerStarted","Data":"478fc0b6f9282fc6186b0eec18a203430c983ff91bd0b0cf67f03ef5d3470b6c"} Apr 18 03:15:01.533072 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:01.533075 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" 
event={"ID":"19e1b8ab-4fce-4213-8514-9638cc879b46","Type":"ContainerStarted","Data":"521aa1017141cae668320402d546d428d0b75c5ce2174a070f1de4548b71d59e"} Apr 18 03:15:01.547594 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:01.547547 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" podStartSLOduration=1.547535205 podStartE2EDuration="1.547535205s" podCreationTimestamp="2026-04-18 03:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 03:15:01.54627037 +0000 UTC m=+1739.269824262" watchObservedRunningTime="2026-04-18 03:15:01.547535205 +0000 UTC m=+1739.271089099" Apr 18 03:15:21.600355 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:21.600294 2574 generic.go:358] "Generic (PLEG): container finished" podID="19e1b8ab-4fce-4213-8514-9638cc879b46" containerID="478fc0b6f9282fc6186b0eec18a203430c983ff91bd0b0cf67f03ef5d3470b6c" exitCode=6 Apr 18 03:15:21.600804 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:21.600368 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" event={"ID":"19e1b8ab-4fce-4213-8514-9638cc879b46","Type":"ContainerDied","Data":"478fc0b6f9282fc6186b0eec18a203430c983ff91bd0b0cf67f03ef5d3470b6c"} Apr 18 03:15:21.600804 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:21.600697 2574 scope.go:117] "RemoveContainer" containerID="478fc0b6f9282fc6186b0eec18a203430c983ff91bd0b0cf67f03ef5d3470b6c" Apr 18 03:15:22.604628 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:22.604592 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" event={"ID":"19e1b8ab-4fce-4213-8514-9638cc879b46","Type":"ContainerStarted","Data":"993a9345f0847f5fc456a01dd711f5eac27a9d2f70b41022112ecbae464dc927"} Apr 18 03:15:26.501276 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:26.501239 2574 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:15:34.792568 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:34.792530 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:15:42.682357 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:42.682256 2574 generic.go:358] "Generic (PLEG): container finished" podID="19e1b8ab-4fce-4213-8514-9638cc879b46" containerID="993a9345f0847f5fc456a01dd711f5eac27a9d2f70b41022112ecbae464dc927" exitCode=6 Apr 18 03:15:42.682357 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:42.682329 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" event={"ID":"19e1b8ab-4fce-4213-8514-9638cc879b46","Type":"ContainerDied","Data":"993a9345f0847f5fc456a01dd711f5eac27a9d2f70b41022112ecbae464dc927"} Apr 18 03:15:42.682808 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:42.682368 2574 scope.go:117] "RemoveContainer" containerID="478fc0b6f9282fc6186b0eec18a203430c983ff91bd0b0cf67f03ef5d3470b6c" Apr 18 03:15:42.682808 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:42.682709 2574 scope.go:117] "RemoveContainer" containerID="993a9345f0847f5fc456a01dd711f5eac27a9d2f70b41022112ecbae464dc927" Apr 18 03:15:42.682955 ip-10-0-129-229 kubenswrapper[2574]: E0418 03:15:42.682934 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29608035-zcsjc_opendatahub(19e1b8ab-4fce-4213-8514-9638cc879b46)\"" pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" podUID="19e1b8ab-4fce-4213-8514-9638cc879b46" Apr 18 03:15:43.292876 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:43.292843 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:15:51.395333 
ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:51.395283 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:15:52.880272 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:52.880233 2574 scope.go:117] "RemoveContainer" containerID="993a9345f0847f5fc456a01dd711f5eac27a9d2f70b41022112ecbae464dc927" Apr 18 03:15:53.718173 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:53.718140 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" event={"ID":"19e1b8ab-4fce-4213-8514-9638cc879b46","Type":"ContainerStarted","Data":"2e69bc0b4f77020ec36965e8063d3fdb9116c2e7af386a4148cb27a935fedbef"} Apr 18 03:15:53.901938 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:53.901902 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29608035-zcsjc"] Apr 18 03:15:54.722130 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:15:54.722072 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" podUID="19e1b8ab-4fce-4213-8514-9638cc879b46" containerName="cleanup" containerID="cri-o://2e69bc0b4f77020ec36965e8063d3fdb9116c2e7af386a4148cb27a935fedbef" gracePeriod=30 Apr 18 03:16:00.688710 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:00.688673 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:16:02.903789 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:02.903763 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log" Apr 18 03:16:02.914442 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:02.914420 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log" Apr 18 
03:16:13.209734 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:13.209693 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:16:13.661312 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:13.661278 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" Apr 18 03:16:13.777803 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:13.777777 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6hzp\" (UniqueName: \"kubernetes.io/projected/19e1b8ab-4fce-4213-8514-9638cc879b46-kube-api-access-z6hzp\") pod \"19e1b8ab-4fce-4213-8514-9638cc879b46\" (UID: \"19e1b8ab-4fce-4213-8514-9638cc879b46\") " Apr 18 03:16:13.779761 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:13.779732 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19e1b8ab-4fce-4213-8514-9638cc879b46-kube-api-access-z6hzp" (OuterVolumeSpecName: "kube-api-access-z6hzp") pod "19e1b8ab-4fce-4213-8514-9638cc879b46" (UID: "19e1b8ab-4fce-4213-8514-9638cc879b46"). InnerVolumeSpecName "kube-api-access-z6hzp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 18 03:16:13.784953 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:13.784876 2574 generic.go:358] "Generic (PLEG): container finished" podID="19e1b8ab-4fce-4213-8514-9638cc879b46" containerID="2e69bc0b4f77020ec36965e8063d3fdb9116c2e7af386a4148cb27a935fedbef" exitCode=6 Apr 18 03:16:13.785019 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:13.784952 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" event={"ID":"19e1b8ab-4fce-4213-8514-9638cc879b46","Type":"ContainerDied","Data":"2e69bc0b4f77020ec36965e8063d3fdb9116c2e7af386a4148cb27a935fedbef"} Apr 18 03:16:13.785019 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:13.784967 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" Apr 18 03:16:13.785019 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:13.784980 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29608035-zcsjc" event={"ID":"19e1b8ab-4fce-4213-8514-9638cc879b46","Type":"ContainerDied","Data":"521aa1017141cae668320402d546d428d0b75c5ce2174a070f1de4548b71d59e"} Apr 18 03:16:13.785019 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:13.784995 2574 scope.go:117] "RemoveContainer" containerID="2e69bc0b4f77020ec36965e8063d3fdb9116c2e7af386a4148cb27a935fedbef" Apr 18 03:16:13.794431 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:13.794413 2574 scope.go:117] "RemoveContainer" containerID="993a9345f0847f5fc456a01dd711f5eac27a9d2f70b41022112ecbae464dc927" Apr 18 03:16:13.801231 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:13.801213 2574 scope.go:117] "RemoveContainer" containerID="2e69bc0b4f77020ec36965e8063d3fdb9116c2e7af386a4148cb27a935fedbef" Apr 18 03:16:13.801466 ip-10-0-129-229 kubenswrapper[2574]: E0418 03:16:13.801436 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"2e69bc0b4f77020ec36965e8063d3fdb9116c2e7af386a4148cb27a935fedbef\": container with ID starting with 2e69bc0b4f77020ec36965e8063d3fdb9116c2e7af386a4148cb27a935fedbef not found: ID does not exist" containerID="2e69bc0b4f77020ec36965e8063d3fdb9116c2e7af386a4148cb27a935fedbef" Apr 18 03:16:13.801527 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:13.801474 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e69bc0b4f77020ec36965e8063d3fdb9116c2e7af386a4148cb27a935fedbef"} err="failed to get container status \"2e69bc0b4f77020ec36965e8063d3fdb9116c2e7af386a4148cb27a935fedbef\": rpc error: code = NotFound desc = could not find container \"2e69bc0b4f77020ec36965e8063d3fdb9116c2e7af386a4148cb27a935fedbef\": container with ID starting with 2e69bc0b4f77020ec36965e8063d3fdb9116c2e7af386a4148cb27a935fedbef not found: ID does not exist" Apr 18 03:16:13.801527 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:13.801495 2574 scope.go:117] "RemoveContainer" containerID="993a9345f0847f5fc456a01dd711f5eac27a9d2f70b41022112ecbae464dc927" Apr 18 03:16:13.801700 ip-10-0-129-229 kubenswrapper[2574]: E0418 03:16:13.801680 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"993a9345f0847f5fc456a01dd711f5eac27a9d2f70b41022112ecbae464dc927\": container with ID starting with 993a9345f0847f5fc456a01dd711f5eac27a9d2f70b41022112ecbae464dc927 not found: ID does not exist" containerID="993a9345f0847f5fc456a01dd711f5eac27a9d2f70b41022112ecbae464dc927" Apr 18 03:16:13.801745 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:13.801706 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"993a9345f0847f5fc456a01dd711f5eac27a9d2f70b41022112ecbae464dc927"} err="failed to get container status \"993a9345f0847f5fc456a01dd711f5eac27a9d2f70b41022112ecbae464dc927\": rpc error: code = NotFound desc = could 
not find container \"993a9345f0847f5fc456a01dd711f5eac27a9d2f70b41022112ecbae464dc927\": container with ID starting with 993a9345f0847f5fc456a01dd711f5eac27a9d2f70b41022112ecbae464dc927 not found: ID does not exist" Apr 18 03:16:13.806132 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:13.806082 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29608035-zcsjc"] Apr 18 03:16:13.807845 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:13.807824 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29608035-zcsjc"] Apr 18 03:16:13.878409 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:13.878379 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z6hzp\" (UniqueName: \"kubernetes.io/projected/19e1b8ab-4fce-4213-8514-9638cc879b46-kube-api-access-z6hzp\") on node \"ip-10-0-129-229.ec2.internal\" DevicePath \"\"" Apr 18 03:16:14.881755 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:14.881719 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19e1b8ab-4fce-4213-8514-9638cc879b46" path="/var/lib/kubelet/pods/19e1b8ab-4fce-4213-8514-9638cc879b46/volumes" Apr 18 03:16:23.587809 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:23.587777 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:16:29.085887 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:29.085854 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:16:38.789079 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:38.789040 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:16:47.097416 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:47.097375 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 
03:16:55.194943 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:16:55.194907 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:17:05.690328 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:17:05.690267 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:17:23.487807 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:17:23.487730 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:17:31.701700 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:17:31.701663 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:17:41.489703 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:17:41.489665 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:17:49.191289 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:17:49.191250 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:18:05.688087 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:18:05.688046 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:18:14.287896 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:18:14.287851 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:18:23.093702 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:18:23.093664 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:18:31.285472 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:18:31.285428 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:18:40.085722 
ip-10-0-129-229 kubenswrapper[2574]: I0418 03:18:40.085679 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:18:49.484670 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:18:49.484588 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:18:59.590794 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:18:59.590755 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:19:09.288764 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:19:09.288731 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:19:18.093015 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:19:18.092977 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:19:29.093909 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:19:29.093879 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:19:37.988082 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:19:37.988028 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:19:46.093378 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:19:46.093339 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:19:54.991821 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:19:54.991783 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:20:02.889791 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:20:02.889760 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:20:19.788239 ip-10-0-129-229 
kubenswrapper[2574]: I0418 03:20:19.788166 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:20:27.987592 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:20:27.987559 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:20:36.794677 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:20:36.794637 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:20:45.091579 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:20:45.091548 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:21:02.926618 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:02.926586 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log" Apr 18 03:21:02.938372 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:02.938344 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log" Apr 18 03:21:08.092701 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:08.092668 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:21:20.688716 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:20.688676 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mssnc"] Apr 18 03:21:27.557947 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:27.557915 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-b6bf46549-wxnvb_43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc/manager/0.log" Apr 18 03:21:29.504584 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:29.504550 2574 
log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-pmz6q_cfc76574-8ffc-4f6d-8e67-70e5f1404566/registry-server/0.log" Apr 18 03:21:29.626373 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:29.626293 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-65bbp_eadc18c4-d360-4730-ab18-0320e8c326a5/manager/0.log" Apr 18 03:21:29.738787 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:29.738752 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-mssnc_3a93b67f-aafb-45b2-a0d9-c65ab405c6e4/limitador/0.log" Apr 18 03:21:29.857279 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:29.857188 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-j56fz_aa80f70b-bc36-4170-ad0f-e28dd15fbe25/manager/0.log" Apr 18 03:21:35.505551 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.505517 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8bclg/must-gather-4tkwr"] Apr 18 03:21:35.506011 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.505838 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19e1b8ab-4fce-4213-8514-9638cc879b46" containerName="cleanup" Apr 18 03:21:35.506011 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.505848 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e1b8ab-4fce-4213-8514-9638cc879b46" containerName="cleanup" Apr 18 03:21:35.506011 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.505859 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19e1b8ab-4fce-4213-8514-9638cc879b46" containerName="cleanup" Apr 18 03:21:35.506011 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.505865 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e1b8ab-4fce-4213-8514-9638cc879b46" 
containerName="cleanup"
Apr 18 03:21:35.506011 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.505871 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19e1b8ab-4fce-4213-8514-9638cc879b46" containerName="cleanup"
Apr 18 03:21:35.506011 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.505876 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e1b8ab-4fce-4213-8514-9638cc879b46" containerName="cleanup"
Apr 18 03:21:35.506550 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.506523 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="19e1b8ab-4fce-4213-8514-9638cc879b46" containerName="cleanup"
Apr 18 03:21:35.506637 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.506555 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="19e1b8ab-4fce-4213-8514-9638cc879b46" containerName="cleanup"
Apr 18 03:21:35.506950 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.506932 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="19e1b8ab-4fce-4213-8514-9638cc879b46" containerName="cleanup"
Apr 18 03:21:35.510823 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.510798 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8bclg/must-gather-4tkwr"
Apr 18 03:21:35.513531 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.513506 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8bclg\"/\"openshift-service-ca.crt\""
Apr 18 03:21:35.513671 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.513524 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8bclg\"/\"default-dockercfg-mpz9c\""
Apr 18 03:21:35.513671 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.513524 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8bclg\"/\"kube-root-ca.crt\""
Apr 18 03:21:35.521729 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.521707 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8bclg/must-gather-4tkwr"]
Apr 18 03:21:35.652870 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.652839 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th8j4\" (UniqueName: \"kubernetes.io/projected/1629dc4a-5f98-4864-b3d8-1c390b652bc8-kube-api-access-th8j4\") pod \"must-gather-4tkwr\" (UID: \"1629dc4a-5f98-4864-b3d8-1c390b652bc8\") " pod="openshift-must-gather-8bclg/must-gather-4tkwr"
Apr 18 03:21:35.653071 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.652936 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1629dc4a-5f98-4864-b3d8-1c390b652bc8-must-gather-output\") pod \"must-gather-4tkwr\" (UID: \"1629dc4a-5f98-4864-b3d8-1c390b652bc8\") " pod="openshift-must-gather-8bclg/must-gather-4tkwr"
Apr 18 03:21:35.753647 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.753606 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1629dc4a-5f98-4864-b3d8-1c390b652bc8-must-gather-output\") pod \"must-gather-4tkwr\" (UID: \"1629dc4a-5f98-4864-b3d8-1c390b652bc8\") " pod="openshift-must-gather-8bclg/must-gather-4tkwr"
Apr 18 03:21:35.753847 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.753702 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-th8j4\" (UniqueName: \"kubernetes.io/projected/1629dc4a-5f98-4864-b3d8-1c390b652bc8-kube-api-access-th8j4\") pod \"must-gather-4tkwr\" (UID: \"1629dc4a-5f98-4864-b3d8-1c390b652bc8\") " pod="openshift-must-gather-8bclg/must-gather-4tkwr"
Apr 18 03:21:35.753973 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.753950 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1629dc4a-5f98-4864-b3d8-1c390b652bc8-must-gather-output\") pod \"must-gather-4tkwr\" (UID: \"1629dc4a-5f98-4864-b3d8-1c390b652bc8\") " pod="openshift-must-gather-8bclg/must-gather-4tkwr"
Apr 18 03:21:35.761712 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.761654 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-th8j4\" (UniqueName: \"kubernetes.io/projected/1629dc4a-5f98-4864-b3d8-1c390b652bc8-kube-api-access-th8j4\") pod \"must-gather-4tkwr\" (UID: \"1629dc4a-5f98-4864-b3d8-1c390b652bc8\") " pod="openshift-must-gather-8bclg/must-gather-4tkwr"
Apr 18 03:21:35.820546 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.820499 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8bclg/must-gather-4tkwr"
Apr 18 03:21:35.934849 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.934826 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8bclg/must-gather-4tkwr"]
Apr 18 03:21:35.936714 ip-10-0-129-229 kubenswrapper[2574]: W0418 03:21:35.936686 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1629dc4a_5f98_4864_b3d8_1c390b652bc8.slice/crio-74a8fb5685cc38bcc35f6d31f4a0374fb09fd397471bbec6d360eafb22c035ec WatchSource:0}: Error finding container 74a8fb5685cc38bcc35f6d31f4a0374fb09fd397471bbec6d360eafb22c035ec: Status 404 returned error can't find the container with id 74a8fb5685cc38bcc35f6d31f4a0374fb09fd397471bbec6d360eafb22c035ec
Apr 18 03:21:35.938337 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:35.938317 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 18 03:21:36.852796 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:36.852738 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8bclg/must-gather-4tkwr" event={"ID":"1629dc4a-5f98-4864-b3d8-1c390b652bc8","Type":"ContainerStarted","Data":"648ac707fc29df9016b90fbb0bc71f0a6fb207f6cc1b93250343ae9100f2b285"}
Apr 18 03:21:36.852796 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:36.852783 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8bclg/must-gather-4tkwr" event={"ID":"1629dc4a-5f98-4864-b3d8-1c390b652bc8","Type":"ContainerStarted","Data":"74a8fb5685cc38bcc35f6d31f4a0374fb09fd397471bbec6d360eafb22c035ec"}
Apr 18 03:21:37.859678 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:37.859642 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8bclg/must-gather-4tkwr" event={"ID":"1629dc4a-5f98-4864-b3d8-1c390b652bc8","Type":"ContainerStarted","Data":"a715c09cef72f7b3b9f372687c3a86251b394fdb47095f46f92fe7960f3fba83"}
Apr 18 03:21:37.876036 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:37.875978 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8bclg/must-gather-4tkwr" podStartSLOduration=2.137842289 podStartE2EDuration="2.875960062s" podCreationTimestamp="2026-04-18 03:21:35 +0000 UTC" firstStartedPulling="2026-04-18 03:21:35.938499924 +0000 UTC m=+2133.662053812" lastFinishedPulling="2026-04-18 03:21:36.676617701 +0000 UTC m=+2134.400171585" observedRunningTime="2026-04-18 03:21:37.873895016 +0000 UTC m=+2135.597448935" watchObservedRunningTime="2026-04-18 03:21:37.875960062 +0000 UTC m=+2135.599513957"
Apr 18 03:21:38.329041 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:38.329007 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-mnwcq_b1fc92ab-29d4-4bbd-b21d-46be592b4ea0/global-pull-secret-syncer/0.log"
Apr 18 03:21:38.532247 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:38.532214 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nctzt_c2517d1b-5d5f-4341-a8d1-b4646105d5ba/konnectivity-agent/0.log"
Apr 18 03:21:38.575203 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:38.575173 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-229.ec2.internal_1d43c470c7e05ac28f68f1360471b9c3/haproxy/0.log"
Apr 18 03:21:43.199414 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:43.199326 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-pmz6q_cfc76574-8ffc-4f6d-8e67-70e5f1404566/registry-server/0.log"
Apr 18 03:21:43.283323 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:43.282818 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-65bbp_eadc18c4-d360-4730-ab18-0320e8c326a5/manager/0.log"
Apr 18 03:21:43.305837 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:43.305808 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-mssnc_3a93b67f-aafb-45b2-a0d9-c65ab405c6e4/limitador/0.log"
Apr 18 03:21:43.407326 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:43.406506 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-j56fz_aa80f70b-bc36-4170-ad0f-e28dd15fbe25/manager/0.log"
Apr 18 03:21:44.830930 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:44.830893 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b0bb71d9-00ac-459d-ae0e-8903fc82fbac/alertmanager/0.log"
Apr 18 03:21:44.857226 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:44.857195 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b0bb71d9-00ac-459d-ae0e-8903fc82fbac/config-reloader/0.log"
Apr 18 03:21:44.878573 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:44.878532 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b0bb71d9-00ac-459d-ae0e-8903fc82fbac/kube-rbac-proxy-web/0.log"
Apr 18 03:21:44.900534 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:44.900494 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b0bb71d9-00ac-459d-ae0e-8903fc82fbac/kube-rbac-proxy/0.log"
Apr 18 03:21:44.922547 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:44.922515 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b0bb71d9-00ac-459d-ae0e-8903fc82fbac/kube-rbac-proxy-metric/0.log"
Apr 18 03:21:44.941972 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:44.941943 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b0bb71d9-00ac-459d-ae0e-8903fc82fbac/prom-label-proxy/0.log"
Apr 18 03:21:44.966357 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:44.966323 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b0bb71d9-00ac-459d-ae0e-8903fc82fbac/init-config-reloader/0.log"
Apr 18 03:21:45.368036 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:45.368005 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-x8zpl_f444b9d4-dc64-4f2e-af4b-e10735803a6c/node-exporter/0.log"
Apr 18 03:21:45.385437 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:45.385405 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-x8zpl_f444b9d4-dc64-4f2e-af4b-e10735803a6c/kube-rbac-proxy/0.log"
Apr 18 03:21:45.407941 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:45.407894 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-x8zpl_f444b9d4-dc64-4f2e-af4b-e10735803a6c/init-textfile/0.log"
Apr 18 03:21:45.432945 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:45.432910 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-n2swh_10c0ceae-dfce-406f-879e-37341f5509ae/kube-rbac-proxy-main/0.log"
Apr 18 03:21:45.452595 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:45.452518 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-n2swh_10c0ceae-dfce-406f-879e-37341f5509ae/kube-rbac-proxy-self/0.log"
Apr 18 03:21:45.480695 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:45.480663 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-n2swh_10c0ceae-dfce-406f-879e-37341f5509ae/openshift-state-metrics/0.log"
Apr 18 03:21:45.738869 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:45.738836 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5dfccf456c-x8f55_f58fbfc9-fb74-4fef-a934-0dc44a590232/telemeter-client/0.log"
Apr 18 03:21:45.759264 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:45.759230 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5dfccf456c-x8f55_f58fbfc9-fb74-4fef-a934-0dc44a590232/reload/0.log"
Apr 18 03:21:45.792012 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:45.791980 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5dfccf456c-x8f55_f58fbfc9-fb74-4fef-a934-0dc44a590232/kube-rbac-proxy/0.log"
Apr 18 03:21:45.820442 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:45.820413 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-694d7b6dc8-42f9f_1ad73780-435a-4c42-a33f-a16127970a0b/thanos-query/0.log"
Apr 18 03:21:45.850058 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:45.850033 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-694d7b6dc8-42f9f_1ad73780-435a-4c42-a33f-a16127970a0b/kube-rbac-proxy-web/0.log"
Apr 18 03:21:45.871216 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:45.871186 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-694d7b6dc8-42f9f_1ad73780-435a-4c42-a33f-a16127970a0b/kube-rbac-proxy/0.log"
Apr 18 03:21:45.892106 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:45.892067 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-694d7b6dc8-42f9f_1ad73780-435a-4c42-a33f-a16127970a0b/prom-label-proxy/0.log"
Apr 18 03:21:45.911197 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:45.911168 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-694d7b6dc8-42f9f_1ad73780-435a-4c42-a33f-a16127970a0b/kube-rbac-proxy-rules/0.log"
Apr 18 03:21:45.929872 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:45.929831 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-694d7b6dc8-42f9f_1ad73780-435a-4c42-a33f-a16127970a0b/kube-rbac-proxy-metrics/0.log"
Apr 18 03:21:46.525901 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.525867 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"]
Apr 18 03:21:46.529612 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.529591 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:46.536515 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.536486 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"]
Apr 18 03:21:46.653810 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.653775 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83a21063-3a47-457b-9bef-53d5faef0061-lib-modules\") pod \"perf-node-gather-daemonset-4tfl9\" (UID: \"83a21063-3a47-457b-9bef-53d5faef0061\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:46.654011 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.653841 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/83a21063-3a47-457b-9bef-53d5faef0061-podres\") pod \"perf-node-gather-daemonset-4tfl9\" (UID: \"83a21063-3a47-457b-9bef-53d5faef0061\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:46.654011 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.653903 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83a21063-3a47-457b-9bef-53d5faef0061-sys\") pod \"perf-node-gather-daemonset-4tfl9\" (UID: \"83a21063-3a47-457b-9bef-53d5faef0061\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:46.654011 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.653939 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/83a21063-3a47-457b-9bef-53d5faef0061-proc\") pod \"perf-node-gather-daemonset-4tfl9\" (UID: \"83a21063-3a47-457b-9bef-53d5faef0061\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:46.654011 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.653979 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwrnz\" (UniqueName: \"kubernetes.io/projected/83a21063-3a47-457b-9bef-53d5faef0061-kube-api-access-gwrnz\") pod \"perf-node-gather-daemonset-4tfl9\" (UID: \"83a21063-3a47-457b-9bef-53d5faef0061\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:46.754764 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.754726 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/83a21063-3a47-457b-9bef-53d5faef0061-proc\") pod \"perf-node-gather-daemonset-4tfl9\" (UID: \"83a21063-3a47-457b-9bef-53d5faef0061\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:46.754764 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.754779 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwrnz\" (UniqueName: \"kubernetes.io/projected/83a21063-3a47-457b-9bef-53d5faef0061-kube-api-access-gwrnz\") pod \"perf-node-gather-daemonset-4tfl9\" (UID: \"83a21063-3a47-457b-9bef-53d5faef0061\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:46.754979 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.754839 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83a21063-3a47-457b-9bef-53d5faef0061-lib-modules\") pod \"perf-node-gather-daemonset-4tfl9\" (UID: \"83a21063-3a47-457b-9bef-53d5faef0061\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:46.754979 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.754859 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/83a21063-3a47-457b-9bef-53d5faef0061-proc\") pod \"perf-node-gather-daemonset-4tfl9\" (UID: \"83a21063-3a47-457b-9bef-53d5faef0061\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:46.754979 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.754912 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/83a21063-3a47-457b-9bef-53d5faef0061-podres\") pod \"perf-node-gather-daemonset-4tfl9\" (UID: \"83a21063-3a47-457b-9bef-53d5faef0061\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:46.754979 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.754935 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83a21063-3a47-457b-9bef-53d5faef0061-sys\") pod \"perf-node-gather-daemonset-4tfl9\" (UID: \"83a21063-3a47-457b-9bef-53d5faef0061\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:46.755141 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.754991 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83a21063-3a47-457b-9bef-53d5faef0061-lib-modules\") pod \"perf-node-gather-daemonset-4tfl9\" (UID: \"83a21063-3a47-457b-9bef-53d5faef0061\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:46.755141 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.755006 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83a21063-3a47-457b-9bef-53d5faef0061-sys\") pod \"perf-node-gather-daemonset-4tfl9\" (UID: \"83a21063-3a47-457b-9bef-53d5faef0061\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:46.755141 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.755064 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/83a21063-3a47-457b-9bef-53d5faef0061-podres\") pod \"perf-node-gather-daemonset-4tfl9\" (UID: \"83a21063-3a47-457b-9bef-53d5faef0061\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:46.763168 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.763139 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwrnz\" (UniqueName: \"kubernetes.io/projected/83a21063-3a47-457b-9bef-53d5faef0061-kube-api-access-gwrnz\") pod \"perf-node-gather-daemonset-4tfl9\" (UID: \"83a21063-3a47-457b-9bef-53d5faef0061\") " pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:46.842393 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:46.842283 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:47.000748 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:47.000715 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"]
Apr 18 03:21:47.905023 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:47.904986 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9" event={"ID":"83a21063-3a47-457b-9bef-53d5faef0061","Type":"ContainerStarted","Data":"fb758ca3a65bc139931214abbcabbb946fb047cbbd19211aea1e5c5d67c8acfd"}
Apr 18 03:21:47.905023 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:47.905022 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9" event={"ID":"83a21063-3a47-457b-9bef-53d5faef0061","Type":"ContainerStarted","Data":"4851cb232fbef1dd514d75bff56c99a454f16213b658d03c892cb31338af4013"}
Apr 18 03:21:47.905257 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:47.905140 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:47.919991 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:47.919934 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9" podStartSLOduration=1.919920999 podStartE2EDuration="1.919920999s" podCreationTimestamp="2026-04-18 03:21:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-18 03:21:47.919003797 +0000 UTC m=+2145.642557684" watchObservedRunningTime="2026-04-18 03:21:47.919920999 +0000 UTC m=+2145.643474952"
Apr 18 03:21:47.966763 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:47.966726 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-768kk_3ac514cd-745b-418c-99dc-a1392887694c/download-server/0.log"
Apr 18 03:21:49.159330 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:49.159280 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jmr6g_cf5032f8-0827-4cc5-8381-d39ca8db84ee/dns/0.log"
Apr 18 03:21:49.177189 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:49.177168 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jmr6g_cf5032f8-0827-4cc5-8381-d39ca8db84ee/kube-rbac-proxy/0.log"
Apr 18 03:21:49.262860 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:49.262830 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jrk2r_8976a474-462c-4893-ac54-7572b4e92f46/dns-node-resolver/0.log"
Apr 18 03:21:49.728728 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:49.728697 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4wx6b_df7f443e-28b4-49bc-ad27-c0360b16827c/node-ca/0.log"
Apr 18 03:21:51.234738 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:51.234705 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-59c6n_3be6e6c6-5134-4428-888c-4efe46336918/serve-healthcheck-canary/0.log"
Apr 18 03:21:51.766696 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:51.766666 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dgzpp_0881f3b0-2ddd-4bb1-8847-3636e2d0c097/kube-rbac-proxy/0.log"
Apr 18 03:21:51.784234 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:51.784208 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dgzpp_0881f3b0-2ddd-4bb1-8847-3636e2d0c097/exporter/0.log"
Apr 18 03:21:51.803442 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:51.803416 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dgzpp_0881f3b0-2ddd-4bb1-8847-3636e2d0c097/extractor/0.log"
Apr 18 03:21:53.812216 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:53.812184 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-b6bf46549-wxnvb_43a43cbc-36cb-4386-a2a6-c1d3b65dc1cc/manager/0.log"
Apr 18 03:21:53.921317 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:53.921270 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8bclg/perf-node-gather-daemonset-4tfl9"
Apr 18 03:21:55.100857 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:21:55.100809 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5dd789dc9-m6q4w_e939e98f-077c-4bc7-867b-395379862646/manager/0.log"
Apr 18 03:22:00.646254 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:00.646218 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4xn6_ba7fee3b-25c5-45b5-93bd-fe87ba08395f/kube-multus-additional-cni-plugins/0.log"
Apr 18 03:22:00.665066 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:00.665039 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4xn6_ba7fee3b-25c5-45b5-93bd-fe87ba08395f/egress-router-binary-copy/0.log"
Apr 18 03:22:00.682854 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:00.682838 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4xn6_ba7fee3b-25c5-45b5-93bd-fe87ba08395f/cni-plugins/0.log"
Apr 18 03:22:00.700789 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:00.700768 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4xn6_ba7fee3b-25c5-45b5-93bd-fe87ba08395f/bond-cni-plugin/0.log"
Apr 18 03:22:00.721054 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:00.721031 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4xn6_ba7fee3b-25c5-45b5-93bd-fe87ba08395f/routeoverride-cni/0.log"
Apr 18 03:22:00.747495 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:00.747475 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4xn6_ba7fee3b-25c5-45b5-93bd-fe87ba08395f/whereabouts-cni-bincopy/0.log"
Apr 18 03:22:00.766092 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:00.766072 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4xn6_ba7fee3b-25c5-45b5-93bd-fe87ba08395f/whereabouts-cni/0.log"
Apr 18 03:22:01.095114 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:01.095078 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dbmjj_841058db-3583-4d11-853d-8a1d444e8ea6/kube-multus/0.log"
Apr 18 03:22:01.156485 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:01.156451 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-c6w8h_eaf422fa-fd33-491a-b182-991116468c18/network-metrics-daemon/0.log"
Apr 18 03:22:01.172538 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:01.172514 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-c6w8h_eaf422fa-fd33-491a-b182-991116468c18/kube-rbac-proxy/0.log"
Apr 18 03:22:02.269648 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:02.269617 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-controller/0.log"
Apr 18 03:22:02.284009 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:02.283982 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/0.log"
Apr 18 03:22:02.303271 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:02.303245 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovn-acl-logging/1.log"
Apr 18 03:22:02.323765 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:02.323737 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/kube-rbac-proxy-node/0.log"
Apr 18 03:22:02.345160 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:02.345118 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/kube-rbac-proxy-ovn-metrics/0.log"
Apr 18 03:22:02.360195 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:02.360168 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/northd/0.log"
Apr 18 03:22:02.377753 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:02.377722 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/nbdb/0.log"
Apr 18 03:22:02.395681 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:02.395661 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/sbdb/0.log"
Apr 18 03:22:02.565026 ip-10-0-129-229 kubenswrapper[2574]: I0418 03:22:02.564955 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8fsv_d159849f-3b4d-45b7-8f49-9f9f11d96088/ovnkube-controller/0.log"