Apr 16 15:09:12.085792 ip-10-0-129-254 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 15:09:12.085804 ip-10-0-129-254 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 15:09:12.085811 ip-10-0-129-254 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 15:09:12.086066 ip-10-0-129-254 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 15:09:23.232277 ip-10-0-129-254 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 15:09:23.232293 ip-10-0-129-254 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 0d5bd66297864dfca8c8c30a9952a895 --
Apr 16 15:11:46.434689 ip-10-0-129-254 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 15:11:46.893550 ip-10-0-129-254 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 15:11:46.893550 ip-10-0-129-254 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 15:11:46.893550 ip-10-0-129-254 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 15:11:46.893550 ip-10-0-129-254 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 15:11:46.893550 ip-10-0-129-254 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
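The "should be set via the config file" deprecations above refer to the KubeletConfiguration file passed through --config (reported as /etc/kubernetes/kubelet.conf in the FLAG dump further down). A minimal sketch of the equivalent config stanza, assuming the values this node reports in that FLAG dump; the eviction threshold is an illustrative placeholder, not a value taken from this log:

```yaml
# Sketch: moving the deprecated flags into the KubeletConfiguration file
# referenced by --config. Values copied from the FLAG dump later in this
# log, except the eviction threshold, which is an assumed placeholder.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec   # --volume-plugin-dir
systemReserved:                # --system-reserved
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
evictionHard:                  # replaces --minimum-container-ttl-duration
  memory.available: "100Mi"    # assumed threshold; tune per node
```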
Apr 16 15:11:46.895062 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.894949 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 15:11:46.900778 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900730 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 15:11:46.900778 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900776 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 15:11:46.900778 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900781 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900785 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900788 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900791 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900795 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900798 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900801 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900803 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900806 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900809 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900812 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900814 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900817 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900820 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900823 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900825 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900828 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900830 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900833 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 15:11:46.900920 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900836 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900840 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900860 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900863 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900866 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900869 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900872 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900874 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900877 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900880 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900883 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900886 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900888 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900891 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900893 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900899 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900903 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900905 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900908 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 15:11:46.901379 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900911 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900913 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900916 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900919 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900922 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900925 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900928 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900931 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900934 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900937 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900939 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900942 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900944 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900947 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900950 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900953 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900955 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900958 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900960 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900963 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 15:11:46.901864 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900965 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900968 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900971 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900974 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900976 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900978 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900981 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900984 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900987 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900990 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900992 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900995 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.900998 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901000 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901004 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901006 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901009 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901011 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901015 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901017 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 15:11:46.902353 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901020 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901022 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901024 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901027 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901029 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901032 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901432 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901438 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901442 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901445 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901448 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901451 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901454 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901457 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901461 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901463 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901466 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901469 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901471 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 15:11:46.902860 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901474 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901477 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901480 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901483 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901485 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901488 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901491 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901493 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901496 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901499 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901501 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901504 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901507 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901510 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901512 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901515 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901517 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901520 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901522 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901525 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 15:11:46.903319 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901528 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901530 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901533 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901536 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901538 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901541 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901544 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901547 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901551 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901554 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901556 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901558 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901561 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901564 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901567 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901570 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901572 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901575 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901577 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 15:11:46.903857 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901580 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901582 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901585 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901587 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901590 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901593 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901595 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901598 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901600 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901603 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901605 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901608 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901610 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901613 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901615 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901620 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901623 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901626 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901628 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 15:11:46.904324 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901631 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901633 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901636 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901639 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901641 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901645 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901648 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901652 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901655 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901657 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901660 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901663 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901665 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901668 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.901670 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901743 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901750 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901756 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901760 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901765 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901768 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901773 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 15:11:46.904794 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901777 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901781 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901783 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901787 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901790 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901793 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901796 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901799 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901802 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901805 2575 flags.go:64] FLAG: --cloud-config=""
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901808 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901811 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901815 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901818 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901821 2575 flags.go:64] FLAG: --config-dir=""
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901824 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901827 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901832 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901835 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901838 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901841 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901844 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901847 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901850 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901853 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 15:11:46.905337 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901856 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901860 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901863 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901866 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901869 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901873 2575 flags.go:64] FLAG: --enable-server="true"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901876 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901881 2575 flags.go:64] FLAG: --event-burst="100"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901884 2575 flags.go:64] FLAG: --event-qps="50"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901887 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901891 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901894 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901897 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901900 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901903 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901906 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901909 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901912 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901915 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901918 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901921 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901924 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901927 2575 flags.go:64] FLAG: --feature-gates=""
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901931 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901934 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 15:11:46.906046 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901937 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901940 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901943 2575 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901946 2575 flags.go:64] FLAG: --help="false"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901949 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-129-254.ec2.internal"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901952 2575 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901955 2575 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901958 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901961 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901964 2575 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901967 2575 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901969 2575 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901972 2575 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901976 2575 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901979 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901982 2575 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901985 2575 flags.go:64] FLAG: --kube-reserved=""
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901988 2575 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901991 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901994 2575 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.901997 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902000 2575 flags.go:64] FLAG: --lock-file=""
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902002 2575 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902005 2575 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 15:11:46.906666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902008 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902014 2575 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902016 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902019 2575 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902022 2575 flags.go:64] FLAG: --logging-format="text"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902025 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902029 2575 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902032 2575 flags.go:64] FLAG: --manifest-url=""
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902035 2575 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902039 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902042 2575 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902046 2575 flags.go:64] FLAG: --max-pods="110"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902049 2575 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902052 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902055 2575 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902061 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902064 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902067 2575 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902070 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902077 2575 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902080 2575 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902083 2575 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902086 2575 flags.go:64] FLAG: --pod-cidr=""
Apr 16 15:11:46.907230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902089 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902094 2575 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902097 2575 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902100 2575 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902103 2575 flags.go:64] FLAG: --port="10250"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902106 2575 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902109 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02040f1a5bac09b03"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902112 2575 flags.go:64] FLAG: --qos-reserved=""
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902115 2575 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902118 2575 flags.go:64] FLAG: --register-node="true"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902121 2575 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902124 2575 flags.go:64] FLAG: --register-with-taints=""
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902127 2575 flags.go:64] FLAG: --registry-burst="10"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902130 2575 flags.go:64] FLAG: --registry-qps="5"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902133 2575 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902136 2575 flags.go:64] FLAG: --reserved-memory=""
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902140 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902143 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902146 2575 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902148 2575 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902151 2575 flags.go:64] FLAG: --runonce="false"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902154 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902157 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902160 2575 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902168 2575 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902170 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 15:11:46.907829 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902173 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902177 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902179 2575 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902182 2575 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902185 2575 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902188 2575 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902191 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902195 2575 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902198 2575 flags.go:64] FLAG: --system-cgroups=""
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902201 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902206 2575 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902209 2575 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902212 2575 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902216 2575 flags.go:64] FLAG: --tls-min-version=""
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902218 2575 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902221 2575 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902224 2575 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902227 2575 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902230 2575 flags.go:64] FLAG: --v="2"
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902236 2575 flags.go:64] FLAG: --version="false"
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902240 2575 flags.go:64] FLAG: --vmodule=""
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902244 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.902247 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902331 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902334 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 15:11:46.908464 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902337 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902340 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902343 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902346 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902348 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902353 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902356 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902359 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902362 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902366 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902370 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902373 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902376 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902379 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902382 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902385 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902387 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902390 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902393 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 15:11:46.909147 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902395 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902398 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902401 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902404 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902406 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902409 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902411 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902415 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902433 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902437 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902441 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902445 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902449 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902452 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902454 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902457 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902460 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902463 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902467 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902470 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 15:11:46.910006 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902473 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902476 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902478 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902481 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902484 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902486 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902489 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902491 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902494 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902497 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902500 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902502 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902505 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902508 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902512 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902515 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902518 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902520 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902523 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 15:11:46.910880 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902527 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902529 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902532 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902535 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902537 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902540 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902542 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902545 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902548 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902550 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902552 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902556 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902559 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902561 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902564 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902566 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902569 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902571 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902574 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902576 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 15:11:46.911714 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902579 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 15:11:46.912549 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902582 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 15:11:46.912549 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902585 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 15:11:46.912549 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902587 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 15:11:46.912549 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902590 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 15:11:46.912549 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.902593 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 15:11:46.912549 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.903223 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 15:11:46.912549 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.911024 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 15:11:46.912549 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.911047 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 15:11:46.912549 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911136 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 15:11:46.912549 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911145 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 15:11:46.912549 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911150 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 15:11:46.912549 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911155 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 15:11:46.912549 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911160 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 15:11:46.912549 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911165 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 15:11:46.912549 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911170 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 15:11:46.912549 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911175 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911180 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911184 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911189 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911193 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911198 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911203 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911207 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911211 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911215 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911219 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911223 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911229 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911234 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911239 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911244 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911248 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911252 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911256 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 15:11:46.913046 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911261 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911265 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911269 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911273 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911277 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911284 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911288 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911295 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911302 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911306 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911310 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911315 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911319 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911323 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911327 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911332 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911336 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911340 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911344 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911348 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 15:11:46.913711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911352 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911356 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911360 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911365 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911369 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911373 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911377 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911381 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911385 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911389 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911393 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911398 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911402 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911407 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911411 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911415 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911435 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911440 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911445 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911449 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 15:11:46.914290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911454 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911460 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911466 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911471 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911476 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911480 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911484 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911489 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911493 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911498 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911502 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911506 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911511 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911515 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911519 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911523 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911527 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911531 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911535 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 15:11:46.914900 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911539 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 15:11:46.915703 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.911548 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 15:11:46.915703 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911731 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 15:11:46.915703 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911740 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 15:11:46.915703 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911746 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 15:11:46.915703 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911751 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 15:11:46.915703 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911757 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 15:11:46.915703 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911761 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 15:11:46.915703 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911765 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 15:11:46.915703 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911771 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 15:11:46.915703 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911778 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 15:11:46.915703 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911782 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 15:11:46.915703 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911787 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 15:11:46.915703 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911792 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 15:11:46.915703 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911796 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 15:11:46.915703 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911800 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911804 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911809 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911813 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911817 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911821 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911825 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911828 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911833 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911839 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911845 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911850 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911854 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911859 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911864 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911868 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911872 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911877 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911882 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911886 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 15:11:46.916120 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911891 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911895 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911899 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911903 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911907 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911911 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911915 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911920 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911924 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911928 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911934 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911938 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911942 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911946 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911950 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911955 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911959 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911963 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911967 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 15:11:46.916750 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911971 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911975 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911980 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911984 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911988 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911992 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.911996 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912000 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912004 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912008 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912012 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912017 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912021 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912025 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912029 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912033 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912037 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912041 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912045 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912049 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 15:11:46.917224 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912053 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 15:11:46.917830 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912058 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 15:11:46.917830 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912062 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 15:11:46.917830 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912066 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 15:11:46.917830 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912071 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 15:11:46.917830 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912075 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 15:11:46.917830 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912079 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 15:11:46.917830 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912083 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 15:11:46.917830 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912087 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 15:11:46.917830 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912091 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 15:11:46.917830 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912095 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 15:11:46.917830 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912099 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 15:11:46.917830 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912104 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 15:11:46.917830 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:46.912108 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 15:11:46.917830 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.912115 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 15:11:46.917830 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.912790 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 15:11:46.918235 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.916276 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 15:11:46.918235 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.917273 2575 server.go:1019] "Starting client certificate rotation"
Apr 16 15:11:46.918235 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.917368 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 15:11:46.918235 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.917407 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 15:11:46.941018 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.940994 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 15:11:46.945165 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.945149 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 15:11:46.965026 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.965003 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 16 15:11:46.972143 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.972123 2575 log.go:25] "Validated CRI v1 image API"
Apr 16 15:11:46.973557 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.973542 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 15:11:46.977509 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.977491 2575 fs.go:135] Filesystem UUIDs: map[70f9ff8f-ffc3-4f2f-ad6a-655b1f918125:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 f6d56919-b4d8-4ca0-b58b-153ea316e7bb:/dev/nvme0n1p3]
Apr 16 15:11:46.977575 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.977509 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 15:11:46.982656 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.982538 2575 manager.go:217] Machine: {Timestamp:2026-04-16 15:11:46.981135018 +0000 UTC m=+0.420675569 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099854 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23b269643ae736d9176ed7e3fd7116 SystemUUID:ec23b269-643a-e736-d917-6ed7e3fd7116 BootID:0d5bd662-9786-4dfc-a8c8-c30a9952a895 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:35:e3:4a:33:47 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:35:e3:4a:33:47 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3e:e6:2b:1a:ce:3b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 15:11:46.982656 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.982650 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 15:11:46.982762 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.982727 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 15:11:46.984994 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.984966 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 15:11:46.985154 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.984996 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-254.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 15:11:46.985206 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.985167 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 15:11:46.985206 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.985176 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 15:11:46.985206 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.985189 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 15:11:46.986078 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.986067 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 15:11:46.988228 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.988217 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 15:11:46.988360 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.988350 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 15:11:46.991734 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.991724 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 15:11:46.991774 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.991742 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 15:11:46.991774 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.991754 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 15:11:46.991774 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.991763 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 16 15:11:46.991933 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.991771 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 15:11:46.992892 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.992877 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 15:11:46.992892 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.992895 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 15:11:46.995281 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.995262 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 15:11:46.996286 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.996264 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 15:11:46.997926 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.997912 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 15:11:46.999436 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.999412 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 15:11:46.999477 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.999442 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 15:11:46.999477 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.999448 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 15:11:46.999477 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.999456 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 15:11:46.999477 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.999463 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 15:11:46.999477 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.999469 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 15:11:46.999477 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.999475 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 15:11:46.999477 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.999480 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 15:11:46.999659 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.999487 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 15:11:46.999659 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.999493 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 15:11:46.999659 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.999510 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 15:11:46.999659 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:46.999519 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 15:11:47.000236 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.000227 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 15:11:47.000267 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.000237 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 15:11:47.003495 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.003481 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 15:11:47.003562 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.003515 2575 server.go:1295] "Started kubelet"
Apr 16 15:11:47.003642 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.003597 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 15:11:47.003681 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.003629 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 15:11:47.003681 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.003660 2575 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 15:11:47.004179 ip-10-0-129-254 systemd[1]: Started Kubernetes Kubelet.
Apr 16 15:11:47.004798 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.004783 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 15:11:47.006577 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.006555 2575 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 15:11:47.015813 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.015797 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 15:11:47.016565 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.016549 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 15:11:47.017350 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.017313 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 15:11:47.017456 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.017437 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 15:11:47.017532 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.017511 2575 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 15:11:47.017532 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.017529 2575 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 15:11:47.017694 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.017616 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-254.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 15:11:47.017694 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.017654 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 15:11:47.017694 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.017673 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-254.ec2.internal\" not found"
Apr 16 15:11:47.017844 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.017715 2575 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 15:11:47.017844 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.017724 2575 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 15:11:47.017844 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.017762 2575 factory.go:55] Registering systemd factory
Apr 16 15:11:47.017844 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.017776 2575 factory.go:223] Registration of the systemd container factory successfully
Apr 16 15:11:47.017844 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.017803 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-254.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 15:11:47.018060 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.017991 2575 factory.go:153] Registering CRI-O factory
Apr 16 15:11:47.018060 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.018005 2575 factory.go:223] Registration of the crio container factory successfully
Apr 16 15:11:47.018133 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.018072 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 15:11:47.018133 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.018108 2575 factory.go:103] Registering Raw factory
Apr 16 15:11:47.018133 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.018122 2575 manager.go:1196] Started watching for new ooms in manager
Apr 16 15:11:47.018472 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.017341 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-254.ec2.internal.18a6df02fee389a9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-254.ec2.internal,UID:ip-10-0-129-254.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-254.ec2.internal,},FirstTimestamp:2026-04-16 15:11:47.003492777 +0000 UTC m=+0.443033329,LastTimestamp:2026-04-16 15:11:47.003492777 +0000 UTC m=+0.443033329,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-254.ec2.internal,}"
Apr 16 15:11:47.018770 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.018756 2575 manager.go:319] Starting recovery of all containers
Apr 16 15:11:47.021467 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.021437 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 15:11:47.021581 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.021544 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-254.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 15:11:47.028824 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.028810 2575 manager.go:324] Recovery completed
Apr 16 15:11:47.032486 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.032473 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 15:11:47.035028 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.035001 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 15:11:47.035104 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.035044 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 15:11:47.035104 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.035054 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasSufficientPID"
Apr 16 15:11:47.035512 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.035499 2575 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 15:11:47.035580 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.035513 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 15:11:47.035580 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.035539 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 15:11:47.037295 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.037233 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-254.ec2.internal.18a6df0300c4c7a3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-254.ec2.internal,UID:ip-10-0-129-254.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-254.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-254.ec2.internal,},FirstTimestamp:2026-04-16 15:11:47.035031459 +0000 UTC m=+0.474572011,LastTimestamp:2026-04-16 15:11:47.035031459 +0000 UTC m=+0.474572011,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-254.ec2.internal,}"
Apr 16 15:11:47.039264 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.039251 2575 policy_none.go:49] "None policy: Start"
Apr 16 15:11:47.039342 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.039269 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 15:11:47.039342 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.039282 2575 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 15:11:47.044581 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.044448 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-254.ec2.internal.18a6df0300c50c9c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-254.ec2.internal,UID:ip-10-0-129-254.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-129-254.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-129-254.ec2.internal,},FirstTimestamp:2026-04-16 15:11:47.035049116 +0000 UTC m=+0.474589667,LastTimestamp:2026-04-16 15:11:47.035049116 +0000 UTC m=+0.474589667,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-254.ec2.internal,}"
Apr 16 15:11:47.069138 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.069077 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-254.ec2.internal.18a6df0300c5300c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-254.ec2.internal,UID:ip-10-0-129-254.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-129-254.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-129-254.ec2.internal,},FirstTimestamp:2026-04-16 15:11:47.035058188 +0000 UTC m=+0.474598738,LastTimestamp:2026-04-16 15:11:47.035058188 +0000 UTC m=+0.474598738,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-254.ec2.internal,}"
Apr 16 15:11:47.083166 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.075609 2575 manager.go:341] "Starting Device Plugin manager"
Apr 16 15:11:47.083166 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.075637 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 15:11:47.083166 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.075646 2575 server.go:85] "Starting device plugin registration server"
Apr 16 15:11:47.083166 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.075832 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 15:11:47.083166 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.075841 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 15:11:47.083166 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.075909 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 15:11:47.083166 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.075989 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 15:11:47.083166 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.075998 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 15:11:47.083166 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.076437 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 15:11:47.083166 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.076470 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-254.ec2.internal\" not found"
Apr 16 15:11:47.085294 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.085278 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7ncnl"
Apr 16 15:11:47.093391 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.093332 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-254.ec2.internal.18a6df0303acfc88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-254.ec2.internal,UID:ip-10-0-129-254.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-129-254.ec2.internal,},FirstTimestamp:2026-04-16 15:11:47.083803784 +0000 UTC m=+0.523344322,LastTimestamp:2026-04-16 15:11:47.083803784 +0000 UTC m=+0.523344322,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-254.ec2.internal,}"
Apr 16 15:11:47.094998 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.094984 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7ncnl"
Apr 16 15:11:47.144997 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.144947 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 15:11:47.146135 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.146120 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 15:11:47.146185 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.146145 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 15:11:47.146185 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.146160 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 15:11:47.146185 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.146170 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 15:11:47.146326 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.146210 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 15:11:47.168525 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.168502 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 15:11:47.176484 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.176472 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 15:11:47.177190 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.177175 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasSufficientMemory" Apr 16 15:11:47.177261 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.177201 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 15:11:47.177261 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.177210 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasSufficientPID" Apr 16 15:11:47.177261 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.177234 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-254.ec2.internal" Apr 16 15:11:47.194562 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.194545 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-254.ec2.internal" Apr 16 15:11:47.194629 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.194564 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-254.ec2.internal\": node \"ip-10-0-129-254.ec2.internal\" not found" Apr 16 15:11:47.246629 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.246607 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-254.ec2.internal"] Apr 16 15:11:47.246711 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.246663 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 15:11:47.246937 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.246924 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-254.ec2.internal\" not found" Apr 16 15:11:47.248152 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.248136 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasSufficientMemory" Apr 16 15:11:47.248216 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.248162 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 15:11:47.248216 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.248172 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasSufficientPID" Apr 16 15:11:47.249268 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.249257 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 15:11:47.249438 ip-10-0-129-254 
kubenswrapper[2575]: I0416 15:11:47.249408 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal" Apr 16 15:11:47.249499 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.249457 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 15:11:47.251082 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.251064 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasSufficientMemory" Apr 16 15:11:47.251082 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.251071 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasSufficientMemory" Apr 16 15:11:47.251170 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.251093 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 15:11:47.251170 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.251104 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasSufficientPID" Apr 16 15:11:47.251170 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.251093 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 15:11:47.251254 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.251178 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasSufficientPID" Apr 16 15:11:47.252638 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.252620 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-254.ec2.internal" Apr 16 15:11:47.252721 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.252643 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 15:11:47.253334 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.253319 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasSufficientMemory" Apr 16 15:11:47.253394 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.253349 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 15:11:47.253394 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.253363 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeHasSufficientPID" Apr 16 15:11:47.281944 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.281928 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-254.ec2.internal\" not found" node="ip-10-0-129-254.ec2.internal" Apr 16 15:11:47.286102 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.286088 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-254.ec2.internal\" not found" node="ip-10-0-129-254.ec2.internal" Apr 16 15:11:47.319242 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.319226 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ec7d649cd375bc3470d0cc6b98df2f46-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal\" (UID: \"ec7d649cd375bc3470d0cc6b98df2f46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal" Apr 16 15:11:47.319313 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.319252 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec7d649cd375bc3470d0cc6b98df2f46-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal\" (UID: \"ec7d649cd375bc3470d0cc6b98df2f46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal" Apr 16 15:11:47.319313 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.319269 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/840c483ab6972a395b52b57432ebf0a1-config\") pod \"kube-apiserver-proxy-ip-10-0-129-254.ec2.internal\" (UID: \"840c483ab6972a395b52b57432ebf0a1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-254.ec2.internal" Apr 16 15:11:47.347343 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.347324 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-254.ec2.internal\" not found" Apr 16 15:11:47.419793 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.419739 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ec7d649cd375bc3470d0cc6b98df2f46-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal\" (UID: \"ec7d649cd375bc3470d0cc6b98df2f46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal" Apr 
16 15:11:47.419793 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.419765 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ec7d649cd375bc3470d0cc6b98df2f46-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal\" (UID: \"ec7d649cd375bc3470d0cc6b98df2f46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal" Apr 16 15:11:47.419793 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.419787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec7d649cd375bc3470d0cc6b98df2f46-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal\" (UID: \"ec7d649cd375bc3470d0cc6b98df2f46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal" Apr 16 15:11:47.419946 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.419813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/840c483ab6972a395b52b57432ebf0a1-config\") pod \"kube-apiserver-proxy-ip-10-0-129-254.ec2.internal\" (UID: \"840c483ab6972a395b52b57432ebf0a1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-254.ec2.internal" Apr 16 15:11:47.419946 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.419858 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/840c483ab6972a395b52b57432ebf0a1-config\") pod \"kube-apiserver-proxy-ip-10-0-129-254.ec2.internal\" (UID: \"840c483ab6972a395b52b57432ebf0a1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-254.ec2.internal" Apr 16 15:11:47.419946 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.419873 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec7d649cd375bc3470d0cc6b98df2f46-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal\" (UID: \"ec7d649cd375bc3470d0cc6b98df2f46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal" Apr 16 15:11:47.448134 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.448114 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-254.ec2.internal\" not found" Apr 16 15:11:47.549115 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.549096 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-254.ec2.internal\" not found" Apr 16 15:11:47.585791 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.585773 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal" Apr 16 15:11:47.588154 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.588137 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-254.ec2.internal" Apr 16 15:11:47.650076 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.650060 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-254.ec2.internal\" not found" Apr 16 15:11:47.750899 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.750846 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-254.ec2.internal\" not found" Apr 16 15:11:47.851327 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.851305 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-254.ec2.internal\" not found" Apr 16 15:11:47.917514 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.917485 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 15:11:47.917936 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.917651 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 15:11:47.951978 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:47.951955 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-254.ec2.internal\" not found" Apr 16 15:11:47.982535 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.982520 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 15:11:47.992403 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:47.992386 2575 apiserver.go:52] "Watching apiserver" Apr 16 15:11:48.006234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.006190 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 15:11:48.006574 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.006536 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-k26s9","openshift-multus/multus-zwlx7","openshift-network-operator/iptables-alerter-xtz8k","kube-system/konnectivity-agent-npqcs","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x","openshift-cluster-node-tuning-operator/tuned-qpc9j","openshift-multus/network-metrics-daemon-whwdh","openshift-network-diagnostics/network-check-target-5d795","openshift-ovn-kubernetes/ovnkube-node-8dtn7","openshift-dns/node-resolver-kjtwg","openshift-image-registry/node-ca-thxgc"] Apr 16 15:11:48.008717 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.008697 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.009752 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.009733 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.011171 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.011154 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xtz8k" Apr 16 15:11:48.011273 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.011247 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-npqcs" Apr 16 15:11:48.012434 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.012405 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.013497 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.013482 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.014691 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.014677 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:11:48.014833 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:48.014751 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77" Apr 16 15:11:48.014909 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.014892 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 15:11:48.015920 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.015903 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 15:11:48.016142 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.016128 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:11:48.016206 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:48.016173 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d795" podUID="8de53a64-cdc0-4735-a754-56ab12a8afc1" Apr 16 15:11:48.018386 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.018151 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-254.ec2.internal" Apr 16 15:11:48.018781 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.018762 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.021672 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.021652 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-kjtwg" Apr 16 15:11:48.023394 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023375 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.023543 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023401 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2cca38e9-1e0a-4179-87b0-74cf0d052206-agent-certs\") pod \"konnectivity-agent-npqcs\" (UID: \"2cca38e9-1e0a-4179-87b0-74cf0d052206\") " pod="kube-system/konnectivity-agent-npqcs" Apr 16 15:11:48.023543 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023438 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-sys\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.023543 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023444 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-thxgc" Apr 16 15:11:48.023543 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023463 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-run-ovn-kubernetes\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.023543 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023485 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw2g2\" (UniqueName: \"kubernetes.io/projected/90e76464-4794-4a6f-bdfc-1010042e6181-kube-api-access-xw2g2\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.023543 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-lib-modules\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.023543 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023513 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-kubelet\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.023543 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023528 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.023978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023566 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.023978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbwn9\" (UniqueName: \"kubernetes.io/projected/60c8f519-fd4b-490b-8dd9-e26316afc045-kube-api-access-mbwn9\") pod \"iptables-alerter-xtz8k\" (UID: \"60c8f519-fd4b-490b-8dd9-e26316afc045\") " pod="openshift-network-operator/iptables-alerter-xtz8k" Apr 16 15:11:48.023978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023624 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.023978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-socket-dir\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.023978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023679 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sss5d\" (UniqueName: \"kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d\") pod \"network-check-target-5d795\" (UID: \"8de53a64-cdc0-4735-a754-56ab12a8afc1\") " pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:11:48.023978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023705 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-run-systemd\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.023978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023722 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-run-openvswitch\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.023978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023737 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-run-ovn\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.023978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023751 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-system-cni-dir\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.023978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023773 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-multus-conf-dir\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.023978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023807 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-os-release\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.023978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023844 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/60c8f519-fd4b-490b-8dd9-e26316afc045-iptables-alerter-script\") pod \"iptables-alerter-xtz8k\" (UID: \"60c8f519-fd4b-490b-8dd9-e26316afc045\") " pod="openshift-network-operator/iptables-alerter-xtz8k" Apr 16 15:11:48.023978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023870 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-sysconfig\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.023978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023885 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c646c34-edd0-4bb7-ac77-7a47bafd421b-env-overrides\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.023978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023903 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-cnibin\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.023978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023935 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-device-dir\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.024739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023965 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-systemd-units\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.024739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.023985 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-host-run-multus-certs\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.024739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024015 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-modprobe-d\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.024739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024047 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-sysctl-d\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.024739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024071 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-tuned\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.024739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024088 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ksrl\" (UniqueName: \"kubernetes.io/projected/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-kube-api-access-4ksrl\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.024739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024112 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwkcz\" (UniqueName: \"kubernetes.io/projected/9973bf97-babd-47b9-a129-38dbed119c77-kube-api-access-jwkcz\") pod \"network-metrics-daemon-whwdh\" (UID: \"9973bf97-babd-47b9-a129-38dbed119c77\") " pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:11:48.024739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024138 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c646c34-edd0-4bb7-ac77-7a47bafd421b-ovnkube-config\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.024739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024160 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-multus-cni-dir\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.024739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024193 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-node-log\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.024739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024221 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-os-release\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.024739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024256 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/90e76464-4794-4a6f-bdfc-1010042e6181-cni-binary-copy\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.024739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024286 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-system-cni-dir\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.024739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024310 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-cnibin\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.024739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024336 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-etc-openvswitch\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.024739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024357 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtb7g\" (UniqueName: \"kubernetes.io/projected/8c646c34-edd0-4bb7-ac77-7a47bafd421b-kube-api-access-rtb7g\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.025630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024376 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-host-var-lib-cni-multus\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.025630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024398 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-etc-selinux\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.025630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024440 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-cni-netd\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.025630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024466 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c646c34-edd0-4bb7-ac77-7a47bafd421b-ovn-node-metrics-cert\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.025630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024496 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8c646c34-edd0-4bb7-ac77-7a47bafd421b-ovnkube-script-lib\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.025630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024520 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-multus-socket-dir-parent\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.025630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-host-run-k8s-cni-cncf-io\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.025630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024570 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-registration-dir\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.025630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024594 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-kubernetes\") pod \"tuned-qpc9j\" (UID: 
\"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.025630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024617 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-systemd\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.025630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024640 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-run\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.025630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024666 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-slash\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.025630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024692 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-host-var-lib-kubelet\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.025630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024716 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-hostroot\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.025630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024751 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-cni-binary-copy\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.025630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024794 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60c8f519-fd4b-490b-8dd9-e26316afc045-host-slash\") pod \"iptables-alerter-xtz8k\" (UID: \"60c8f519-fd4b-490b-8dd9-e26316afc045\") " pod="openshift-network-operator/iptables-alerter-xtz8k" Apr 16 15:11:48.026446 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024811 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-log-socket\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.026446 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024825 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-cni-bin\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.026446 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/90e76464-4794-4a6f-bdfc-1010042e6181-multus-daemon-config\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.026446 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024868 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4b7k\" (UniqueName: \"kubernetes.io/projected/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-kube-api-access-t4b7k\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.026446 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024889 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-host-run-netns\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.026446 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024904 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2cca38e9-1e0a-4179-87b0-74cf0d052206-konnectivity-ca\") pod \"konnectivity-agent-npqcs\" (UID: \"2cca38e9-1e0a-4179-87b0-74cf0d052206\") " pod="kube-system/konnectivity-agent-npqcs" Apr 16 15:11:48.026446 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024917 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-sysctl-conf\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.026446 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024931 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-tmp\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.026446 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024965 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-run-netns\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.026446 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.024994 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-var-lib-openvswitch\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.026446 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.025017 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-etc-kubernetes\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.026446 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.025032 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-sys-fs\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.026446 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.025047 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.026446 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.025061 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-var-lib-kubelet\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.026446 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.025094 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-host-var-lib-cni-bin\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.026446 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.025118 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-host\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.027189 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.025137 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs\") pod \"network-metrics-daemon-whwdh\" (UID: \"9973bf97-babd-47b9-a129-38dbed119c77\") " pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:11:48.027189 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.025158 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p74kx\" (UniqueName: \"kubernetes.io/projected/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-kube-api-access-p74kx\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: 
\"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.027848 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.027832 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 15:11:48.027899 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.027849 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 15:11:48.027930 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.027849 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 15:11:48.028856 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.028802 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 15:11:48.029097 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.029077 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 15:11:48.029097 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.029096 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kclpt\"" Apr 16 15:11:48.029219 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.029128 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-stlgx\"" Apr 16 15:11:48.040535 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.040509 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 15:11:48.040639 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.040605 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 15:11:48.040701 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.040643 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 15:11:48.040773 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.040749 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 15:11:48.040843 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.040782 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 15:11:48.040843 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.040822 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 15:11:48.040955 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.040843 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 15:11:48.040955 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.040861 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 15:11:48.040955 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.040915 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 15:11:48.040955 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.040933 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 15:11:48.041120 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.041021 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 15:11:48.041170 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.041160 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 15:11:48.041336 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.041322 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 15:11:48.041879 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.041859 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 15:11:48.041962 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.041891 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 15:11:48.041962 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.041860 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 15:11:48.042296 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.042113 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 15:11:48.042296 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.042154 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 15:11:48.042602 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.042576 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wjgzc\"" Apr 16 15:11:48.042699 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.042619 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-756ks\"" Apr 16 15:11:48.042699 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.042634 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-dn9rz\"" Apr 16 15:11:48.042699 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.042642 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 15:11:48.042699 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.042576 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lzhhw\"" Apr 16 15:11:48.042903 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.042788 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-5pcmx\"" Apr 16 15:11:48.042903 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.042848 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"default-dockercfg-qw9hk\"" Apr 16 15:11:48.042903 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.042880 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8824q\"" Apr 16 15:11:48.043043 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.042952 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 15:11:48.043043 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.042963 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 15:11:48.047467 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.047449 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-129-254.ec2.internal"] Apr 16 15:11:48.063958 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.063942 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 15:11:48.064034 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.064001 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal" Apr 16 15:11:48.068265 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.068251 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 15:11:48.097233 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.097203 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 15:06:47 +0000 UTC" deadline="2027-10-20 15:41:52.51354663 +0000 UTC" Apr 16 15:11:48.097233 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.097232 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13248h30m4.416317197s" Apr 16 15:11:48.110295 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.110275 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 15:11:48.110480 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.110463 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal"] Apr 16 15:11:48.118318 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.118303 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 15:11:48.125572 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.125537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-run-ovn-kubernetes\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.125674 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.125586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xw2g2\" (UniqueName: 
\"kubernetes.io/projected/90e76464-4794-4a6f-bdfc-1010042e6181-kube-api-access-xw2g2\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.125674 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.125619 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6clqh\" (UniqueName: \"kubernetes.io/projected/74d85d78-e8d8-4b5c-a950-f65047122164-kube-api-access-6clqh\") pod \"node-resolver-kjtwg\" (UID: \"74d85d78-e8d8-4b5c-a950-f65047122164\") " pod="openshift-dns/node-resolver-kjtwg" Apr 16 15:11:48.125674 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.125648 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-lib-modules\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.125832 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.125674 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-kubelet\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.125832 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.125700 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.125832 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.125729 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.125832 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.125754 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbwn9\" (UniqueName: \"kubernetes.io/projected/60c8f519-fd4b-490b-8dd9-e26316afc045-kube-api-access-mbwn9\") pod \"iptables-alerter-xtz8k\" (UID: \"60c8f519-fd4b-490b-8dd9-e26316afc045\") " pod="openshift-network-operator/iptables-alerter-xtz8k" Apr 16 15:11:48.125832 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.125781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.125832 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.125804 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-socket-dir\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.125832 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.125830 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sss5d\" (UniqueName: \"kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d\") pod \"network-check-target-5d795\" (UID: \"8de53a64-cdc0-4735-a754-56ab12a8afc1\") " pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:11:48.126167 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.125853 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-run-systemd\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.126167 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.125913 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-run-systemd\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.126167 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-socket-dir\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.126167 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126034 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-run-openvswitch\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.126167 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126066 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-run-ovn\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.126167 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126082 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.126167 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126038 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-run-ovn-kubernetes\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.126167 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126115 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-run-openvswitch\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.126167 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126164 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-system-cni-dir\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126193 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-lib-modules\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126205 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-run-ovn\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126194 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-multus-conf-dir\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-multus-conf-dir\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-os-release\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-system-cni-dir\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126245 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-kubelet\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/60c8f519-fd4b-490b-8dd9-e26316afc045-iptables-alerter-script\") pod \"iptables-alerter-xtz8k\" (UID: \"60c8f519-fd4b-490b-8dd9-e26316afc045\") " pod="openshift-network-operator/iptables-alerter-xtz8k" Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126295 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-os-release\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126298 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-sysconfig\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126341 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-sysconfig\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c646c34-edd0-4bb7-ac77-7a47bafd421b-env-overrides\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126372 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-cnibin\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126397 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-device-dir\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126438 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126451 2575 
Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126451 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/74d85d78-e8d8-4b5c-a950-f65047122164-hosts-file\") pod \"node-resolver-kjtwg\" (UID: \"74d85d78-e8d8-4b5c-a950-f65047122164\") " pod="openshift-dns/node-resolver-kjtwg"
Apr 16 15:11:48.126611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126476 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6b5389a-1cfb-46bd-bdee-65b24755f000-host\") pod \"node-ca-thxgc\" (UID: \"b6b5389a-1cfb-46bd-bdee-65b24755f000\") " pod="openshift-image-registry/node-ca-thxgc"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126488 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-cnibin\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126503 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-systemd-units\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126528 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-host-run-multus-certs\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126536 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-device-dir\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-modprobe-d\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126576 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-sysctl-d\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126598 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-tuned\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ksrl\" (UniqueName: \"kubernetes.io/projected/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-kube-api-access-4ksrl\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126642 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwkcz\" (UniqueName: \"kubernetes.io/projected/9973bf97-babd-47b9-a129-38dbed119c77-kube-api-access-jwkcz\") pod \"network-metrics-daemon-whwdh\" (UID: \"9973bf97-babd-47b9-a129-38dbed119c77\") " pod="openshift-multus/network-metrics-daemon-whwdh"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126670 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-modprobe-d\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126687 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c646c34-edd0-4bb7-ac77-7a47bafd421b-ovnkube-config\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126709 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-multus-cni-dir\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126715 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-host-run-multus-certs\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126732 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b6b5389a-1cfb-46bd-bdee-65b24755f000-serviceca\") pod \"node-ca-thxgc\" (UID: \"b6b5389a-1cfb-46bd-bdee-65b24755f000\") " pod="openshift-image-registry/node-ca-thxgc"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-node-log\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-os-release\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126807 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/90e76464-4794-4a6f-bdfc-1010042e6181-cni-binary-copy\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7"
Apr 16 15:11:48.127380 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-system-cni-dir\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9"
Apr 16 15:11:48.128161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-cnibin\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9"
Apr 16 15:11:48.128161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126877 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c646c34-edd0-4bb7-ac77-7a47bafd421b-env-overrides\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7"
Apr 16 15:11:48.128161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126896 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-etc-openvswitch\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7"
Apr 16 15:11:48.128161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126914 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/60c8f519-fd4b-490b-8dd9-e26316afc045-iptables-alerter-script\") pod \"iptables-alerter-xtz8k\" (UID: \"60c8f519-fd4b-490b-8dd9-e26316afc045\") " pod="openshift-network-operator/iptables-alerter-xtz8k"
Apr 16 15:11:48.128161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126577 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-systemd-units\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7"
Apr 16 15:11:48.128161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126982 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-etc-openvswitch\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7"
Apr 16 15:11:48.128161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.126987 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-sysctl-d\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j"
Apr 16 15:11:48.128161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127023 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtb7g\" (UniqueName: \"kubernetes.io/projected/8c646c34-edd0-4bb7-ac77-7a47bafd421b-kube-api-access-rtb7g\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7"
Apr 16 15:11:48.128161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127040 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-multus-cni-dir\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7"
Apr 16 15:11:48.128161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127106 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-system-cni-dir\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9"
Apr 16 15:11:48.128161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127134 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-cnibin\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9"
Apr 16 15:11:48.128161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127258 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-os-release\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7"
Apr 16 15:11:48.128161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127297 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-host-var-lib-cni-multus\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7"
Apr 16 15:11:48.128161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127325 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-etc-selinux\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x"
Apr 16 15:11:48.128161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127352 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-cni-netd\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7"
Apr 16 15:11:48.128161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127344 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
\"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-systemd\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.128928 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127645 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-node-log\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.128928 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-run\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.128928 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127675 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/90e76464-4794-4a6f-bdfc-1010042e6181-cni-binary-copy\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.128928 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127690 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-slash\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.128928 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127715 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-host-var-lib-kubelet\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.128928 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127731 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-etc-selinux\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.128928 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127768 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-hostroot\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.128928 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127773 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-cni-netd\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.128928 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127777 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-run\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.128928 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127731 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-multus-socket-dir-parent\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.129610 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:48.127774 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec7d649cd375bc3470d0cc6b98df2f46.slice/crio-bf5515f96b8868803aed8bb7bf3525297f2b1eecaf0ce9b613e5eb5ec79344bb WatchSource:0}: Error finding container bf5515f96b8868803aed8bb7bf3525297f2b1eecaf0ce9b613e5eb5ec79344bb: Status 404 returned error can't find the container with id bf5515f96b8868803aed8bb7bf3525297f2b1eecaf0ce9b613e5eb5ec79344bb Apr 16 15:11:48.129610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127806 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-slash\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.129610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127822 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-systemd\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.129610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127833 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-kubernetes\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.129610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127834 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mbff\" (UniqueName: \"kubernetes.io/projected/b6b5389a-1cfb-46bd-bdee-65b24755f000-kube-api-access-8mbff\") pod \"node-ca-thxgc\" (UID: \"b6b5389a-1cfb-46bd-bdee-65b24755f000\") " pod="openshift-image-registry/node-ca-thxgc" Apr 16 15:11:48.129610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127855 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-host-run-k8s-cni-cncf-io\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.129610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127867 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-cni-binary-copy\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " 
pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.129610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127877 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-host-var-lib-kubelet\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.129610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127902 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60c8f519-fd4b-490b-8dd9-e26316afc045-host-slash\") pod \"iptables-alerter-xtz8k\" (UID: \"60c8f519-fd4b-490b-8dd9-e26316afc045\") " pod="openshift-network-operator/iptables-alerter-xtz8k" Apr 16 15:11:48.129610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127915 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-registration-dir\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.129610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127940 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-hostroot\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.129610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127968 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-log-socket\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.129610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.127998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-cni-bin\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.129610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.128042 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/90e76464-4794-4a6f-bdfc-1010042e6181-multus-daemon-config\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.129610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.128068 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4b7k\" (UniqueName: \"kubernetes.io/projected/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-kube-api-access-t4b7k\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.129610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.128094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-host-run-netns\") pod 
\"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.129610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.128118 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2cca38e9-1e0a-4179-87b0-74cf0d052206-konnectivity-ca\") pod \"konnectivity-agent-npqcs\" (UID: \"2cca38e9-1e0a-4179-87b0-74cf0d052206\") " pod="kube-system/konnectivity-agent-npqcs" Apr 16 15:11:48.130413 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.128382 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-cni-binary-copy\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.130413 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.128470 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8c646c34-edd0-4bb7-ac77-7a47bafd421b-ovnkube-script-lib\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.130413 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:48.128539 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod840c483ab6972a395b52b57432ebf0a1.slice/crio-5f7ac49a7861317fab1b36c23f5b9614f7fd72b33fa992dabca8b05dec352764 WatchSource:0}: Error finding container 5f7ac49a7861317fab1b36c23f5b9614f7fd72b33fa992dabca8b05dec352764: Status 404 returned error can't find the container with id 5f7ac49a7861317fab1b36c23f5b9614f7fd72b33fa992dabca8b05dec352764 Apr 16 15:11:48.130413 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.128598 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-sysctl-conf\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.130413 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.128622 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-tmp\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.130413 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.128648 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-run-netns\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.130413 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.128673 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-var-lib-openvswitch\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.130413 ip-10-0-129-254 kubenswrapper[2575]: I0416 
15:11:48.128695 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-etc-kubernetes\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.130413 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.128717 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-sys-fs\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.130413 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.128803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.130413 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.129742 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-var-lib-kubelet\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.130413 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.129775 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-host-var-lib-cni-bin\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.130413 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.129799 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.130413 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.129815 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74d85d78-e8d8-4b5c-a950-f65047122164-tmp-dir\") pod \"node-resolver-kjtwg\" (UID: \"74d85d78-e8d8-4b5c-a950-f65047122164\") " pod="openshift-dns/node-resolver-kjtwg" Apr 16 15:11:48.130413 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.129847 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-host\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.130413 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.129925 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-host-var-lib-cni-bin\") pod \"multus-zwlx7\" (UID: 
\"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.131441 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.130478 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-var-lib-kubelet\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.131441 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.130538 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/90e76464-4794-4a6f-bdfc-1010042e6181-multus-daemon-config\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.131441 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.130608 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-sys-fs\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.131441 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.130883 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-cni-bin\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.131441 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.130979 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60c8f519-fd4b-490b-8dd9-e26316afc045-host-slash\") pod \"iptables-alerter-xtz8k\" (UID: \"60c8f519-fd4b-490b-8dd9-e26316afc045\") " pod="openshift-network-operator/iptables-alerter-xtz8k" Apr 16 15:11:48.131441 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.131018 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-tuned\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.131441 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.131064 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c646c34-edd0-4bb7-ac77-7a47bafd421b-ovn-node-metrics-cert\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.131441 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.131067 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-log-socket\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.131441 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.131136 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2cca38e9-1e0a-4179-87b0-74cf0d052206-konnectivity-ca\") pod \"konnectivity-agent-npqcs\" (UID: 
\"2cca38e9-1e0a-4179-87b0-74cf0d052206\") " pod="kube-system/konnectivity-agent-npqcs" Apr 16 15:11:48.131441 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.131249 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-var-lib-openvswitch\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.131895 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.131586 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-host-run-netns\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.131895 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.131710 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-etc-sysctl-conf\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.131895 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.131759 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c646c34-edd0-4bb7-ac77-7a47bafd421b-host-run-netns\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.131895 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.131821 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90e76464-4794-4a6f-bdfc-1010042e6181-etc-kubernetes\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.132063 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:48.131943 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:48.132063 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.131988 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-host\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.132063 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:48.132043 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs podName:9973bf97-babd-47b9-a129-38dbed119c77 nodeName:}" failed. No retries permitted until 2026-04-16 15:11:48.632010403 +0000 UTC m=+2.071550959 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs") pod "network-metrics-daemon-whwdh" (UID: "9973bf97-babd-47b9-a129-38dbed119c77") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:48.133570 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.133234 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs\") pod \"network-metrics-daemon-whwdh\" (UID: \"9973bf97-babd-47b9-a129-38dbed119c77\") " pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:11:48.133570 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.133414 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p74kx\" (UniqueName: \"kubernetes.io/projected/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-kube-api-access-p74kx\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.133570 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.133481 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.133570 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.133512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2cca38e9-1e0a-4179-87b0-74cf0d052206-agent-certs\") pod \"konnectivity-agent-npqcs\" (UID: \"2cca38e9-1e0a-4179-87b0-74cf0d052206\") " pod="kube-system/konnectivity-agent-npqcs" Apr 16 15:11:48.133570 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.133540 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-sys\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.133792 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.133624 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-sys\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.134248 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.134224 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-tmp\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.134795 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.134758 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 
15:11:48.136415 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.136399 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:11:48.136784 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.136761 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2cca38e9-1e0a-4179-87b0-74cf0d052206-agent-certs\") pod \"konnectivity-agent-npqcs\" (UID: \"2cca38e9-1e0a-4179-87b0-74cf0d052206\") " pod="kube-system/konnectivity-agent-npqcs" Apr 16 15:11:48.148537 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.148493 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal" event={"ID":"ec7d649cd375bc3470d0cc6b98df2f46","Type":"ContainerStarted","Data":"bf5515f96b8868803aed8bb7bf3525297f2b1eecaf0ce9b613e5eb5ec79344bb"} Apr 16 15:11:48.149500 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.149480 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-254.ec2.internal" event={"ID":"840c483ab6972a395b52b57432ebf0a1","Type":"ContainerStarted","Data":"5f7ac49a7861317fab1b36c23f5b9614f7fd72b33fa992dabca8b05dec352764"} Apr 16 15:11:48.167321 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:48.167304 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:11:48.167321 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:48.167322 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:11:48.167321 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:48.167331 2575 projected.go:194] Error preparing data for projected volume kube-api-access-sss5d for pod openshift-network-diagnostics/network-check-target-5d795: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:48.167484 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:48.167378 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d podName:8de53a64-cdc0-4735-a754-56ab12a8afc1 nodeName:}" failed. No retries permitted until 2026-04-16 15:11:48.66736441 +0000 UTC m=+2.106904948 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sss5d" (UniqueName: "kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d") pod "network-check-target-5d795" (UID: "8de53a64-cdc0-4735-a754-56ab12a8afc1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:48.168636 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.168623 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-bzffb" Apr 16 15:11:48.169183 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.169159 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbwn9\" (UniqueName: \"kubernetes.io/projected/60c8f519-fd4b-490b-8dd9-e26316afc045-kube-api-access-mbwn9\") pod \"iptables-alerter-xtz8k\" (UID: \"60c8f519-fd4b-490b-8dd9-e26316afc045\") " pod="openshift-network-operator/iptables-alerter-xtz8k" Apr 16 15:11:48.171025 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.171010 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4b7k\" (UniqueName: \"kubernetes.io/projected/df3ba38d-f8f3-45ad-90ec-e49f33bed1ff-kube-api-access-t4b7k\") pod \"multus-additional-cni-plugins-k26s9\" (UID: \"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff\") " pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.180611 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.180594 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw2g2\" (UniqueName: \"kubernetes.io/projected/90e76464-4794-4a6f-bdfc-1010042e6181-kube-api-access-xw2g2\") pod \"multus-zwlx7\" (UID: \"90e76464-4794-4a6f-bdfc-1010042e6181\") " pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.181257 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.181241 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p74kx\" (UniqueName: \"kubernetes.io/projected/5c24b1ab-a632-4534-bd41-8a371c1ea7a9-kube-api-access-p74kx\") pod \"aws-ebs-csi-driver-node-4hj8x\" (UID: \"5c24b1ab-a632-4534-bd41-8a371c1ea7a9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.184549 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.184535 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-bzffb" Apr 16 15:11:48.186018 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.186000 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtb7g\" (UniqueName: \"kubernetes.io/projected/8c646c34-edd0-4bb7-ac77-7a47bafd421b-kube-api-access-rtb7g\") pod \"ovnkube-node-8dtn7\" (UID: \"8c646c34-edd0-4bb7-ac77-7a47bafd421b\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.186691 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.186672 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ksrl\" (UniqueName: \"kubernetes.io/projected/bb22ca70-5a34-4a0e-92c0-b7efb8a7c371-kube-api-access-4ksrl\") pod \"tuned-qpc9j\" (UID: \"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371\") " pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.187502 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.187485 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwkcz\" (UniqueName: 
\"kubernetes.io/projected/9973bf97-babd-47b9-a129-38dbed119c77-kube-api-access-jwkcz\") pod \"network-metrics-daemon-whwdh\" (UID: \"9973bf97-babd-47b9-a129-38dbed119c77\") " pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:11:48.218344 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.218325 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 15:11:48.234069 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.234052 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/74d85d78-e8d8-4b5c-a950-f65047122164-hosts-file\") pod \"node-resolver-kjtwg\" (UID: \"74d85d78-e8d8-4b5c-a950-f65047122164\") " pod="openshift-dns/node-resolver-kjtwg" Apr 16 15:11:48.234148 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.234074 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6b5389a-1cfb-46bd-bdee-65b24755f000-host\") pod \"node-ca-thxgc\" (UID: \"b6b5389a-1cfb-46bd-bdee-65b24755f000\") " pod="openshift-image-registry/node-ca-thxgc" Apr 16 15:11:48.234148 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.234107 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b6b5389a-1cfb-46bd-bdee-65b24755f000-serviceca\") pod \"node-ca-thxgc\" (UID: \"b6b5389a-1cfb-46bd-bdee-65b24755f000\") " pod="openshift-image-registry/node-ca-thxgc" Apr 16 15:11:48.234264 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.234149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/74d85d78-e8d8-4b5c-a950-f65047122164-hosts-file\") pod \"node-resolver-kjtwg\" (UID: \"74d85d78-e8d8-4b5c-a950-f65047122164\") " pod="openshift-dns/node-resolver-kjtwg" Apr 16 15:11:48.234264 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.234198 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6b5389a-1cfb-46bd-bdee-65b24755f000-host\") pod \"node-ca-thxgc\" (UID: \"b6b5389a-1cfb-46bd-bdee-65b24755f000\") " pod="openshift-image-registry/node-ca-thxgc" Apr 16 15:11:48.234264 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.234223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mbff\" (UniqueName: \"kubernetes.io/projected/b6b5389a-1cfb-46bd-bdee-65b24755f000-kube-api-access-8mbff\") pod \"node-ca-thxgc\" (UID: \"b6b5389a-1cfb-46bd-bdee-65b24755f000\") " pod="openshift-image-registry/node-ca-thxgc" Apr 16 15:11:48.234264 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.234262 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74d85d78-e8d8-4b5c-a950-f65047122164-tmp-dir\") pod \"node-resolver-kjtwg\" (UID: \"74d85d78-e8d8-4b5c-a950-f65047122164\") " pod="openshift-dns/node-resolver-kjtwg" Apr 16 15:11:48.234483 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.234304 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6clqh\" (UniqueName: \"kubernetes.io/projected/74d85d78-e8d8-4b5c-a950-f65047122164-kube-api-access-6clqh\") pod \"node-resolver-kjtwg\" (UID: \"74d85d78-e8d8-4b5c-a950-f65047122164\") " pod="openshift-dns/node-resolver-kjtwg" Apr 16 15:11:48.234623 ip-10-0-129-254 
kubenswrapper[2575]: I0416 15:11:48.234605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74d85d78-e8d8-4b5c-a950-f65047122164-tmp-dir\") pod \"node-resolver-kjtwg\" (UID: \"74d85d78-e8d8-4b5c-a950-f65047122164\") " pod="openshift-dns/node-resolver-kjtwg" Apr 16 15:11:48.234755 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.234738 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b6b5389a-1cfb-46bd-bdee-65b24755f000-serviceca\") pod \"node-ca-thxgc\" (UID: \"b6b5389a-1cfb-46bd-bdee-65b24755f000\") " pod="openshift-image-registry/node-ca-thxgc" Apr 16 15:11:48.257194 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.257147 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mbff\" (UniqueName: \"kubernetes.io/projected/b6b5389a-1cfb-46bd-bdee-65b24755f000-kube-api-access-8mbff\") pod \"node-ca-thxgc\" (UID: \"b6b5389a-1cfb-46bd-bdee-65b24755f000\") " pod="openshift-image-registry/node-ca-thxgc" Apr 16 15:11:48.258221 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.258205 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6clqh\" (UniqueName: \"kubernetes.io/projected/74d85d78-e8d8-4b5c-a950-f65047122164-kube-api-access-6clqh\") pod \"node-resolver-kjtwg\" (UID: \"74d85d78-e8d8-4b5c-a950-f65047122164\") " pod="openshift-dns/node-resolver-kjtwg" Apr 16 15:11:48.327389 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.327367 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k26s9" Apr 16 15:11:48.333820 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:48.333796 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf3ba38d_f8f3_45ad_90ec_e49f33bed1ff.slice/crio-29422be0b5e79d46b5a958edbde30d28acf58f23974a8b1c41691dd896c82698 WatchSource:0}: Error finding container 29422be0b5e79d46b5a958edbde30d28acf58f23974a8b1c41691dd896c82698: Status 404 returned error can't find the container with id 29422be0b5e79d46b5a958edbde30d28acf58f23974a8b1c41691dd896c82698 Apr 16 15:11:48.355013 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.354992 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zwlx7" Apr 16 15:11:48.360706 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:48.360688 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90e76464_4794_4a6f_bdfc_1010042e6181.slice/crio-37ac3e7e80ac1f5ff26d3f768a22351219cfcd3b56c4cdbc3e472e2ad675a5bd WatchSource:0}: Error finding container 37ac3e7e80ac1f5ff26d3f768a22351219cfcd3b56c4cdbc3e472e2ad675a5bd: Status 404 returned error can't find the container with id 37ac3e7e80ac1f5ff26d3f768a22351219cfcd3b56c4cdbc3e472e2ad675a5bd Apr 16 15:11:48.374471 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.374454 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-xtz8k" Apr 16 15:11:48.379819 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:48.379800 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60c8f519_fd4b_490b_8dd9_e26316afc045.slice/crio-9328e1d700748e06d37f862e4673c11f9ae3192a0897229b68959fc4aa2a27c9 WatchSource:0}: Error finding container 9328e1d700748e06d37f862e4673c11f9ae3192a0897229b68959fc4aa2a27c9: Status 404 returned error can't find the container with id 9328e1d700748e06d37f862e4673c11f9ae3192a0897229b68959fc4aa2a27c9 Apr 16 15:11:48.389835 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.389816 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-npqcs" Apr 16 15:11:48.394933 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:48.394914 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cca38e9_1e0a_4179_87b0_74cf0d052206.slice/crio-284847e403f0f2b0ce1d3eaa046308f48f382c91331dfb9d848614b57eb871dc WatchSource:0}: Error finding container 284847e403f0f2b0ce1d3eaa046308f48f382c91331dfb9d848614b57eb871dc: Status 404 returned error can't find the container with id 284847e403f0f2b0ce1d3eaa046308f48f382c91331dfb9d848614b57eb871dc Apr 16 15:11:48.408513 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.408495 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" Apr 16 15:11:48.413531 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:48.413511 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c24b1ab_a632_4534_bd41_8a371c1ea7a9.slice/crio-a00b3ff365c7fcb18acbdf395a24b5bdad8b98fbb290be578674f017d54e9710 WatchSource:0}: Error finding container a00b3ff365c7fcb18acbdf395a24b5bdad8b98fbb290be578674f017d54e9710: Status 404 returned error can't find the container with id a00b3ff365c7fcb18acbdf395a24b5bdad8b98fbb290be578674f017d54e9710 Apr 16 15:11:48.423516 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.423499 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" Apr 16 15:11:48.428414 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:48.428395 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb22ca70_5a34_4a0e_92c0_b7efb8a7c371.slice/crio-df4347ba32e962c89566a68ba3c73319a9ee3ad8453a76997b0b57555c40a8f4 WatchSource:0}: Error finding container df4347ba32e962c89566a68ba3c73319a9ee3ad8453a76997b0b57555c40a8f4: Status 404 returned error can't find the container with id df4347ba32e962c89566a68ba3c73319a9ee3ad8453a76997b0b57555c40a8f4 Apr 16 15:11:48.439414 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.439398 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:11:48.441710 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.441692 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 15:11:48.444500 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.444482 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-kjtwg" Apr 16 15:11:48.444673 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:48.444654 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c646c34_edd0_4bb7_ac77_7a47bafd421b.slice/crio-c60b92c0dd0d548d1aba31f3e7d93bb7a167fe4b14081779cbbb3e3cb562ff9a WatchSource:0}: Error finding container c60b92c0dd0d548d1aba31f3e7d93bb7a167fe4b14081779cbbb3e3cb562ff9a: Status 404 returned error can't find the container with id c60b92c0dd0d548d1aba31f3e7d93bb7a167fe4b14081779cbbb3e3cb562ff9a Apr 16 15:11:48.449455 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.449434 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-thxgc" Apr 16 15:11:48.449667 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:48.449650 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74d85d78_e8d8_4b5c_a950_f65047122164.slice/crio-fe955c6c7096e52778ab04947842cbd226df61729b4bec3efe9f88b1301d6519 WatchSource:0}: Error finding container fe955c6c7096e52778ab04947842cbd226df61729b4bec3efe9f88b1301d6519: Status 404 returned error can't find the container with id fe955c6c7096e52778ab04947842cbd226df61729b4bec3efe9f88b1301d6519 Apr 16 15:11:48.454652 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:11:48.454633 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6b5389a_1cfb_46bd_bdee_65b24755f000.slice/crio-c432e1c04852a753e8740f9397db85813f0378ad5a12dedcf66037e0c895d76e WatchSource:0}: Error finding container c432e1c04852a753e8740f9397db85813f0378ad5a12dedcf66037e0c895d76e: Status 404 returned error can't find the container with id c432e1c04852a753e8740f9397db85813f0378ad5a12dedcf66037e0c895d76e Apr 16 15:11:48.637244 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.637161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs\") pod \"network-metrics-daemon-whwdh\" (UID: \"9973bf97-babd-47b9-a129-38dbed119c77\") " pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:11:48.637382 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:48.637268 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:48.637382 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:48.637325 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs podName:9973bf97-babd-47b9-a129-38dbed119c77 nodeName:}" failed. No retries permitted until 2026-04-16 15:11:49.637308399 +0000 UTC m=+3.076848956 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs") pod "network-metrics-daemon-whwdh" (UID: "9973bf97-babd-47b9-a129-38dbed119c77") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:48.737769 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.737596 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sss5d\" (UniqueName: \"kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d\") pod \"network-check-target-5d795\" (UID: \"8de53a64-cdc0-4735-a754-56ab12a8afc1\") " pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:11:48.737946 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:48.737787 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:11:48.737946 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:48.737806 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:11:48.737946 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:48.737818 2575 projected.go:194] Error preparing data for projected volume kube-api-access-sss5d for pod openshift-network-diagnostics/network-check-target-5d795: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:48.737946 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:48.737877 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d podName:8de53a64-cdc0-4735-a754-56ab12a8afc1 nodeName:}" failed. No retries permitted until 2026-04-16 15:11:49.737859479 +0000 UTC m=+3.177400021 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-sss5d" (UniqueName: "kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d") pod "network-check-target-5d795" (UID: "8de53a64-cdc0-4735-a754-56ab12a8afc1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:48.931983 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:48.931908 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 15:11:49.149161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:49.148988 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:11:49.149161 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:49.149122 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5d795" podUID="8de53a64-cdc0-4735-a754-56ab12a8afc1" Apr 16 15:11:49.158864 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:49.158829 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" event={"ID":"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371","Type":"ContainerStarted","Data":"df4347ba32e962c89566a68ba3c73319a9ee3ad8453a76997b0b57555c40a8f4"} Apr 16 15:11:49.176606 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:49.176574 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-npqcs" event={"ID":"2cca38e9-1e0a-4179-87b0-74cf0d052206","Type":"ContainerStarted","Data":"284847e403f0f2b0ce1d3eaa046308f48f382c91331dfb9d848614b57eb871dc"} Apr 16 15:11:49.185675 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:49.185607 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:06:48 +0000 UTC" deadline="2027-10-03 14:15:02.090056676 +0000 UTC" Apr 16 15:11:49.185675 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:49.185645 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12839h3m12.904415477s" Apr 16 15:11:49.187897 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:49.187873 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xtz8k" event={"ID":"60c8f519-fd4b-490b-8dd9-e26316afc045","Type":"ContainerStarted","Data":"9328e1d700748e06d37f862e4673c11f9ae3192a0897229b68959fc4aa2a27c9"} Apr 16 15:11:49.211134 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:49.211105 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zwlx7" event={"ID":"90e76464-4794-4a6f-bdfc-1010042e6181","Type":"ContainerStarted","Data":"37ac3e7e80ac1f5ff26d3f768a22351219cfcd3b56c4cdbc3e472e2ad675a5bd"} Apr 16 15:11:49.221549 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:49.221524 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k26s9" event={"ID":"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff","Type":"ContainerStarted","Data":"29422be0b5e79d46b5a958edbde30d28acf58f23974a8b1c41691dd896c82698"} Apr 16 15:11:49.259444 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:49.259327 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-thxgc" event={"ID":"b6b5389a-1cfb-46bd-bdee-65b24755f000","Type":"ContainerStarted","Data":"c432e1c04852a753e8740f9397db85813f0378ad5a12dedcf66037e0c895d76e"} Apr 16 15:11:49.277256 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:49.277228 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kjtwg" event={"ID":"74d85d78-e8d8-4b5c-a950-f65047122164","Type":"ContainerStarted","Data":"fe955c6c7096e52778ab04947842cbd226df61729b4bec3efe9f88b1301d6519"} Apr 16 15:11:49.300292 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:49.300007 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" event={"ID":"8c646c34-edd0-4bb7-ac77-7a47bafd421b","Type":"ContainerStarted","Data":"c60b92c0dd0d548d1aba31f3e7d93bb7a167fe4b14081779cbbb3e3cb562ff9a"} Apr 16 15:11:49.318244 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:49.318187 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" 
event={"ID":"5c24b1ab-a632-4534-bd41-8a371c1ea7a9","Type":"ContainerStarted","Data":"a00b3ff365c7fcb18acbdf395a24b5bdad8b98fbb290be578674f017d54e9710"} Apr 16 15:11:49.644464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:49.644371 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs\") pod \"network-metrics-daemon-whwdh\" (UID: \"9973bf97-babd-47b9-a129-38dbed119c77\") " pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:11:49.644624 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:49.644550 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:49.644624 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:49.644618 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs podName:9973bf97-babd-47b9-a129-38dbed119c77 nodeName:}" failed. No retries permitted until 2026-04-16 15:11:51.644599262 +0000 UTC m=+5.084139806 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs") pod "network-metrics-daemon-whwdh" (UID: "9973bf97-babd-47b9-a129-38dbed119c77") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:49.744775 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:49.744737 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sss5d\" (UniqueName: \"kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d\") pod \"network-check-target-5d795\" (UID: \"8de53a64-cdc0-4735-a754-56ab12a8afc1\") " pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:11:49.744955 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:49.744920 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:11:49.744955 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:49.744954 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:11:49.745056 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:49.744969 2575 projected.go:194] Error preparing data for projected volume kube-api-access-sss5d for pod openshift-network-diagnostics/network-check-target-5d795: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:49.745056 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:49.745026 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d podName:8de53a64-cdc0-4735-a754-56ab12a8afc1 nodeName:}" failed. No retries permitted until 2026-04-16 15:11:51.745007979 +0000 UTC m=+5.184548522 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sss5d" (UniqueName: "kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d") pod "network-check-target-5d795" (UID: "8de53a64-cdc0-4735-a754-56ab12a8afc1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:50.147174 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:50.146669 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:11:50.147174 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:50.146810 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77" Apr 16 15:11:50.185867 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:50.185827 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:06:48 +0000 UTC" deadline="2028-01-13 09:45:47.260359344 +0000 UTC" Apr 16 15:11:50.185867 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:50.185863 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15282h33m57.074500486s" Apr 16 15:11:51.148627 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:51.148600 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:11:51.149090 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:51.148717 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d795" podUID="8de53a64-cdc0-4735-a754-56ab12a8afc1" Apr 16 15:11:51.660762 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:51.660656 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs\") pod \"network-metrics-daemon-whwdh\" (UID: \"9973bf97-babd-47b9-a129-38dbed119c77\") " pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:11:51.660964 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:51.660848 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:51.660964 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:51.660910 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs podName:9973bf97-babd-47b9-a129-38dbed119c77 nodeName:}" failed. No retries permitted until 2026-04-16 15:11:55.66089687 +0000 UTC m=+9.100437408 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs") pod "network-metrics-daemon-whwdh" (UID: "9973bf97-babd-47b9-a129-38dbed119c77") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:51.761617 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:51.761582 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sss5d\" (UniqueName: \"kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d\") pod \"network-check-target-5d795\" (UID: \"8de53a64-cdc0-4735-a754-56ab12a8afc1\") " pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:11:51.761804 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:51.761769 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:11:51.761804 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:51.761789 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:11:51.761804 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:51.761801 2575 projected.go:194] Error preparing data for projected volume kube-api-access-sss5d for pod openshift-network-diagnostics/network-check-target-5d795: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:51.761962 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:51.761860 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d podName:8de53a64-cdc0-4735-a754-56ab12a8afc1 nodeName:}" failed. No retries permitted until 2026-04-16 15:11:55.76184158 +0000 UTC m=+9.201382132 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-sss5d" (UniqueName: "kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d") pod "network-check-target-5d795" (UID: "8de53a64-cdc0-4735-a754-56ab12a8afc1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:52.147463 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:52.147410 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:11:52.147661 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:52.147573 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77" Apr 16 15:11:53.148443 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:53.147312 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:11:53.148443 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:53.148021 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d795" podUID="8de53a64-cdc0-4735-a754-56ab12a8afc1" Apr 16 15:11:54.147759 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:54.147442 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:11:54.147759 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:54.147586 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77" Apr 16 15:11:55.147286 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:55.146916 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:11:55.147286 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:55.147039 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d795" podUID="8de53a64-cdc0-4735-a754-56ab12a8afc1" Apr 16 15:11:55.692965 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:55.692473 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs\") pod \"network-metrics-daemon-whwdh\" (UID: \"9973bf97-babd-47b9-a129-38dbed119c77\") " pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:11:55.692965 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:55.692616 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:55.692965 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:55.692686 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs podName:9973bf97-babd-47b9-a129-38dbed119c77 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:03.69266955 +0000 UTC m=+17.132210105 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs") pod "network-metrics-daemon-whwdh" (UID: "9973bf97-babd-47b9-a129-38dbed119c77") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:11:55.793458 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:55.793402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sss5d\" (UniqueName: \"kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d\") pod \"network-check-target-5d795\" (UID: \"8de53a64-cdc0-4735-a754-56ab12a8afc1\") " pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:11:55.793625 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:55.793601 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:11:55.793625 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:55.793619 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:11:55.793732 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:55.793631 2575 projected.go:194] Error preparing data for projected volume kube-api-access-sss5d for pod openshift-network-diagnostics/network-check-target-5d795: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:55.793732 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:55.793685 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d podName:8de53a64-cdc0-4735-a754-56ab12a8afc1 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:03.793668538 +0000 UTC m=+17.233209081 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-sss5d" (UniqueName: "kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d") pod "network-check-target-5d795" (UID: "8de53a64-cdc0-4735-a754-56ab12a8afc1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:11:56.147083 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:56.146997 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:11:56.147234 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:56.147138 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77" Apr 16 15:11:57.147781 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:57.147692 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:11:57.148165 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:57.147820 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d795" podUID="8de53a64-cdc0-4735-a754-56ab12a8afc1" Apr 16 15:11:58.146914 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:58.146881 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:11:58.147102 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:58.147033 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77" Apr 16 15:11:59.147184 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:11:59.147149 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:11:59.147622 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:11:59.147282 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d795" podUID="8de53a64-cdc0-4735-a754-56ab12a8afc1" Apr 16 15:12:00.146882 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:00.146853 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:12:00.147064 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:00.146965 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77" Apr 16 15:12:01.147309 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:01.147277 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:12:01.147761 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:01.147411 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d795" podUID="8de53a64-cdc0-4735-a754-56ab12a8afc1" Apr 16 15:12:02.146382 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:02.146295 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:12:02.146599 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:02.146411 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77" Apr 16 15:12:03.149211 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:03.149181 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:12:03.149608 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:03.149289 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d795" podUID="8de53a64-cdc0-4735-a754-56ab12a8afc1" Apr 16 15:12:03.604993 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:03.604908 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-ghhxd"] Apr 16 15:12:03.617658 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:03.617632 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:03.617770 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:03.617713 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ghhxd" podUID="6fac2453-74e6-4f70-8221-26f46efaa1a5" Apr 16 15:12:03.753542 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:03.753491 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6fac2453-74e6-4f70-8221-26f46efaa1a5-kubelet-config\") pod \"global-pull-secret-syncer-ghhxd\" (UID: \"6fac2453-74e6-4f70-8221-26f46efaa1a5\") " pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:03.753542 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:03.753533 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6fac2453-74e6-4f70-8221-26f46efaa1a5-dbus\") pod \"global-pull-secret-syncer-ghhxd\" (UID: \"6fac2453-74e6-4f70-8221-26f46efaa1a5\") " pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:03.753753 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:03.753559 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret\") pod \"global-pull-secret-syncer-ghhxd\" (UID: \"6fac2453-74e6-4f70-8221-26f46efaa1a5\") " pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:03.753753 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:03.753636 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs\") pod \"network-metrics-daemon-whwdh\" (UID: \"9973bf97-babd-47b9-a129-38dbed119c77\") " pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:12:03.753753 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:03.753746 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:12:03.753913 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:03.753806 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs podName:9973bf97-babd-47b9-a129-38dbed119c77 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:19.753788102 +0000 UTC m=+33.193328643 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs") pod "network-metrics-daemon-whwdh" (UID: "9973bf97-babd-47b9-a129-38dbed119c77") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:12:03.854766 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:03.854735 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sss5d\" (UniqueName: \"kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d\") pod \"network-check-target-5d795\" (UID: \"8de53a64-cdc0-4735-a754-56ab12a8afc1\") " pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:12:03.854947 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:03.854791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6fac2453-74e6-4f70-8221-26f46efaa1a5-kubelet-config\") pod \"global-pull-secret-syncer-ghhxd\" (UID: \"6fac2453-74e6-4f70-8221-26f46efaa1a5\") " pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:03.854947 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:03.854822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6fac2453-74e6-4f70-8221-26f46efaa1a5-dbus\") pod \"global-pull-secret-syncer-ghhxd\" (UID: \"6fac2453-74e6-4f70-8221-26f46efaa1a5\") " pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:03.854947 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:03.854850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret\") pod \"global-pull-secret-syncer-ghhxd\" (UID: \"6fac2453-74e6-4f70-8221-26f46efaa1a5\") " pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:03.854947 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:03.854866 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6fac2453-74e6-4f70-8221-26f46efaa1a5-kubelet-config\") pod \"global-pull-secret-syncer-ghhxd\" (UID: \"6fac2453-74e6-4f70-8221-26f46efaa1a5\") " pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:03.855175 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:03.854972 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:12:03.855175 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:03.854996 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:12:03.855175 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:03.855009 2575 projected.go:194] Error preparing data for projected volume kube-api-access-sss5d for pod openshift-network-diagnostics/network-check-target-5d795: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:03.855175 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:03.855033 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6fac2453-74e6-4f70-8221-26f46efaa1a5-dbus\") pod 
\"global-pull-secret-syncer-ghhxd\" (UID: \"6fac2453-74e6-4f70-8221-26f46efaa1a5\") " pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:03.855175 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:03.855041 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 15:12:03.855175 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:03.855067 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d podName:8de53a64-cdc0-4735-a754-56ab12a8afc1 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:19.855046232 +0000 UTC m=+33.294586772 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-sss5d" (UniqueName: "kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d") pod "network-check-target-5d795" (UID: "8de53a64-cdc0-4735-a754-56ab12a8afc1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:03.855175 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:03.855110 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret podName:6fac2453-74e6-4f70-8221-26f46efaa1a5 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:04.355093171 +0000 UTC m=+17.794633726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret") pod "global-pull-secret-syncer-ghhxd" (UID: "6fac2453-74e6-4f70-8221-26f46efaa1a5") : object "kube-system"/"original-pull-secret" not registered Apr 16 15:12:04.147059 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:04.146974 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:12:04.147229 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:04.147103 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77" Apr 16 15:12:04.359075 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:04.359042 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret\") pod \"global-pull-secret-syncer-ghhxd\" (UID: \"6fac2453-74e6-4f70-8221-26f46efaa1a5\") " pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:04.359530 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:04.359161 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 15:12:04.359530 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:04.359227 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret podName:6fac2453-74e6-4f70-8221-26f46efaa1a5 nodeName:}" failed. 
No retries permitted until 2026-04-16 15:12:05.359208316 +0000 UTC m=+18.798748889 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret") pod "global-pull-secret-syncer-ghhxd" (UID: "6fac2453-74e6-4f70-8221-26f46efaa1a5") : object "kube-system"/"original-pull-secret" not registered Apr 16 15:12:05.147460 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:05.147340 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:05.147652 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:05.147494 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ghhxd" podUID="6fac2453-74e6-4f70-8221-26f46efaa1a5" Apr 16 15:12:05.147747 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:05.147728 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:12:05.147864 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:05.147840 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d795" podUID="8de53a64-cdc0-4735-a754-56ab12a8afc1" Apr 16 15:12:05.365616 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:05.365584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret\") pod \"global-pull-secret-syncer-ghhxd\" (UID: \"6fac2453-74e6-4f70-8221-26f46efaa1a5\") " pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:05.366048 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:05.365755 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 15:12:05.366048 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:05.365830 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret podName:6fac2453-74e6-4f70-8221-26f46efaa1a5 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:07.365810204 +0000 UTC m=+20.805350852 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret") pod "global-pull-secret-syncer-ghhxd" (UID: "6fac2453-74e6-4f70-8221-26f46efaa1a5") : object "kube-system"/"original-pull-secret" not registered Apr 16 15:12:06.147351 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:06.147323 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:12:06.147502 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:06.147458 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77" Apr 16 15:12:07.148228 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.147434 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:12:07.148228 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.147435 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:07.148228 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:07.147985 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d795" podUID="8de53a64-cdc0-4735-a754-56ab12a8afc1" Apr 16 15:12:07.148228 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:07.148062 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ghhxd" podUID="6fac2453-74e6-4f70-8221-26f46efaa1a5" Apr 16 15:12:07.353706 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.353665 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" event={"ID":"bb22ca70-5a34-4a0e-92c0-b7efb8a7c371","Type":"ContainerStarted","Data":"9ef5d754989eda42cb0f8b5539e10c080065c000198d5e6a13178b561651bdc2"} Apr 16 15:12:07.355445 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.355371 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-npqcs" event={"ID":"2cca38e9-1e0a-4179-87b0-74cf0d052206","Type":"ContainerStarted","Data":"bc49c8b07019ce0b37ae32197615f3dc086c3cd85c8287c4f4fb10d820440bf1"} Apr 16 15:12:07.357156 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.357120 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zwlx7" event={"ID":"90e76464-4794-4a6f-bdfc-1010042e6181","Type":"ContainerStarted","Data":"bcdb76ff95404c93896eaea23db1548f5ef1ed4858cfe0eb36c8ff193417b275"} Apr 16 15:12:07.358670 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.358543 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k26s9" event={"ID":"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff","Type":"ContainerStarted","Data":"5848095866b42d75e0433435961b329d446ff50b928d2ce92a8d7589bd3fd15e"} Apr 16 15:12:07.359761 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.359741 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-thxgc" event={"ID":"b6b5389a-1cfb-46bd-bdee-65b24755f000","Type":"ContainerStarted","Data":"ad15fdce7167b2c8749812ea4d645634f7dc67a335367903c943890a4a365a49"} Apr 16 15:12:07.361282 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.361247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kjtwg" event={"ID":"74d85d78-e8d8-4b5c-a950-f65047122164","Type":"ContainerStarted","Data":"8f4caff48b5ea7150ed33a0810df60510e976736819a558f77b4dbdcc603f967"} Apr 16 15:12:07.363664 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.363649 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:12:07.363903 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.363886 2575 generic.go:358] "Generic (PLEG): container finished" podID="8c646c34-edd0-4bb7-ac77-7a47bafd421b" containerID="dfb813ad969318370424ebaba9b8608004ff1497a863c1394c804bb0e4730856" exitCode=1 Apr 16 15:12:07.363973 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.363952 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" event={"ID":"8c646c34-edd0-4bb7-ac77-7a47bafd421b","Type":"ContainerStarted","Data":"82e756c1b215f46717ae009575142fa62cf411c0203cf028f8efd1ae84f231d1"} Apr 16 15:12:07.364076 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.363978 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" event={"ID":"8c646c34-edd0-4bb7-ac77-7a47bafd421b","Type":"ContainerStarted","Data":"1455effe58ad2fbc0d09366934657671676bca2e11f0262064791fd3e23b8b94"} Apr 16 15:12:07.364076 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.363993 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" 
event={"ID":"8c646c34-edd0-4bb7-ac77-7a47bafd421b","Type":"ContainerStarted","Data":"805cf088c046729f2f199a2f0c3989a5ab59c5abe4beef5981a4adc88ff7d1ca"} Apr 16 15:12:07.364076 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.364014 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" event={"ID":"8c646c34-edd0-4bb7-ac77-7a47bafd421b","Type":"ContainerStarted","Data":"27413a618c9711f8d084605f093d25f51a69a85ff2a584598007594a7557e30e"} Apr 16 15:12:07.364076 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.364028 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" event={"ID":"8c646c34-edd0-4bb7-ac77-7a47bafd421b","Type":"ContainerDied","Data":"dfb813ad969318370424ebaba9b8608004ff1497a863c1394c804bb0e4730856"} Apr 16 15:12:07.364076 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.364041 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" event={"ID":"8c646c34-edd0-4bb7-ac77-7a47bafd421b","Type":"ContainerStarted","Data":"ff1ba6068d21cac9d04bc70ee0ecb0e8c83a92e9d005bea500a27774e895b209"} Apr 16 15:12:07.365007 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.364987 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" event={"ID":"5c24b1ab-a632-4534-bd41-8a371c1ea7a9","Type":"ContainerStarted","Data":"6b3239ede427d712ab1f05f096c17bae0edcab6bdb78d512ec395ae6f37d933f"} Apr 16 15:12:07.366592 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.366573 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal" event={"ID":"ec7d649cd375bc3470d0cc6b98df2f46","Type":"ContainerStarted","Data":"9bbbfa718bcb0256fd7515ab3ca87ef9b4513830c9efbb759db8d61a71efcf07"} Apr 16 15:12:07.368048 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.368029 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-254.ec2.internal" event={"ID":"840c483ab6972a395b52b57432ebf0a1","Type":"ContainerStarted","Data":"b43610204f1140da8e32e3635ceb400b2ca392463f7f5153c8f47d2146eb7c35"} Apr 16 15:12:07.380637 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.380616 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret\") pod \"global-pull-secret-syncer-ghhxd\" (UID: \"6fac2453-74e6-4f70-8221-26f46efaa1a5\") " pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:07.380737 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:07.380723 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 15:12:07.380802 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:07.380784 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret podName:6fac2453-74e6-4f70-8221-26f46efaa1a5 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:11.380766546 +0000 UTC m=+24.820307088 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret") pod "global-pull-secret-syncer-ghhxd" (UID: "6fac2453-74e6-4f70-8221-26f46efaa1a5") : object "kube-system"/"original-pull-secret" not registered Apr 16 15:12:07.438297 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.438247 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qpc9j" podStartSLOduration=2.343567685 podStartE2EDuration="20.438233471s" podCreationTimestamp="2026-04-16 15:11:47 +0000 UTC" firstStartedPulling="2026-04-16 15:11:48.429839849 +0000 UTC m=+1.869380386" lastFinishedPulling="2026-04-16 15:12:06.524505624 +0000 UTC m=+19.964046172" observedRunningTime="2026-04-16 15:12:07.410854377 +0000 UTC m=+20.850394938" watchObservedRunningTime="2026-04-16 15:12:07.438233471 +0000 UTC m=+20.877774066" Apr 16 15:12:07.529679 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.529500 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-254.ec2.internal" podStartSLOduration=19.529487744 podStartE2EDuration="19.529487744s" podCreationTimestamp="2026-04-16 15:11:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:12:07.476699943 +0000 UTC m=+20.916240502" watchObservedRunningTime="2026-04-16 15:12:07.529487744 +0000 UTC m=+20.969028321" Apr 16 15:12:07.618725 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.618676 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zwlx7" podStartSLOduration=2.4538982750000002 podStartE2EDuration="20.618659285s" podCreationTimestamp="2026-04-16 15:11:47 +0000 UTC" firstStartedPulling="2026-04-16 15:11:48.362049491 +0000 UTC m=+1.801590034" lastFinishedPulling="2026-04-16 15:12:06.526810482 +0000 UTC m=+19.966351044" observedRunningTime="2026-04-16 15:12:07.618584895 +0000 UTC m=+21.058125480" watchObservedRunningTime="2026-04-16 15:12:07.618659285 +0000 UTC m=+21.058199846" Apr 16 15:12:07.740177 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.740122 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-npqcs" podStartSLOduration=2.663704974 podStartE2EDuration="20.740105281s" podCreationTimestamp="2026-04-16 15:11:47 +0000 UTC" firstStartedPulling="2026-04-16 15:11:48.396276567 +0000 UTC m=+1.835817105" lastFinishedPulling="2026-04-16 15:12:06.472676861 +0000 UTC m=+19.912217412" observedRunningTime="2026-04-16 15:12:07.672645018 +0000 UTC m=+21.112185592" watchObservedRunningTime="2026-04-16 15:12:07.740105281 +0000 UTC m=+21.179645832" Apr 16 15:12:07.740764 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:07.740733 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kjtwg" podStartSLOduration=2.666355023 podStartE2EDuration="20.740719455s" podCreationTimestamp="2026-04-16 15:11:47 +0000 UTC" firstStartedPulling="2026-04-16 15:11:48.451893777 +0000 UTC m=+1.891434319" lastFinishedPulling="2026-04-16 15:12:06.526258208 +0000 UTC m=+19.965798751" observedRunningTime="2026-04-16 15:12:07.74021287 +0000 UTC m=+21.179753442" watchObservedRunningTime="2026-04-16 15:12:07.740719455 +0000 UTC m=+21.180260019" Apr 16 15:12:08.118736 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:08.118708 2575 
plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 15:12:08.147319 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:08.147295 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:12:08.147410 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:08.147392 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77" Apr 16 15:12:08.370937 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:08.370861 2575 generic.go:358] "Generic (PLEG): container finished" podID="df3ba38d-f8f3-45ad-90ec-e49f33bed1ff" containerID="5848095866b42d75e0433435961b329d446ff50b928d2ce92a8d7589bd3fd15e" exitCode=0 Apr 16 15:12:08.370937 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:08.370928 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k26s9" event={"ID":"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff","Type":"ContainerDied","Data":"5848095866b42d75e0433435961b329d446ff50b928d2ce92a8d7589bd3fd15e"} Apr 16 15:12:08.372566 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:08.372540 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" event={"ID":"5c24b1ab-a632-4534-bd41-8a371c1ea7a9","Type":"ContainerStarted","Data":"dd30edc93e5b7e8feffbb80ac342708323701e9b7ebd73359874c5155d3f6d64"} Apr 16 15:12:08.373994 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:08.373975 2575 generic.go:358] "Generic (PLEG): container finished" podID="ec7d649cd375bc3470d0cc6b98df2f46" containerID="9bbbfa718bcb0256fd7515ab3ca87ef9b4513830c9efbb759db8d61a71efcf07" exitCode=0 Apr 16 15:12:08.374056 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:08.374037 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal" event={"ID":"ec7d649cd375bc3470d0cc6b98df2f46","Type":"ContainerDied","Data":"9bbbfa718bcb0256fd7515ab3ca87ef9b4513830c9efbb759db8d61a71efcf07"} Apr 16 15:12:08.374094 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:08.374054 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal" event={"ID":"ec7d649cd375bc3470d0cc6b98df2f46","Type":"ContainerStarted","Data":"94d96028a715e69318cee88ca993b886de99c8f833ec3f09ca38893ea6ac848c"} Apr 16 15:12:08.375274 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:08.375241 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xtz8k" event={"ID":"60c8f519-fd4b-490b-8dd9-e26316afc045","Type":"ContainerStarted","Data":"303400b95eb2c87bc3254540170571f8d921da09e93e8c923be2fe26345c8966"} Apr 16 15:12:08.411245 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:08.411209 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-thxgc" podStartSLOduration=3.394358421 podStartE2EDuration="21.411197477s" podCreationTimestamp="2026-04-16 15:11:47 +0000 UTC" firstStartedPulling="2026-04-16 15:11:48.455805385 
+0000 UTC m=+1.895345923" lastFinishedPulling="2026-04-16 15:12:06.472644427 +0000 UTC m=+19.912184979" observedRunningTime="2026-04-16 15:12:07.77859152 +0000 UTC m=+21.218132079" watchObservedRunningTime="2026-04-16 15:12:08.411197477 +0000 UTC m=+21.850738035" Apr 16 15:12:08.519849 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:08.519810 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xtz8k" podStartSLOduration=3.377133521 podStartE2EDuration="21.519798195s" podCreationTimestamp="2026-04-16 15:11:47 +0000 UTC" firstStartedPulling="2026-04-16 15:11:48.381460382 +0000 UTC m=+1.821000921" lastFinishedPulling="2026-04-16 15:12:06.524125042 +0000 UTC m=+19.963665595" observedRunningTime="2026-04-16 15:12:08.461665357 +0000 UTC m=+21.901205917" watchObservedRunningTime="2026-04-16 15:12:08.519798195 +0000 UTC m=+21.959338754" Apr 16 15:12:08.710171 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:08.710099 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-npqcs" Apr 16 15:12:08.710911 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:08.710890 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-npqcs" Apr 16 15:12:08.743111 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:08.743054 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-254.ec2.internal" podStartSLOduration=20.743037327 podStartE2EDuration="20.743037327s" podCreationTimestamp="2026-04-16 15:11:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:12:08.51987887 +0000 UTC m=+21.959419430" watchObservedRunningTime="2026-04-16 15:12:08.743037327 +0000 UTC m=+22.182577888" Apr 16 15:12:09.087710 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:09.087606 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T15:12:08.118731759Z","UUID":"1e2c5f57-192d-4aef-ad4c-284856f26d4a","Handler":null,"Name":"","Endpoint":""} Apr 16 15:12:09.090331 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:09.089905 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 15:12:09.090331 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:09.089938 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 15:12:09.146824 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:09.146799 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:12:09.146939 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:09.146911 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5d795" podUID="8de53a64-cdc0-4735-a754-56ab12a8afc1" Apr 16 15:12:09.147397 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:09.147373 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:09.147541 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:09.147518 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ghhxd" podUID="6fac2453-74e6-4f70-8221-26f46efaa1a5" Apr 16 15:12:09.380062 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:09.379990 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:12:09.380499 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:09.380355 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" event={"ID":"8c646c34-edd0-4bb7-ac77-7a47bafd421b","Type":"ContainerStarted","Data":"cdd3c59512aa2016b7ce24df81d24968499bf92463dfd3fd519664398b308713"} Apr 16 15:12:09.382405 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:09.382346 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" event={"ID":"5c24b1ab-a632-4534-bd41-8a371c1ea7a9","Type":"ContainerStarted","Data":"8d114a3c55e0b8c8fc3fb14d0bb25a11a7338eb23533f150b6319ea63eaf112a"} Apr 16 15:12:09.382539 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:09.382523 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-npqcs" Apr 16 15:12:09.382995 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:09.382976 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-npqcs" Apr 16 15:12:09.484472 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:09.484408 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hj8x" podStartSLOduration=1.817107791 podStartE2EDuration="22.484396085s" podCreationTimestamp="2026-04-16 15:11:47 +0000 UTC" firstStartedPulling="2026-04-16 15:11:48.414891014 +0000 UTC m=+1.854431553" lastFinishedPulling="2026-04-16 15:12:09.082179291 +0000 UTC m=+22.521719847" observedRunningTime="2026-04-16 15:12:09.427142203 +0000 UTC m=+22.866682773" watchObservedRunningTime="2026-04-16 15:12:09.484396085 +0000 UTC m=+22.923936644" Apr 16 15:12:10.147189 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:10.147159 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:12:10.147357 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:10.147273 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77" Apr 16 15:12:11.146719 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:11.146689 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:12:11.147214 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:11.146813 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d795" podUID="8de53a64-cdc0-4735-a754-56ab12a8afc1" Apr 16 15:12:11.147214 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:11.146873 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:11.147214 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:11.146980 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ghhxd" podUID="6fac2453-74e6-4f70-8221-26f46efaa1a5" Apr 16 15:12:11.409773 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:11.409692 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret\") pod \"global-pull-secret-syncer-ghhxd\" (UID: \"6fac2453-74e6-4f70-8221-26f46efaa1a5\") " pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:11.409917 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:11.409829 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 15:12:11.409917 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:11.409891 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret podName:6fac2453-74e6-4f70-8221-26f46efaa1a5 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:19.409872584 +0000 UTC m=+32.849413130 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret") pod "global-pull-secret-syncer-ghhxd" (UID: "6fac2453-74e6-4f70-8221-26f46efaa1a5") : object "kube-system"/"original-pull-secret" not registered Apr 16 15:12:12.146715 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:12.146577 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:12:12.146845 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:12.146825 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77" Apr 16 15:12:12.390984 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:12.390963 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:12:12.391586 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:12.391562 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" event={"ID":"8c646c34-edd0-4bb7-ac77-7a47bafd421b","Type":"ContainerStarted","Data":"68b8b114e95ff90e55e6863351c1b123e0b12028d6d65711798a1d1365f9cc4c"} Apr 16 15:12:12.391864 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:12.391847 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:12:12.391996 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:12.391980 2575 scope.go:117] "RemoveContainer" containerID="dfb813ad969318370424ebaba9b8608004ff1497a863c1394c804bb0e4730856" Apr 16 15:12:12.393347 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:12.393327 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k26s9" event={"ID":"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff","Type":"ContainerStarted","Data":"53f6adf6dfa468db89f17b112f72ffeaf4454c7e11fb226e64462da9e11142bc"} Apr 16 15:12:12.406538 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:12.406522 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:12:13.147187 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:13.147151 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:13.147769 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:13.147185 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:12:13.147769 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:13.147278 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ghhxd" podUID="6fac2453-74e6-4f70-8221-26f46efaa1a5" Apr 16 15:12:13.147769 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:13.147392 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5d795" podUID="8de53a64-cdc0-4735-a754-56ab12a8afc1" Apr 16 15:12:13.397229 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:13.397135 2575 generic.go:358] "Generic (PLEG): container finished" podID="df3ba38d-f8f3-45ad-90ec-e49f33bed1ff" containerID="53f6adf6dfa468db89f17b112f72ffeaf4454c7e11fb226e64462da9e11142bc" exitCode=0 Apr 16 15:12:13.397229 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:13.397160 2575 generic.go:358] "Generic (PLEG): container finished" podID="df3ba38d-f8f3-45ad-90ec-e49f33bed1ff" containerID="d01bff8768ce1de73198f83cd09622d49050ee7a39d7cdc68f54de693a7d9aad" exitCode=0 Apr 16 15:12:13.397438 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:13.397225 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k26s9" event={"ID":"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff","Type":"ContainerDied","Data":"53f6adf6dfa468db89f17b112f72ffeaf4454c7e11fb226e64462da9e11142bc"} Apr 16 15:12:13.397438 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:13.397266 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k26s9" event={"ID":"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff","Type":"ContainerDied","Data":"d01bff8768ce1de73198f83cd09622d49050ee7a39d7cdc68f54de693a7d9aad"} Apr 16 15:12:13.400987 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:13.400971 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:12:13.401354 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:13.401330 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" event={"ID":"8c646c34-edd0-4bb7-ac77-7a47bafd421b","Type":"ContainerStarted","Data":"7dcbfe802adf03ad0a3977fd81ce5e5e8d0505f2e37cc08a6c14633927084a06"} Apr 16 15:12:13.401708 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:13.401689 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:12:13.401801 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:13.401715 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:12:13.421312 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:13.421289 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:12:13.521672 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:13.521618 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" podStartSLOduration=8.32315647 podStartE2EDuration="26.521604329s" podCreationTimestamp="2026-04-16 15:11:47 +0000 UTC" firstStartedPulling="2026-04-16 15:11:48.447144033 +0000 UTC m=+1.886684571" lastFinishedPulling="2026-04-16 15:12:06.64559189 +0000 UTC m=+20.085132430" observedRunningTime="2026-04-16 15:12:13.516533892 +0000 UTC m=+26.956074454" watchObservedRunningTime="2026-04-16 15:12:13.521604329 +0000 UTC m=+26.961144871" Apr 16 15:12:14.146961 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:14.146760 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:12:14.147109 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:14.147080 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77" Apr 16 15:12:14.407080 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:14.406945 2575 generic.go:358] "Generic (PLEG): container finished" podID="df3ba38d-f8f3-45ad-90ec-e49f33bed1ff" containerID="f9feac32467472d6a373501f2d3d0dc7b976eca9c1e1549fff16ac90ea2ba490" exitCode=0 Apr 16 15:12:14.407080 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:14.407020 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k26s9" event={"ID":"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff","Type":"ContainerDied","Data":"f9feac32467472d6a373501f2d3d0dc7b976eca9c1e1549fff16ac90ea2ba490"} Apr 16 15:12:14.432837 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:14.432817 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5d795"] Apr 16 15:12:14.432947 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:14.432938 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:12:14.433027 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:14.433009 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d795" podUID="8de53a64-cdc0-4735-a754-56ab12a8afc1" Apr 16 15:12:14.433327 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:14.433311 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-whwdh"] Apr 16 15:12:14.433385 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:14.433375 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:12:14.433481 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:14.433466 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77" Apr 16 15:12:14.439384 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:14.439365 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ghhxd"] Apr 16 15:12:14.439488 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:14.439473 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:14.439576 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:14.439558 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ghhxd" podUID="6fac2453-74e6-4f70-8221-26f46efaa1a5" Apr 16 15:12:16.146901 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:16.146829 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:16.147334 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:16.146829 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:12:16.147334 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:16.146928 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ghhxd" podUID="6fac2453-74e6-4f70-8221-26f46efaa1a5" Apr 16 15:12:16.147334 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:16.147001 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d795" podUID="8de53a64-cdc0-4735-a754-56ab12a8afc1" Apr 16 15:12:16.147334 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:16.146833 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:12:16.147334 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:16.147075 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77" Apr 16 15:12:18.146906 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:18.146873 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:18.147327 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:18.146957 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:12:18.147327 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:18.147052 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:12:18.147327 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:18.147079 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ghhxd" podUID="6fac2453-74e6-4f70-8221-26f46efaa1a5" Apr 16 15:12:18.147327 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:18.147136 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5d795" podUID="8de53a64-cdc0-4735-a754-56ab12a8afc1" Apr 16 15:12:18.147327 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:18.147210 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77" Apr 16 15:12:19.470154 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:19.470115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret\") pod \"global-pull-secret-syncer-ghhxd\" (UID: \"6fac2453-74e6-4f70-8221-26f46efaa1a5\") " pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:19.470607 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:19.470282 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 15:12:19.470607 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:19.470350 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret podName:6fac2453-74e6-4f70-8221-26f46efaa1a5 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:35.470324817 +0000 UTC m=+48.909865370 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret") pod "global-pull-secret-syncer-ghhxd" (UID: "6fac2453-74e6-4f70-8221-26f46efaa1a5") : object "kube-system"/"original-pull-secret" not registered Apr 16 15:12:19.773059 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:19.773023 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs\") pod \"network-metrics-daemon-whwdh\" (UID: \"9973bf97-babd-47b9-a129-38dbed119c77\") " pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:12:19.773270 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:19.773169 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:12:19.773270 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:19.773242 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs podName:9973bf97-babd-47b9-a129-38dbed119c77 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:51.773222321 +0000 UTC m=+65.212762865 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs") pod "network-metrics-daemon-whwdh" (UID: "9973bf97-babd-47b9-a129-38dbed119c77") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 15:12:19.874298 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:19.874266 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sss5d\" (UniqueName: \"kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d\") pod \"network-check-target-5d795\" (UID: \"8de53a64-cdc0-4735-a754-56ab12a8afc1\") " pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:12:19.874487 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:19.874414 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 15:12:19.874487 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:19.874451 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 15:12:19.874487 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:19.874463 2575 projected.go:194] Error preparing data for projected volume kube-api-access-sss5d for pod openshift-network-diagnostics/network-check-target-5d795: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:19.874652 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:19.874524 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d podName:8de53a64-cdc0-4735-a754-56ab12a8afc1 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:51.874504762 +0000 UTC m=+65.314045315 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sss5d" (UniqueName: "kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d") pod "network-check-target-5d795" (UID: "8de53a64-cdc0-4735-a754-56ab12a8afc1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 15:12:19.899188 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:19.899169 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-254.ec2.internal" event="NodeReady" Apr 16 15:12:19.899310 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:19.899302 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 15:12:20.001646 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.001620 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jrg65"] Apr 16 15:12:20.020469 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.020447 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jrg65" Apr 16 15:12:20.024545 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.024486 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pm67f"] Apr 16 15:12:20.026736 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.026713 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 15:12:20.026838 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.026780 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 15:12:20.026838 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.026794 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 15:12:20.026923 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.026789 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rc5hb\"" Apr 16 15:12:20.036447 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.036414 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jrg65"] Apr 16 15:12:20.036540 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.036527 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:20.042606 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.042592 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xrkdr\"" Apr 16 15:12:20.042791 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.042779 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 15:12:20.047848 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.047834 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 15:12:20.058432 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.058401 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pm67f"] Apr 16 15:12:20.146567 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.146543 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:20.146696 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.146582 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:12:20.146790 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.146772 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:12:20.153464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.153448 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 15:12:20.154519 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.154502 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5mf5n\"" Apr 16 15:12:20.154587 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.154502 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 15:12:20.154656 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.154642 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 15:12:20.154693 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.154642 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 15:12:20.162321 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.162306 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xkvgm\"" Apr 16 15:12:20.175584 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.175565 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rstd5\" (UniqueName: \"kubernetes.io/projected/29f4a0db-de17-476d-97ad-df37fd2a5065-kube-api-access-rstd5\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:20.175683 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.175632 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29f4a0db-de17-476d-97ad-df37fd2a5065-config-volume\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:20.175683 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.175659 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:20.175791 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.175685 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmgpw\" (UniqueName: \"kubernetes.io/projected/d01959bc-3d04-456b-9dbe-ea153e10fa05-kube-api-access-zmgpw\") pod \"ingress-canary-jrg65\" (UID: \"d01959bc-3d04-456b-9dbe-ea153e10fa05\") " pod="openshift-ingress-canary/ingress-canary-jrg65" Apr 16 15:12:20.175791 ip-10-0-129-254 kubenswrapper[2575]: I0416 
15:12:20.175720 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/29f4a0db-de17-476d-97ad-df37fd2a5065-tmp-dir\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:20.175791 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.175776 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert\") pod \"ingress-canary-jrg65\" (UID: \"d01959bc-3d04-456b-9dbe-ea153e10fa05\") " pod="openshift-ingress-canary/ingress-canary-jrg65" Apr 16 15:12:20.276363 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.276309 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert\") pod \"ingress-canary-jrg65\" (UID: \"d01959bc-3d04-456b-9dbe-ea153e10fa05\") " pod="openshift-ingress-canary/ingress-canary-jrg65" Apr 16 15:12:20.276363 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.276339 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rstd5\" (UniqueName: \"kubernetes.io/projected/29f4a0db-de17-476d-97ad-df37fd2a5065-kube-api-access-rstd5\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:20.276532 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:20.276461 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 15:12:20.276532 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.276489 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29f4a0db-de17-476d-97ad-df37fd2a5065-config-volume\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:20.276532 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:20.276514 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert podName:d01959bc-3d04-456b-9dbe-ea153e10fa05 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:20.776495272 +0000 UTC m=+34.216035812 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert") pod "ingress-canary-jrg65" (UID: "d01959bc-3d04-456b-9dbe-ea153e10fa05") : secret "canary-serving-cert" not found Apr 16 15:12:20.276680 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.276546 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:20.276680 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.276582 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmgpw\" (UniqueName: \"kubernetes.io/projected/d01959bc-3d04-456b-9dbe-ea153e10fa05-kube-api-access-zmgpw\") pod \"ingress-canary-jrg65\" (UID: \"d01959bc-3d04-456b-9dbe-ea153e10fa05\") " pod="openshift-ingress-canary/ingress-canary-jrg65" Apr 16 15:12:20.276680 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:20.276648 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 15:12:20.276815 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.276712 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/29f4a0db-de17-476d-97ad-df37fd2a5065-tmp-dir\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:20.276815 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:20.276766 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls podName:29f4a0db-de17-476d-97ad-df37fd2a5065 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:20.776742048 +0000 UTC m=+34.216282588 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls") pod "dns-default-pm67f" (UID: "29f4a0db-de17-476d-97ad-df37fd2a5065") : secret "dns-default-metrics-tls" not found Apr 16 15:12:20.276959 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.276944 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/29f4a0db-de17-476d-97ad-df37fd2a5065-tmp-dir\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:20.277044 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.277027 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29f4a0db-de17-476d-97ad-df37fd2a5065-config-volume\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:20.300020 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.299874 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rstd5\" (UniqueName: \"kubernetes.io/projected/29f4a0db-de17-476d-97ad-df37fd2a5065-kube-api-access-rstd5\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:20.300103 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.300055 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmgpw\" (UniqueName: \"kubernetes.io/projected/d01959bc-3d04-456b-9dbe-ea153e10fa05-kube-api-access-zmgpw\") pod \"ingress-canary-jrg65\" (UID: \"d01959bc-3d04-456b-9dbe-ea153e10fa05\") " pod="openshift-ingress-canary/ingress-canary-jrg65" Apr 16 15:12:20.780272 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.780240 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert\") pod \"ingress-canary-jrg65\" (UID: \"d01959bc-3d04-456b-9dbe-ea153e10fa05\") " pod="openshift-ingress-canary/ingress-canary-jrg65" Apr 16 15:12:20.780623 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:20.780303 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:20.780623 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:20.780378 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 15:12:20.780623 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:20.780453 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert podName:d01959bc-3d04-456b-9dbe-ea153e10fa05 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:21.780440113 +0000 UTC m=+35.219980655 (durationBeforeRetry 1s). 
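
The failures above all follow the same reconciler cycle: reconciler_common.go:251 records VerifyControllerAttachedVolume, reconciler_common.go:224 starts MountVolume, and operation_generator.go:615 logs "SetUp succeeded" once the backing object is readable. Two distinct failure modes appear in this capture: "not registered" (the kubelet's object cache for that namespace has not synced yet; it clears once the matching "Caches populated" line appears) and "not found" (the Secret genuinely does not exist yet, typically because the owning operator has not created it). The sketch below is an illustrative client-go check for the second case, run from outside the node; the namespace/name pairs are taken from the errors above, while the kubeconfig path is an assumption, not something this log specifies.

// secretcheck.go: an illustrative check (not part of kubelet) for whether
// the Secrets this log is waiting on exist yet.
package main

import (
	"context"
	"fmt"
	"os"
	"path/filepath"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: a kubeconfig with read access at $HOME/.kube/config.
	cfg, err := clientcmd.BuildConfigFromFlags("", filepath.Join(os.Getenv("HOME"), ".kube", "config"))
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Namespace/name pairs copied from the MountVolume errors above.
	for _, ref := range []struct{ ns, name string }{
		{"openshift-ingress-canary", "canary-serving-cert"},
		{"openshift-dns", "dns-default-metrics-tls"},
	} {
		if _, err := cs.CoreV1().Secrets(ref.ns).Get(context.TODO(), ref.name, metav1.GetOptions{}); err != nil {
			// Prints "not found" while the owning operator is still starting up.
			fmt.Printf("%s/%s: %v\n", ref.ns, ref.name, err)
			continue
		}
		fmt.Printf("%s/%s: present\n", ref.ns, ref.name)
	}
}

The same check is quicker interactively, e.g. oc get secret canary-serving-cert -n openshift-ingress-canary.
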
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert") pod "ingress-canary-jrg65" (UID: "d01959bc-3d04-456b-9dbe-ea153e10fa05") : secret "canary-serving-cert" not found Apr 16 15:12:20.780623 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:20.780380 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 15:12:20.780623 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:20.780510 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls podName:29f4a0db-de17-476d-97ad-df37fd2a5065 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:21.78049752 +0000 UTC m=+35.220038059 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls") pod "dns-default-pm67f" (UID: "29f4a0db-de17-476d-97ad-df37fd2a5065") : secret "dns-default-metrics-tls" not found Apr 16 15:12:21.423432 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:21.423380 2575 generic.go:358] "Generic (PLEG): container finished" podID="df3ba38d-f8f3-45ad-90ec-e49f33bed1ff" containerID="b47ba4df5337c7e39bbf2f597cbbd95155ae2e1b4556d1e268858832f125255b" exitCode=0 Apr 16 15:12:21.423586 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:21.423456 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k26s9" event={"ID":"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff","Type":"ContainerDied","Data":"b47ba4df5337c7e39bbf2f597cbbd95155ae2e1b4556d1e268858832f125255b"} Apr 16 15:12:21.787922 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:21.787890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert\") pod \"ingress-canary-jrg65\" (UID: \"d01959bc-3d04-456b-9dbe-ea153e10fa05\") " pod="openshift-ingress-canary/ingress-canary-jrg65" Apr 16 15:12:21.788325 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:21.787946 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:21.788325 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:21.788036 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 15:12:21.788325 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:21.788036 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 15:12:21.788325 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:21.788084 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls podName:29f4a0db-de17-476d-97ad-df37fd2a5065 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:23.788072194 +0000 UTC m=+37.227612733 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls") pod "dns-default-pm67f" (UID: "29f4a0db-de17-476d-97ad-df37fd2a5065") : secret "dns-default-metrics-tls" not found Apr 16 15:12:21.788325 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:21.788097 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert podName:d01959bc-3d04-456b-9dbe-ea153e10fa05 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:23.788091618 +0000 UTC m=+37.227632156 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert") pod "ingress-canary-jrg65" (UID: "d01959bc-3d04-456b-9dbe-ea153e10fa05") : secret "canary-serving-cert" not found Apr 16 15:12:22.428065 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:22.428035 2575 generic.go:358] "Generic (PLEG): container finished" podID="df3ba38d-f8f3-45ad-90ec-e49f33bed1ff" containerID="a87fdc929a7f75d18cf704f81f6db5e76d9601066a56b5833d05af4e39c9674d" exitCode=0 Apr 16 15:12:22.428217 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:22.428074 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k26s9" event={"ID":"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff","Type":"ContainerDied","Data":"a87fdc929a7f75d18cf704f81f6db5e76d9601066a56b5833d05af4e39c9674d"} Apr 16 15:12:23.432218 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:23.432190 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k26s9" event={"ID":"df3ba38d-f8f3-45ad-90ec-e49f33bed1ff","Type":"ContainerStarted","Data":"558b9e56164262dd73304ce5e15ee7a1ae0cc910fa6a89ddaa439ac37064211c"} Apr 16 15:12:23.504034 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:23.503995 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-k26s9" podStartSLOduration=4.522517616 podStartE2EDuration="36.503980807s" podCreationTimestamp="2026-04-16 15:11:47 +0000 UTC" firstStartedPulling="2026-04-16 15:11:48.33539753 +0000 UTC m=+1.774938069" lastFinishedPulling="2026-04-16 15:12:20.316860722 +0000 UTC m=+33.756401260" observedRunningTime="2026-04-16 15:12:23.502579373 +0000 UTC m=+36.942119934" watchObservedRunningTime="2026-04-16 15:12:23.503980807 +0000 UTC m=+36.943521366" Apr 16 15:12:23.801396 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:23.801373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:23.801501 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:23.801440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert\") pod \"ingress-canary-jrg65\" (UID: \"d01959bc-3d04-456b-9dbe-ea153e10fa05\") " pod="openshift-ingress-canary/ingress-canary-jrg65" Apr 16 15:12:23.801546 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:23.801510 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 15:12:23.801625 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:23.801568 
2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls podName:29f4a0db-de17-476d-97ad-df37fd2a5065 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:27.801552876 +0000 UTC m=+41.241093419 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls") pod "dns-default-pm67f" (UID: "29f4a0db-de17-476d-97ad-df37fd2a5065") : secret "dns-default-metrics-tls" not found Apr 16 15:12:23.801675 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:23.801511 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 15:12:23.801675 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:23.801670 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert podName:d01959bc-3d04-456b-9dbe-ea153e10fa05 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:27.801656684 +0000 UTC m=+41.241197236 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert") pod "ingress-canary-jrg65" (UID: "d01959bc-3d04-456b-9dbe-ea153e10fa05") : secret "canary-serving-cert" not found Apr 16 15:12:27.827893 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:27.827861 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert\") pod \"ingress-canary-jrg65\" (UID: \"d01959bc-3d04-456b-9dbe-ea153e10fa05\") " pod="openshift-ingress-canary/ingress-canary-jrg65" Apr 16 15:12:27.828342 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:27.827916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:27.828342 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:27.828017 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 15:12:27.828342 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:27.828074 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert podName:d01959bc-3d04-456b-9dbe-ea153e10fa05 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:35.828060136 +0000 UTC m=+49.267600678 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert") pod "ingress-canary-jrg65" (UID: "d01959bc-3d04-456b-9dbe-ea153e10fa05") : secret "canary-serving-cert" not found Apr 16 15:12:27.828342 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:27.828016 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 15:12:27.828342 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:27.828141 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls podName:29f4a0db-de17-476d-97ad-df37fd2a5065 nodeName:}" failed. 
No retries permitted until 2026-04-16 15:12:35.828130266 +0000 UTC m=+49.267670803 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls") pod "dns-default-pm67f" (UID: "29f4a0db-de17-476d-97ad-df37fd2a5065") : secret "dns-default-metrics-tls" not found Apr 16 15:12:35.478497 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:35.478462 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret\") pod \"global-pull-secret-syncer-ghhxd\" (UID: \"6fac2453-74e6-4f70-8221-26f46efaa1a5\") " pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:35.481090 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:35.481066 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6fac2453-74e6-4f70-8221-26f46efaa1a5-original-pull-secret\") pod \"global-pull-secret-syncer-ghhxd\" (UID: \"6fac2453-74e6-4f70-8221-26f46efaa1a5\") " pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:35.756137 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:35.756066 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ghhxd" Apr 16 15:12:35.880909 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:35.880881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:35.881043 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:35.880940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert\") pod \"ingress-canary-jrg65\" (UID: \"d01959bc-3d04-456b-9dbe-ea153e10fa05\") " pod="openshift-ingress-canary/ingress-canary-jrg65" Apr 16 15:12:35.881104 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:35.881058 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 15:12:35.881104 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:35.881082 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 15:12:35.881203 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:35.881113 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert podName:d01959bc-3d04-456b-9dbe-ea153e10fa05 nodeName:}" failed. No retries permitted until 2026-04-16 15:12:51.881097367 +0000 UTC m=+65.320637910 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert") pod "ingress-canary-jrg65" (UID: "d01959bc-3d04-456b-9dbe-ea153e10fa05") : secret "canary-serving-cert" not found Apr 16 15:12:35.881203 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:35.881168 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls podName:29f4a0db-de17-476d-97ad-df37fd2a5065 nodeName:}" failed. 
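
The durationBeforeRetry values trace the kubelet's per-operation exponential backoff: 500ms, 1s, 2s, 4s and 8s so far, with 16s, 32s, 1m4s and finally 2m2s appearing further down; 2m2s is the largest interval anywhere in this capture, i.e. the schedule stops doubling there. Below is a minimal sketch of that schedule, with the constants read off this log rather than taken from kubelet source.

package main

import (
	"fmt"
	"time"
)

// Capped exponential backoff matching the durationBeforeRetry sequence in
// this log: start at 500ms, double per failed attempt, stop growing at 2m2s.
func backoff(initial, max time.Duration) func() time.Duration {
	d := initial
	return func() time.Duration {
		cur := d
		if d < max {
			d *= 2
			if d > max {
				d = max
			}
		}
		return cur
	}
}

func main() {
	next := backoff(500*time.Millisecond, 2*time.Minute+2*time.Second)
	for i := 0; i < 9; i++ {
		fmt.Println(next()) // 500ms 1s 2s 4s 8s 16s 32s 1m4s 2m2s
	}
}
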
No retries permitted until 2026-04-16 15:12:51.881144697 +0000 UTC m=+65.320685252 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls") pod "dns-default-pm67f" (UID: "29f4a0db-de17-476d-97ad-df37fd2a5065") : secret "dns-default-metrics-tls" not found Apr 16 15:12:35.913481 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:35.913452 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ghhxd"] Apr 16 15:12:35.917187 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:12:35.917158 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fac2453_74e6_4f70_8221_26f46efaa1a5.slice/crio-63865f1ea5fa4a83735b09fa656cc5882811ef61cac290285c7e9198d1336b89 WatchSource:0}: Error finding container 63865f1ea5fa4a83735b09fa656cc5882811ef61cac290285c7e9198d1336b89: Status 404 returned error can't find the container with id 63865f1ea5fa4a83735b09fa656cc5882811ef61cac290285c7e9198d1336b89 Apr 16 15:12:36.456033 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:36.456002 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ghhxd" event={"ID":"6fac2453-74e6-4f70-8221-26f46efaa1a5","Type":"ContainerStarted","Data":"63865f1ea5fa4a83735b09fa656cc5882811ef61cac290285c7e9198d1336b89"} Apr 16 15:12:40.464335 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:40.464296 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ghhxd" event={"ID":"6fac2453-74e6-4f70-8221-26f46efaa1a5","Type":"ContainerStarted","Data":"e832db88d27893d1f52e9657f1d6ffaaf83571b0d774ebc6d3df4a03e9d134fb"} Apr 16 15:12:40.485109 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:40.485065 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-ghhxd" podStartSLOduration=33.285296746 podStartE2EDuration="37.485052937s" podCreationTimestamp="2026-04-16 15:12:03 +0000 UTC" firstStartedPulling="2026-04-16 15:12:35.919152748 +0000 UTC m=+49.358693285" lastFinishedPulling="2026-04-16 15:12:40.118908938 +0000 UTC m=+53.558449476" observedRunningTime="2026-04-16 15:12:40.484775479 +0000 UTC m=+53.924316038" watchObservedRunningTime="2026-04-16 15:12:40.485052937 +0000 UTC m=+53.924593475" Apr 16 15:12:45.418961 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:45.418931 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8dtn7" Apr 16 15:12:51.785500 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:51.785458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs\") pod \"network-metrics-daemon-whwdh\" (UID: \"9973bf97-babd-47b9-a129-38dbed119c77\") " pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:12:51.788495 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:51.788477 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 15:12:51.796030 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:51.796013 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 15:12:51.796104 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:51.796077 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs podName:9973bf97-babd-47b9-a129-38dbed119c77 nodeName:}" failed. No retries permitted until 2026-04-16 15:13:55.796059345 +0000 UTC m=+129.235599882 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs") pod "network-metrics-daemon-whwdh" (UID: "9973bf97-babd-47b9-a129-38dbed119c77") : secret "metrics-daemon-secret" not found Apr 16 15:12:51.885962 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:51.885938 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert\") pod \"ingress-canary-jrg65\" (UID: \"d01959bc-3d04-456b-9dbe-ea153e10fa05\") " pod="openshift-ingress-canary/ingress-canary-jrg65" Apr 16 15:12:51.886063 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:51.885967 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sss5d\" (UniqueName: \"kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d\") pod \"network-check-target-5d795\" (UID: \"8de53a64-cdc0-4735-a754-56ab12a8afc1\") " pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:12:51.886063 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:51.885994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:12:51.886140 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:51.886073 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 15:12:51.886140 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:51.886108 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls podName:29f4a0db-de17-476d-97ad-df37fd2a5065 nodeName:}" failed. No retries permitted until 2026-04-16 15:13:23.886097649 +0000 UTC m=+97.325638188 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls") pod "dns-default-pm67f" (UID: "29f4a0db-de17-476d-97ad-df37fd2a5065") : secret "dns-default-metrics-tls" not found Apr 16 15:12:51.886225 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:51.886072 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 15:12:51.886225 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:12:51.886185 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert podName:d01959bc-3d04-456b-9dbe-ea153e10fa05 nodeName:}" failed. No retries permitted until 2026-04-16 15:13:23.886171634 +0000 UTC m=+97.325712172 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert") pod "ingress-canary-jrg65" (UID: "d01959bc-3d04-456b-9dbe-ea153e10fa05") : secret "canary-serving-cert" not found Apr 16 15:12:51.888981 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:51.888967 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 15:12:51.899674 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:51.899657 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 15:12:51.909429 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:51.909392 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sss5d\" (UniqueName: \"kubernetes.io/projected/8de53a64-cdc0-4735-a754-56ab12a8afc1-kube-api-access-sss5d\") pod \"network-check-target-5d795\" (UID: \"8de53a64-cdc0-4735-a754-56ab12a8afc1\") " pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:12:51.968925 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:51.968906 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xkvgm\"" Apr 16 15:12:51.977244 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:51.977229 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:12:52.088330 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:52.088299 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5d795"] Apr 16 15:12:52.091519 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:12:52.091488 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8de53a64_cdc0_4735_a754_56ab12a8afc1.slice/crio-42af474779c2eda55d2cd262a2605b35eeb1c1fd29dec3d83d53877a5622d73a WatchSource:0}: Error finding container 42af474779c2eda55d2cd262a2605b35eeb1c1fd29dec3d83d53877a5622d73a: Status 404 returned error can't find the container with id 42af474779c2eda55d2cd262a2605b35eeb1c1fd29dec3d83d53877a5622d73a Apr 16 15:12:52.484554 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:52.484521 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5d795" event={"ID":"8de53a64-cdc0-4735-a754-56ab12a8afc1","Type":"ContainerStarted","Data":"42af474779c2eda55d2cd262a2605b35eeb1c1fd29dec3d83d53877a5622d73a"} Apr 16 15:12:55.492282 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:55.492246 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5d795" event={"ID":"8de53a64-cdc0-4735-a754-56ab12a8afc1","Type":"ContainerStarted","Data":"06d3bd39dab5d978c1ea36000a49b653311dfbef6bd3e21d41528ebebb135c91"} Apr 16 15:12:55.492657 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:55.492404 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:12:55.511865 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:12:55.511816 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5d795" podStartSLOduration=65.887747915 podStartE2EDuration="1m8.511801061s" 
podCreationTimestamp="2026-04-16 15:11:47 +0000 UTC" firstStartedPulling="2026-04-16 15:12:52.093287146 +0000 UTC m=+65.532827691" lastFinishedPulling="2026-04-16 15:12:54.717340295 +0000 UTC m=+68.156880837" observedRunningTime="2026-04-16 15:12:55.511444244 +0000 UTC m=+68.950984802" watchObservedRunningTime="2026-04-16 15:12:55.511801061 +0000 UTC m=+68.951341622" Apr 16 15:13:23.896974 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:13:23.896939 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert\") pod \"ingress-canary-jrg65\" (UID: \"d01959bc-3d04-456b-9dbe-ea153e10fa05\") " pod="openshift-ingress-canary/ingress-canary-jrg65" Apr 16 15:13:23.897362 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:13:23.896985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f" Apr 16 15:13:23.897362 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:13:23.897077 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 15:13:23.897362 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:13:23.897078 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 15:13:23.897362 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:13:23.897136 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert podName:d01959bc-3d04-456b-9dbe-ea153e10fa05 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:27.897121405 +0000 UTC m=+161.336661942 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert") pod "ingress-canary-jrg65" (UID: "d01959bc-3d04-456b-9dbe-ea153e10fa05") : secret "canary-serving-cert" not found Apr 16 15:13:23.897362 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:13:23.897150 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls podName:29f4a0db-de17-476d-97ad-df37fd2a5065 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:27.897144631 +0000 UTC m=+161.336685169 (durationBeforeRetry 1m4s). 
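
The pod_startup_latency_tracker entries above carry enough fields to check the bookkeeping: podStartSLOduration is podStartE2EDuration minus the image-pull window (lastFinishedPulling - firstStartedPulling). For network-check-target-5d795: 1m8.5118s - 2.6241s ≈ 1m5.8877s, matching the logged 65.887747915s; the same holds for multus-additional-cni-plugins-k26s9 earlier (36.504s - 31.981s ≈ 4.523s). A small sketch of the subtraction, with the timestamp strings copied from the log and the monotonic "m=+…" suffix stripped before parsing:

package main

import (
	"fmt"
	"strings"
	"time"
)

func parse(s string) time.Time {
	// Drop the monotonic-clock suffix ("m=+68.95...") the kubelet appends.
	s = strings.Split(s, " m=")[0]
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := parse("2026-04-16 15:11:47 +0000 UTC")
	firstPull := parse("2026-04-16 15:12:52.093287146 +0000 UTC m=+65.532827691")
	lastPull := parse("2026-04-16 15:12:54.717340295 +0000 UTC m=+68.156880837")
	observed := parse("2026-04-16 15:12:55.511801061 +0000 UTC m=+68.951341622")

	e2e := observed.Sub(created)         // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // excludes image-pull time
	// Prints 1m8.511801061s 1m5.887747912s; within a few nanoseconds of the
	// logged podStartSLOduration (separate clock reads inside the kubelet).
	fmt.Println(e2e, slo)
}
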
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls") pod "dns-default-pm67f" (UID: "29f4a0db-de17-476d-97ad-df37fd2a5065") : secret "dns-default-metrics-tls" not found Apr 16 15:13:26.495659 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:13:26.495630 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5d795" Apr 16 15:13:55.808946 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:13:55.808907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs\") pod \"network-metrics-daemon-whwdh\" (UID: \"9973bf97-babd-47b9-a129-38dbed119c77\") " pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:13:55.809407 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:13:55.809042 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 15:13:55.809407 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:13:55.809110 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs podName:9973bf97-babd-47b9-a129-38dbed119c77 nodeName:}" failed. No retries permitted until 2026-04-16 15:15:57.809094826 +0000 UTC m=+251.248635363 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs") pod "network-metrics-daemon-whwdh" (UID: "9973bf97-babd-47b9-a129-38dbed119c77") : secret "metrics-daemon-secret" not found Apr 16 15:14:06.203889 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.203852 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf"] Apr 16 15:14:06.206467 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.206452 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf" Apr 16 15:14:06.212326 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.212305 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 15:14:06.213762 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.213716 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 15:14:06.214258 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.213873 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-x6fs6\"" Apr 16 15:14:06.214258 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.213939 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 15:14:06.214742 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.214651 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 15:14:06.215191 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.215168 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf"] Apr 16 15:14:06.276057 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.276030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jdrgf\" (UID: \"68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf" Apr 16 15:14:06.276179 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.276087 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-jdrgf\" (UID: \"68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf" Apr 16 15:14:06.276179 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.276129 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcxhc\" (UniqueName: \"kubernetes.io/projected/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-kube-api-access-zcxhc\") pod \"cluster-monitoring-operator-6667474d89-jdrgf\" (UID: \"68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf" Apr 16 15:14:06.311662 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.311631 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf"] Apr 16 15:14:06.314516 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.314501 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf" Apr 16 15:14:06.317113 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.317095 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-t5pl2\"" Apr 16 15:14:06.317195 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.317095 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 15:14:06.317649 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.317633 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 15:14:06.317733 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.317705 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 15:14:06.326463 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.326435 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf"] Apr 16 15:14:06.376550 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.376520 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jdrgf\" (UID: \"68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf" Apr 16 15:14:06.376666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.376588 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-jdrgf\" (UID: \"68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf" Apr 16 15:14:06.376666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.376618 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcxhc\" (UniqueName: \"kubernetes.io/projected/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-kube-api-access-zcxhc\") pod \"cluster-monitoring-operator-6667474d89-jdrgf\" (UID: \"68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf" Apr 16 15:14:06.376666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.376647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6gwdf\" (UID: \"18d5517b-eb91-42ce-9bcb-9f7559f1c3b6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf" Apr 16 15:14:06.376666 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:06.376660 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 15:14:06.376854 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.376677 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8jnwr\" (UniqueName: \"kubernetes.io/projected/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-kube-api-access-8jnwr\") pod \"cluster-samples-operator-667775844f-6gwdf\" (UID: \"18d5517b-eb91-42ce-9bcb-9f7559f1c3b6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf" Apr 16 15:14:06.376854 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:06.376759 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls podName:68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e nodeName:}" failed. No retries permitted until 2026-04-16 15:14:06.876739009 +0000 UTC m=+140.316279563 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jdrgf" (UID: "68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e") : secret "cluster-monitoring-operator-tls" not found Apr 16 15:14:06.377228 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.377211 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-jdrgf\" (UID: \"68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf" Apr 16 15:14:06.385182 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.385165 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcxhc\" (UniqueName: \"kubernetes.io/projected/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-kube-api-access-zcxhc\") pod \"cluster-monitoring-operator-6667474d89-jdrgf\" (UID: \"68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf" Apr 16 15:14:06.405520 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.405501 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6b4dff87c-np7zj"] Apr 16 15:14:06.408135 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.408122 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6b4dff87c-np7zj" Apr 16 15:14:06.414466 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.414443 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-kfj7d\"" Apr 16 15:14:06.414565 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.414443 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 15:14:06.414565 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.414512 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 15:14:06.414565 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.414452 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 15:14:06.419438 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.419406 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 15:14:06.425318 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.425300 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b4dff87c-np7zj"] Apr 16 15:14:06.477313 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.477239 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/33e69575-fca9-4bac-82da-e821111ee7d3-ca-trust-extracted\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj" Apr 16 15:14:06.477313 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.477309 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6gwdf\" (UID: \"18d5517b-eb91-42ce-9bcb-9f7559f1c3b6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf" Apr 16 15:14:06.477515 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.477338 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33e69575-fca9-4bac-82da-e821111ee7d3-trusted-ca\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj" Apr 16 15:14:06.477515 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.477362 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/33e69575-fca9-4bac-82da-e821111ee7d3-installation-pull-secrets\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj" Apr 16 15:14:06.477515 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.477396 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jnwr\" (UniqueName: \"kubernetes.io/projected/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-kube-api-access-8jnwr\") pod \"cluster-samples-operator-667775844f-6gwdf\" (UID: 
\"18d5517b-eb91-42ce-9bcb-9f7559f1c3b6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf" Apr 16 15:14:06.477515 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.477448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/33e69575-fca9-4bac-82da-e821111ee7d3-image-registry-private-configuration\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj" Apr 16 15:14:06.477515 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:06.477452 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 15:14:06.477515 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.477477 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-bound-sa-token\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj" Apr 16 15:14:06.477749 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.477516 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnpth\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-kube-api-access-qnpth\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj" Apr 16 15:14:06.477749 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:06.477546 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls podName:18d5517b-eb91-42ce-9bcb-9f7559f1c3b6 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:06.977530731 +0000 UTC m=+140.417071274 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls") pod "cluster-samples-operator-667775844f-6gwdf" (UID: "18d5517b-eb91-42ce-9bcb-9f7559f1c3b6") : secret "samples-operator-tls" not found
Apr 16 15:14:06.477749 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.477568 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:06.477749 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.477609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/33e69575-fca9-4bac-82da-e821111ee7d3-registry-certificates\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:06.487860 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.487844 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jnwr\" (UniqueName: \"kubernetes.io/projected/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-kube-api-access-8jnwr\") pod \"cluster-samples-operator-667775844f-6gwdf\" (UID: \"18d5517b-eb91-42ce-9bcb-9f7559f1c3b6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf"
Apr 16 15:14:06.578454 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.578408 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/33e69575-fca9-4bac-82da-e821111ee7d3-ca-trust-extracted\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:06.578555 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.578494 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33e69575-fca9-4bac-82da-e821111ee7d3-trusted-ca\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:06.578555 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.578512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/33e69575-fca9-4bac-82da-e821111ee7d3-installation-pull-secrets\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:06.578555 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.578532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/33e69575-fca9-4bac-82da-e821111ee7d3-image-registry-private-configuration\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:06.578555 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.578548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-bound-sa-token\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:06.578742 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.578570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnpth\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-kube-api-access-qnpth\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:06.578742 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.578673 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:06.578742 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.578710 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/33e69575-fca9-4bac-82da-e821111ee7d3-registry-certificates\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:06.578880 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:06.578761 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 15:14:06.578880 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:06.578776 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b4dff87c-np7zj: secret "image-registry-tls" not found
Apr 16 15:14:06.578880 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.578779 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/33e69575-fca9-4bac-82da-e821111ee7d3-ca-trust-extracted\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:06.578880 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:06.578823 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls podName:33e69575-fca9-4bac-82da-e821111ee7d3 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:07.078807534 +0000 UTC m=+140.518348086 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls") pod "image-registry-6b4dff87c-np7zj" (UID: "33e69575-fca9-4bac-82da-e821111ee7d3") : secret "image-registry-tls" not found
Apr 16 15:14:06.579758 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.579736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/33e69575-fca9-4bac-82da-e821111ee7d3-registry-certificates\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:06.579870 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.579855 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33e69575-fca9-4bac-82da-e821111ee7d3-trusted-ca\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:06.580765 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.580747 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/33e69575-fca9-4bac-82da-e821111ee7d3-image-registry-private-configuration\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:06.581059 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.581040 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/33e69575-fca9-4bac-82da-e821111ee7d3-installation-pull-secrets\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:06.587541 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.587520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-bound-sa-token\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:06.587626 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.587613 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnpth\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-kube-api-access-qnpth\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:06.881634 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.881560 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jdrgf\" (UID: \"68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf"
Apr 16 15:14:06.881768 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:06.881705 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 15:14:06.881768 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:06.881762 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls podName:68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e nodeName:}" failed. No retries permitted until 2026-04-16 15:14:07.88174603 +0000 UTC m=+141.321286571 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jdrgf" (UID: "68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e") : secret "cluster-monitoring-operator-tls" not found
Apr 16 15:14:06.982227 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:06.982204 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6gwdf\" (UID: \"18d5517b-eb91-42ce-9bcb-9f7559f1c3b6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf"
Apr 16 15:14:06.982363 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:06.982292 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 15:14:06.982363 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:06.982359 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls podName:18d5517b-eb91-42ce-9bcb-9f7559f1c3b6 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:07.982346414 +0000 UTC m=+141.421886952 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls") pod "cluster-samples-operator-667775844f-6gwdf" (UID: "18d5517b-eb91-42ce-9bcb-9f7559f1c3b6") : secret "samples-operator-tls" not found
Apr 16 15:14:07.082779 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.082745 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:07.082900 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:07.082882 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 15:14:07.082941 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:07.082902 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b4dff87c-np7zj: secret "image-registry-tls" not found
Apr 16 15:14:07.082975 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:07.082954 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls podName:33e69575-fca9-4bac-82da-e821111ee7d3 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:08.08294052 +0000 UTC m=+141.522481057 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls") pod "image-registry-6b4dff87c-np7zj" (UID: "33e69575-fca9-4bac-82da-e821111ee7d3") : secret "image-registry-tls" not found
Apr 16 15:14:07.170439 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.170365 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn"]
Apr 16 15:14:07.174044 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.174030 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn"
Apr 16 15:14:07.176566 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.176545 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 15:14:07.176663 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.176642 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 15:14:07.176732 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.176642 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 15:14:07.176794 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.176746 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-ktvvd\""
Apr 16 15:14:07.176794 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.176768 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 15:14:07.186622 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.185082 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn"]
Apr 16 15:14:07.285121 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.285090 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6267389f-eb5c-42ff-b628-bc0a9de3cc95-serving-cert\") pod \"service-ca-operator-69965bb79d-6q5dn\" (UID: \"6267389f-eb5c-42ff-b628-bc0a9de3cc95\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn"
Apr 16 15:14:07.285523 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.285125 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6267389f-eb5c-42ff-b628-bc0a9de3cc95-config\") pod \"service-ca-operator-69965bb79d-6q5dn\" (UID: \"6267389f-eb5c-42ff-b628-bc0a9de3cc95\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn"
Apr 16 15:14:07.285523 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.285491 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-786jr\" (UniqueName: \"kubernetes.io/projected/6267389f-eb5c-42ff-b628-bc0a9de3cc95-kube-api-access-786jr\") pod \"service-ca-operator-69965bb79d-6q5dn\" (UID: \"6267389f-eb5c-42ff-b628-bc0a9de3cc95\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn"
Apr 16 15:14:07.386128 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.386091 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-786jr\" (UniqueName: \"kubernetes.io/projected/6267389f-eb5c-42ff-b628-bc0a9de3cc95-kube-api-access-786jr\") pod \"service-ca-operator-69965bb79d-6q5dn\" (UID: \"6267389f-eb5c-42ff-b628-bc0a9de3cc95\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn"
Apr 16 15:14:07.386244 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.386159 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6267389f-eb5c-42ff-b628-bc0a9de3cc95-serving-cert\") pod \"service-ca-operator-69965bb79d-6q5dn\" (UID: \"6267389f-eb5c-42ff-b628-bc0a9de3cc95\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn"
Apr 16 15:14:07.386244 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.386187 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6267389f-eb5c-42ff-b628-bc0a9de3cc95-config\") pod \"service-ca-operator-69965bb79d-6q5dn\" (UID: \"6267389f-eb5c-42ff-b628-bc0a9de3cc95\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn"
Apr 16 15:14:07.386795 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.386767 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6267389f-eb5c-42ff-b628-bc0a9de3cc95-config\") pod \"service-ca-operator-69965bb79d-6q5dn\" (UID: \"6267389f-eb5c-42ff-b628-bc0a9de3cc95\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn"
Apr 16 15:14:07.388256 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.388238 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6267389f-eb5c-42ff-b628-bc0a9de3cc95-serving-cert\") pod \"service-ca-operator-69965bb79d-6q5dn\" (UID: \"6267389f-eb5c-42ff-b628-bc0a9de3cc95\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn"
Apr 16 15:14:07.394642 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.394617 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-786jr\" (UniqueName: \"kubernetes.io/projected/6267389f-eb5c-42ff-b628-bc0a9de3cc95-kube-api-access-786jr\") pod \"service-ca-operator-69965bb79d-6q5dn\" (UID: \"6267389f-eb5c-42ff-b628-bc0a9de3cc95\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn"
Apr 16 15:14:07.483156 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.483137 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn"
Apr 16 15:14:07.594048 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.594016 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn"]
Apr 16 15:14:07.596837 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:14:07.596814 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6267389f_eb5c_42ff_b628_bc0a9de3cc95.slice/crio-f3799bb6dd83f69c06f59ef1d83d14f7cfa8b12087962f9678135ba86ef05087 WatchSource:0}: Error finding container f3799bb6dd83f69c06f59ef1d83d14f7cfa8b12087962f9678135ba86ef05087: Status 404 returned error can't find the container with id f3799bb6dd83f69c06f59ef1d83d14f7cfa8b12087962f9678135ba86ef05087
Apr 16 15:14:07.627746 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.627724 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn" event={"ID":"6267389f-eb5c-42ff-b628-bc0a9de3cc95","Type":"ContainerStarted","Data":"f3799bb6dd83f69c06f59ef1d83d14f7cfa8b12087962f9678135ba86ef05087"}
Apr 16 15:14:07.890459 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.890375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jdrgf\" (UID: \"68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf"
Apr 16 15:14:07.890565 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:07.890498 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 15:14:07.890565 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:07.890556 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls podName:68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e nodeName:}" failed. No retries permitted until 2026-04-16 15:14:09.890543198 +0000 UTC m=+143.330083735 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jdrgf" (UID: "68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e") : secret "cluster-monitoring-operator-tls" not found
Apr 16 15:14:07.991529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:07.991499 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6gwdf\" (UID: \"18d5517b-eb91-42ce-9bcb-9f7559f1c3b6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf"
Apr 16 15:14:07.991658 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:07.991640 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 15:14:07.991717 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:07.991699 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls podName:18d5517b-eb91-42ce-9bcb-9f7559f1c3b6 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:09.991685227 +0000 UTC m=+143.431225765 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls") pod "cluster-samples-operator-667775844f-6gwdf" (UID: "18d5517b-eb91-42ce-9bcb-9f7559f1c3b6") : secret "samples-operator-tls" not found
Apr 16 15:14:08.092908 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:08.092882 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:08.093011 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:08.092992 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 15:14:08.093011 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:08.093002 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b4dff87c-np7zj: secret "image-registry-tls" not found
Apr 16 15:14:08.093083 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:08.093042 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls podName:33e69575-fca9-4bac-82da-e821111ee7d3 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:10.09302995 +0000 UTC m=+143.532570487 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls") pod "image-registry-6b4dff87c-np7zj" (UID: "33e69575-fca9-4bac-82da-e821111ee7d3") : secret "image-registry-tls" not found
Apr 16 15:14:08.776719 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:08.776684 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-vr69z"]
Apr 16 15:14:08.779683 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:08.779659 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vr69z"
Apr 16 15:14:08.782459 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:08.782440 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-b4hqj\""
Apr 16 15:14:08.787556 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:08.787260 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-vr69z"]
Apr 16 15:14:08.900323 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:08.900294 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb2bg\" (UniqueName: \"kubernetes.io/projected/632017b6-8083-4667-aebf-a3a5d171f0a9-kube-api-access-sb2bg\") pod \"network-check-source-7b678d77c7-vr69z\" (UID: \"632017b6-8083-4667-aebf-a3a5d171f0a9\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vr69z"
Apr 16 15:14:09.001606 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:09.001574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sb2bg\" (UniqueName: \"kubernetes.io/projected/632017b6-8083-4667-aebf-a3a5d171f0a9-kube-api-access-sb2bg\") pod \"network-check-source-7b678d77c7-vr69z\" (UID: \"632017b6-8083-4667-aebf-a3a5d171f0a9\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vr69z"
Apr 16 15:14:09.010301 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:09.010271 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb2bg\" (UniqueName: \"kubernetes.io/projected/632017b6-8083-4667-aebf-a3a5d171f0a9-kube-api-access-sb2bg\") pod \"network-check-source-7b678d77c7-vr69z\" (UID: \"632017b6-8083-4667-aebf-a3a5d171f0a9\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vr69z"
Apr 16 15:14:09.090314 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:09.090245 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vr69z"
Apr 16 15:14:09.207876 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:09.207845 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-vr69z"]
Apr 16 15:14:09.211555 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:14:09.211531 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod632017b6_8083_4667_aebf_a3a5d171f0a9.slice/crio-761e3296a67bf80c01eca547ad67c4778134c5d807dad85d3bc91c44c7d52348 WatchSource:0}: Error finding container 761e3296a67bf80c01eca547ad67c4778134c5d807dad85d3bc91c44c7d52348: Status 404 returned error can't find the container with id 761e3296a67bf80c01eca547ad67c4778134c5d807dad85d3bc91c44c7d52348
Apr 16 15:14:09.633572 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:09.633535 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vr69z" event={"ID":"632017b6-8083-4667-aebf-a3a5d171f0a9","Type":"ContainerStarted","Data":"bcccd955b26fa46285553fa5012a3e8e19a38bbf886eac16c7d229bce7e67ae3"}
Apr 16 15:14:09.633572 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:09.633575 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vr69z" event={"ID":"632017b6-8083-4667-aebf-a3a5d171f0a9","Type":"ContainerStarted","Data":"761e3296a67bf80c01eca547ad67c4778134c5d807dad85d3bc91c44c7d52348"}
Apr 16 15:14:09.651306 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:09.651252 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vr69z" podStartSLOduration=1.651234527 podStartE2EDuration="1.651234527s" podCreationTimestamp="2026-04-16 15:14:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:14:09.649439704 +0000 UTC m=+143.088980266" watchObservedRunningTime="2026-04-16 15:14:09.651234527 +0000 UTC m=+143.090775095"
Apr 16 15:14:09.907876 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:09.907795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jdrgf\" (UID: \"68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf"
Apr 16 15:14:09.908187 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:09.907932 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 15:14:09.908187 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:09.907997 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls podName:68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e nodeName:}" failed. No retries permitted until 2026-04-16 15:14:13.907979168 +0000 UTC m=+147.347519713 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jdrgf" (UID: "68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e") : secret "cluster-monitoring-operator-tls" not found
Apr 16 15:14:10.008768 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:10.008739 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6gwdf\" (UID: \"18d5517b-eb91-42ce-9bcb-9f7559f1c3b6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf"
Apr 16 15:14:10.008910 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:10.008891 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 15:14:10.008956 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:10.008948 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls podName:18d5517b-eb91-42ce-9bcb-9f7559f1c3b6 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:14.008934138 +0000 UTC m=+147.448474676 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls") pod "cluster-samples-operator-667775844f-6gwdf" (UID: "18d5517b-eb91-42ce-9bcb-9f7559f1c3b6") : secret "samples-operator-tls" not found
Apr 16 15:14:10.109135 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:10.109101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:10.109279 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:10.109209 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 15:14:10.109279 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:10.109225 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b4dff87c-np7zj: secret "image-registry-tls" not found
Apr 16 15:14:10.109279 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:10.109278 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls podName:33e69575-fca9-4bac-82da-e821111ee7d3 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:14.109262001 +0000 UTC m=+147.548802549 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls") pod "image-registry-6b4dff87c-np7zj" (UID: "33e69575-fca9-4bac-82da-e821111ee7d3") : secret "image-registry-tls" not found
Apr 16 15:14:10.636895 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:10.636856 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn" event={"ID":"6267389f-eb5c-42ff-b628-bc0a9de3cc95","Type":"ContainerStarted","Data":"febf20f65131e6aad082ec321180c81025c33af708c856f3fd5b50495ade80df"}
Apr 16 15:14:10.653811 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:10.653771 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn" podStartSLOduration=1.313806463 podStartE2EDuration="3.65375802s" podCreationTimestamp="2026-04-16 15:14:07 +0000 UTC" firstStartedPulling="2026-04-16 15:14:07.598630935 +0000 UTC m=+141.038171476" lastFinishedPulling="2026-04-16 15:14:09.938582479 +0000 UTC m=+143.378123033" observedRunningTime="2026-04-16 15:14:10.652879746 +0000 UTC m=+144.092420307" watchObservedRunningTime="2026-04-16 15:14:10.65375802 +0000 UTC m=+144.093298579"
Apr 16 15:14:13.841478 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:13.841443 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-sszsc"]
Apr 16 15:14:13.844608 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:13.844592 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-sszsc"
Apr 16 15:14:13.847095 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:13.847077 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 15:14:13.847211 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:13.847079 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 15:14:13.847308 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:13.847287 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 15:14:13.848307 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:13.848291 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 15:14:13.848307 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:13.848300 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-5j4x2\""
Apr 16 15:14:13.854170 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:13.854150 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-sszsc"]
Apr 16 15:14:13.936235 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:13.936216 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6j5z\" (UniqueName: \"kubernetes.io/projected/d1b61f91-d97e-4859-b866-4ff2858bbcce-kube-api-access-t6j5z\") pod \"service-ca-bfc587fb7-sszsc\" (UID: \"d1b61f91-d97e-4859-b866-4ff2858bbcce\") " pod="openshift-service-ca/service-ca-bfc587fb7-sszsc"
Apr 16 15:14:13.936331 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:13.936246 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d1b61f91-d97e-4859-b866-4ff2858bbcce-signing-cabundle\") pod \"service-ca-bfc587fb7-sszsc\" (UID: \"d1b61f91-d97e-4859-b866-4ff2858bbcce\") " pod="openshift-service-ca/service-ca-bfc587fb7-sszsc"
Apr 16 15:14:13.936331 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:13.936270 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d1b61f91-d97e-4859-b866-4ff2858bbcce-signing-key\") pod \"service-ca-bfc587fb7-sszsc\" (UID: \"d1b61f91-d97e-4859-b866-4ff2858bbcce\") " pod="openshift-service-ca/service-ca-bfc587fb7-sszsc"
Apr 16 15:14:13.936405 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:13.936370 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jdrgf\" (UID: \"68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf"
Apr 16 15:14:13.936504 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:13.936491 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 15:14:13.936562 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:13.936552 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls podName:68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e nodeName:}" failed. No retries permitted until 2026-04-16 15:14:21.936536036 +0000 UTC m=+155.376076576 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jdrgf" (UID: "68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e") : secret "cluster-monitoring-operator-tls" not found
Apr 16 15:14:14.036816 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:14.036791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d1b61f91-d97e-4859-b866-4ff2858bbcce-signing-cabundle\") pod \"service-ca-bfc587fb7-sszsc\" (UID: \"d1b61f91-d97e-4859-b866-4ff2858bbcce\") " pod="openshift-service-ca/service-ca-bfc587fb7-sszsc"
Apr 16 15:14:14.036915 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:14.036832 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d1b61f91-d97e-4859-b866-4ff2858bbcce-signing-key\") pod \"service-ca-bfc587fb7-sszsc\" (UID: \"d1b61f91-d97e-4859-b866-4ff2858bbcce\") " pod="openshift-service-ca/service-ca-bfc587fb7-sszsc"
Apr 16 15:14:14.036978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:14.036961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6gwdf\" (UID: \"18d5517b-eb91-42ce-9bcb-9f7559f1c3b6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf"
Apr 16 15:14:14.037013 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:14.036996 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6j5z\" (UniqueName: \"kubernetes.io/projected/d1b61f91-d97e-4859-b866-4ff2858bbcce-kube-api-access-t6j5z\") pod \"service-ca-bfc587fb7-sszsc\" (UID: \"d1b61f91-d97e-4859-b866-4ff2858bbcce\") " pod="openshift-service-ca/service-ca-bfc587fb7-sszsc"
Apr 16 15:14:14.037094 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:14.037076 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 15:14:14.037157 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:14.037147 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls podName:18d5517b-eb91-42ce-9bcb-9f7559f1c3b6 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:22.037131209 +0000 UTC m=+155.476671755 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls") pod "cluster-samples-operator-667775844f-6gwdf" (UID: "18d5517b-eb91-42ce-9bcb-9f7559f1c3b6") : secret "samples-operator-tls" not found
Apr 16 15:14:14.037549 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:14.037524 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d1b61f91-d97e-4859-b866-4ff2858bbcce-signing-cabundle\") pod \"service-ca-bfc587fb7-sszsc\" (UID: \"d1b61f91-d97e-4859-b866-4ff2858bbcce\") " pod="openshift-service-ca/service-ca-bfc587fb7-sszsc"
Apr 16 15:14:14.039144 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:14.039127 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d1b61f91-d97e-4859-b866-4ff2858bbcce-signing-key\") pod \"service-ca-bfc587fb7-sszsc\" (UID: \"d1b61f91-d97e-4859-b866-4ff2858bbcce\") " pod="openshift-service-ca/service-ca-bfc587fb7-sszsc"
Apr 16 15:14:14.045069 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:14.045050 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6j5z\" (UniqueName: \"kubernetes.io/projected/d1b61f91-d97e-4859-b866-4ff2858bbcce-kube-api-access-t6j5z\") pod \"service-ca-bfc587fb7-sszsc\" (UID: \"d1b61f91-d97e-4859-b866-4ff2858bbcce\") " pod="openshift-service-ca/service-ca-bfc587fb7-sszsc"
Apr 16 15:14:14.138323 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:14.138270 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:14.138414 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:14.138399 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 15:14:14.138486 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:14.138416 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b4dff87c-np7zj: secret "image-registry-tls" not found
Apr 16 15:14:14.138486 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:14.138482 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls podName:33e69575-fca9-4bac-82da-e821111ee7d3 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:22.138470334 +0000 UTC m=+155.578010877 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls") pod "image-registry-6b4dff87c-np7zj" (UID: "33e69575-fca9-4bac-82da-e821111ee7d3") : secret "image-registry-tls" not found
Apr 16 15:14:14.154173 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:14.154154 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-sszsc"
Apr 16 15:14:14.264574 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:14.264547 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-sszsc"]
Apr 16 15:14:14.267480 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:14:14.267414 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b61f91_d97e_4859_b866_4ff2858bbcce.slice/crio-01615a6fcc6c2647d79f6579ddb4e4679856aa9428cf0a82e49a0d88adefc512 WatchSource:0}: Error finding container 01615a6fcc6c2647d79f6579ddb4e4679856aa9428cf0a82e49a0d88adefc512: Status 404 returned error can't find the container with id 01615a6fcc6c2647d79f6579ddb4e4679856aa9428cf0a82e49a0d88adefc512
Apr 16 15:14:14.646825 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:14.646788 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-sszsc" event={"ID":"d1b61f91-d97e-4859-b866-4ff2858bbcce","Type":"ContainerStarted","Data":"fd840db332b07b593b0d0b46eb85314d4d6c327bcf934197bb97f61f6d8a9e49"}
Apr 16 15:14:14.646825 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:14.646831 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-sszsc" event={"ID":"d1b61f91-d97e-4859-b866-4ff2858bbcce","Type":"ContainerStarted","Data":"01615a6fcc6c2647d79f6579ddb4e4679856aa9428cf0a82e49a0d88adefc512"}
Apr 16 15:14:14.665387 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:14.665343 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-sszsc" podStartSLOduration=1.665330975 podStartE2EDuration="1.665330975s" podCreationTimestamp="2026-04-16 15:14:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:14:14.664064628 +0000 UTC m=+148.103605189" watchObservedRunningTime="2026-04-16 15:14:14.665330975 +0000 UTC m=+148.104871535"
Apr 16 15:14:14.927746 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:14.927682 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kjtwg_74d85d78-e8d8-4b5c-a950-f65047122164/dns-node-resolver/0.log"
Apr 16 15:14:16.127265 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:16.127241 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-thxgc_b6b5389a-1cfb-46bd-bdee-65b24755f000/node-ca/0.log"
Apr 16 15:14:22.000408 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:22.000365 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jdrgf\" (UID: \"68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf"
Apr 16 15:14:22.000798 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:22.000540 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 15:14:22.000798 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:22.000617 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls podName:68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e nodeName:}" failed. No retries permitted until 2026-04-16 15:14:38.000601544 +0000 UTC m=+171.440142103 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jdrgf" (UID: "68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e") : secret "cluster-monitoring-operator-tls" not found
Apr 16 15:14:22.100952 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:22.100922 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6gwdf\" (UID: \"18d5517b-eb91-42ce-9bcb-9f7559f1c3b6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf"
Apr 16 15:14:22.103256 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:22.103231 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18d5517b-eb91-42ce-9bcb-9f7559f1c3b6-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6gwdf\" (UID: \"18d5517b-eb91-42ce-9bcb-9f7559f1c3b6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf"
Apr 16 15:14:22.202054 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:22.202017 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:22.204468 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:22.204438 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls\") pod \"image-registry-6b4dff87c-np7zj\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:22.222633 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:22.222607 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf"
Apr 16 15:14:22.316892 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:22.316864 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:22.339735 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:22.339710 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf"]
Apr 16 15:14:22.439041 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:22.439008 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b4dff87c-np7zj"]
Apr 16 15:14:22.441871 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:14:22.441844 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33e69575_fca9_4bac_82da_e821111ee7d3.slice/crio-2cbf2382a9e5b3116e6cd2377b851b169e8b2e547e5a903e18ed462cb2eec37b WatchSource:0}: Error finding container 2cbf2382a9e5b3116e6cd2377b851b169e8b2e547e5a903e18ed462cb2eec37b: Status 404 returned error can't find the container with id 2cbf2382a9e5b3116e6cd2377b851b169e8b2e547e5a903e18ed462cb2eec37b
Apr 16 15:14:22.669132 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:22.669038 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b4dff87c-np7zj" event={"ID":"33e69575-fca9-4bac-82da-e821111ee7d3","Type":"ContainerStarted","Data":"af67bde0a28c8273b11b055cf049981be82836c5374def7e0000ce14c03da61b"}
Apr 16 15:14:22.669132 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:22.669080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b4dff87c-np7zj" event={"ID":"33e69575-fca9-4bac-82da-e821111ee7d3","Type":"ContainerStarted","Data":"2cbf2382a9e5b3116e6cd2377b851b169e8b2e547e5a903e18ed462cb2eec37b"}
Apr 16 15:14:22.669335 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:22.669170 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6b4dff87c-np7zj"
Apr 16 15:14:22.670111 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:22.670085 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf" event={"ID":"18d5517b-eb91-42ce-9bcb-9f7559f1c3b6","Type":"ContainerStarted","Data":"54d567fced5934ce0594a1003b04075b58e3702761f343d015af08587b23c1c6"}
Apr 16 15:14:22.688570 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:22.688522 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6b4dff87c-np7zj" podStartSLOduration=16.688505549 podStartE2EDuration="16.688505549s" podCreationTimestamp="2026-04-16 15:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:14:22.688295685 +0000 UTC m=+156.127836248" watchObservedRunningTime="2026-04-16 15:14:22.688505549 +0000 UTC m=+156.128046110"
Apr 16 15:14:23.030740 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:23.030700 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-jrg65" podUID="d01959bc-3d04-456b-9dbe-ea153e10fa05"
Apr 16 15:14:23.043979 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:23.043941 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-pm67f" podUID="29f4a0db-de17-476d-97ad-df37fd2a5065"
Apr 16 15:14:23.161765 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:23.161704 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-whwdh" podUID="9973bf97-babd-47b9-a129-38dbed119c77"
Apr 16 15:14:23.672965 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:23.672933 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jrg65"
Apr 16 15:14:23.673156 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:23.673122 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pm67f"
Apr 16 15:14:24.677107 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:24.677071 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf" event={"ID":"18d5517b-eb91-42ce-9bcb-9f7559f1c3b6","Type":"ContainerStarted","Data":"60d8b0674c16121f9a3352344c66f58d8705e67eb18094b3f9d2f8c586a0aa77"}
Apr 16 15:14:24.677107 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:24.677112 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf" event={"ID":"18d5517b-eb91-42ce-9bcb-9f7559f1c3b6","Type":"ContainerStarted","Data":"8599fc1b84b1c08c54a9cc4ed5b57d0a5ea16d03221dbf1daa3ee113d2138c53"}
Apr 16 15:14:24.693486 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:24.693444 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6gwdf" podStartSLOduration=17.172901942 podStartE2EDuration="18.693410853s" podCreationTimestamp="2026-04-16 15:14:06 +0000 UTC" firstStartedPulling="2026-04-16 15:14:22.384747171 +0000 UTC m=+155.824287714" lastFinishedPulling="2026-04-16 15:14:23.905256087 +0000 UTC m=+157.344796625" observedRunningTime="2026-04-16 15:14:24.69256391 +0000 UTC m=+158.132104480" watchObservedRunningTime="2026-04-16 15:14:24.693410853 +0000 UTC m=+158.132951412"
Apr 16 15:14:27.947277 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:27.947247 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert\") pod \"ingress-canary-jrg65\" (UID: \"d01959bc-3d04-456b-9dbe-ea153e10fa05\") " pod="openshift-ingress-canary/ingress-canary-jrg65"
Apr 16 15:14:27.947752 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:27.947296 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f"
Apr 16 15:14:27.949508 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:27.949487 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29f4a0db-de17-476d-97ad-df37fd2a5065-metrics-tls\") pod \"dns-default-pm67f\" (UID: \"29f4a0db-de17-476d-97ad-df37fd2a5065\") " pod="openshift-dns/dns-default-pm67f"
Apr 16 15:14:27.949649 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:27.949631 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01959bc-3d04-456b-9dbe-ea153e10fa05-cert\") pod \"ingress-canary-jrg65\" (UID: \"d01959bc-3d04-456b-9dbe-ea153e10fa05\") " pod="openshift-ingress-canary/ingress-canary-jrg65"
Apr 16 15:14:28.176191 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:28.176162 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xrkdr\""
Apr 16 15:14:28.177193 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:28.177178 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rc5hb\""
Apr 16 15:14:28.184139 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:28.184111 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jrg65"
Apr 16 15:14:28.184139 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:28.184136 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pm67f"
Apr 16 15:14:28.327655 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:28.327627 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pm67f"]
Apr 16 15:14:28.330290 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:14:28.330251 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29f4a0db_de17_476d_97ad_df37fd2a5065.slice/crio-0b1432aacc96762f3344db25bee76b110d7486c5cf7c44a4ccb6a4e5054095d5 WatchSource:0}: Error finding container 0b1432aacc96762f3344db25bee76b110d7486c5cf7c44a4ccb6a4e5054095d5: Status 404 returned error can't find the container with id 0b1432aacc96762f3344db25bee76b110d7486c5cf7c44a4ccb6a4e5054095d5
Apr 16 15:14:28.342662 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:28.342641 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jrg65"]
Apr 16 15:14:28.344878 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:14:28.344855 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd01959bc_3d04_456b_9dbe_ea153e10fa05.slice/crio-b706500c43ecde97f99f5a33edeee8151126ee8f5c80ec157c0c73d97f2b9208 WatchSource:0}: Error finding container b706500c43ecde97f99f5a33edeee8151126ee8f5c80ec157c0c73d97f2b9208: Status 404 returned error can't find the container with id b706500c43ecde97f99f5a33edeee8151126ee8f5c80ec157c0c73d97f2b9208
Apr 16 15:14:28.688565 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:28.688533 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jrg65" event={"ID":"d01959bc-3d04-456b-9dbe-ea153e10fa05","Type":"ContainerStarted","Data":"b706500c43ecde97f99f5a33edeee8151126ee8f5c80ec157c0c73d97f2b9208"}
Apr 16 15:14:28.689681 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:28.689653 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pm67f" event={"ID":"29f4a0db-de17-476d-97ad-df37fd2a5065","Type":"ContainerStarted","Data":"0b1432aacc96762f3344db25bee76b110d7486c5cf7c44a4ccb6a4e5054095d5"}
Apr 16 15:14:30.695812 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:30.695780 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pm67f" event={"ID":"29f4a0db-de17-476d-97ad-df37fd2a5065","Type":"ContainerStarted","Data":"626226ba70df7c222f11148fbb8851e6ae21db6c3e02707a69c89076d1a85749"}
Apr 16 15:14:30.695812 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:30.695815 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pm67f" event={"ID":"29f4a0db-de17-476d-97ad-df37fd2a5065","Type":"ContainerStarted","Data":"a1ee195a3fd16c628963888899ea6cb4dc4be21b58312787f8c1bb4aa6829822"}
Apr 16 15:14:30.696304 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:30.695919 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-pm67f"
Apr 16 15:14:30.697080 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:30.697052 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jrg65" event={"ID":"d01959bc-3d04-456b-9dbe-ea153e10fa05","Type":"ContainerStarted","Data":"584e82f61579ad534a7bfc5a73c508fc18f8c23880cca5d9a00bdf88ea811872"}
Apr 16 15:14:30.716448 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:30.716297 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pm67f" podStartSLOduration=130.061863298 podStartE2EDuration="2m11.716283199s" podCreationTimestamp="2026-04-16 15:12:19 +0000 UTC" firstStartedPulling="2026-04-16 15:14:28.332521788 +0000 UTC m=+161.772062330" lastFinishedPulling="2026-04-16 15:14:29.98694169 +0000 UTC m=+163.426482231" observedRunningTime="2026-04-16 15:14:30.71413639 +0000 UTC m=+164.153676949" watchObservedRunningTime="2026-04-16 15:14:30.716283199 +0000 UTC m=+164.155823762"
Apr 16 15:14:30.731165 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:30.731126 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jrg65" podStartSLOduration=130.087341328 podStartE2EDuration="2m11.731116518s" podCreationTimestamp="2026-04-16 15:12:19 +0000 UTC" firstStartedPulling="2026-04-16 15:14:28.346688052 +0000 UTC m=+161.786228590" lastFinishedPulling="2026-04-16 15:14:29.990463226 +0000 UTC m=+163.430003780" observedRunningTime="2026-04-16 15:14:30.730648702 +0000 UTC m=+164.170189261" watchObservedRunningTime="2026-04-16 15:14:30.731116518 +0000 UTC m=+164.170657078"
Apr 16 15:14:32.042515 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.042484 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-558cz"]
Apr 16 15:14:32.045241 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.045219 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.042515 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.042484 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-558cz"]
Apr 16 15:14:32.045241 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.045219 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.048734 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.048715 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 15:14:32.049322 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.049305 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 15:14:32.049366 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.049324 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 15:14:32.049861 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.049828 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 15:14:32.049861 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.049856 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-wvcg7\""
Apr 16 15:14:32.061336 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.061318 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-558cz"]
Apr 16 15:14:32.065629 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.065609 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6b4dff87c-np7zj"]
Apr 16 15:14:32.100700 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.100675 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7bd44bc866-ll72x"]
Apr 16 15:14:32.102329 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.102315 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.119571 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.119554 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bd44bc866-ll72x"]
Apr 16 15:14:32.177494 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.177473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ctfb\" (UniqueName: \"kubernetes.io/projected/b24b7c4b-6257-4d81-8ce8-c2579532c74c-kube-api-access-2ctfb\") pod \"insights-runtime-extractor-558cz\" (UID: \"b24b7c4b-6257-4d81-8ce8-c2579532c74c\") " pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.177606 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.177505 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b24b7c4b-6257-4d81-8ce8-c2579532c74c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-558cz\" (UID: \"b24b7c4b-6257-4d81-8ce8-c2579532c74c\") " pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.177606 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.177537 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b24b7c4b-6257-4d81-8ce8-c2579532c74c-crio-socket\") pod \"insights-runtime-extractor-558cz\" (UID: \"b24b7c4b-6257-4d81-8ce8-c2579532c74c\") " pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.177706 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.177613 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b24b7c4b-6257-4d81-8ce8-c2579532c74c-data-volume\") pod \"insights-runtime-extractor-558cz\" (UID: \"b24b7c4b-6257-4d81-8ce8-c2579532c74c\") " pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.177706 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.177632 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b24b7c4b-6257-4d81-8ce8-c2579532c74c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-558cz\" (UID: \"b24b7c4b-6257-4d81-8ce8-c2579532c74c\") " pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.278213 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.278188 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ctfb\" (UniqueName: \"kubernetes.io/projected/b24b7c4b-6257-4d81-8ce8-c2579532c74c-kube-api-access-2ctfb\") pod \"insights-runtime-extractor-558cz\" (UID: \"b24b7c4b-6257-4d81-8ce8-c2579532c74c\") " pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.278333 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.278218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f04e3364-1f42-4f56-83fb-a1a55799b48e-bound-sa-token\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.278333 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.278245 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b24b7c4b-6257-4d81-8ce8-c2579532c74c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-558cz\" (UID: \"b24b7c4b-6257-4d81-8ce8-c2579532c74c\") " pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.278333 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.278262 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnbg5\" (UniqueName: \"kubernetes.io/projected/f04e3364-1f42-4f56-83fb-a1a55799b48e-kube-api-access-rnbg5\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.278333 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.278300 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f04e3364-1f42-4f56-83fb-a1a55799b48e-trusted-ca\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.278610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.278351 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f04e3364-1f42-4f56-83fb-a1a55799b48e-registry-certificates\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.278610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.278377 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b24b7c4b-6257-4d81-8ce8-c2579532c74c-crio-socket\") pod \"insights-runtime-extractor-558cz\" (UID: \"b24b7c4b-6257-4d81-8ce8-c2579532c74c\") " pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.278610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.278401 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f04e3364-1f42-4f56-83fb-a1a55799b48e-image-registry-private-configuration\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.278610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.278520 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f04e3364-1f42-4f56-83fb-a1a55799b48e-ca-trust-extracted\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.278610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.278574 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b24b7c4b-6257-4d81-8ce8-c2579532c74c-crio-socket\") pod \"insights-runtime-extractor-558cz\" (UID: \"b24b7c4b-6257-4d81-8ce8-c2579532c74c\") " pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.278610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.278572 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f04e3364-1f42-4f56-83fb-a1a55799b48e-installation-pull-secrets\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.278858 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.278638 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b24b7c4b-6257-4d81-8ce8-c2579532c74c-data-volume\") pod \"insights-runtime-extractor-558cz\" (UID: \"b24b7c4b-6257-4d81-8ce8-c2579532c74c\") " pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.278858 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.278695 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b24b7c4b-6257-4d81-8ce8-c2579532c74c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-558cz\" (UID: \"b24b7c4b-6257-4d81-8ce8-c2579532c74c\") " pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.278858 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.278728 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f04e3364-1f42-4f56-83fb-a1a55799b48e-registry-tls\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.278995 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.278922 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b24b7c4b-6257-4d81-8ce8-c2579532c74c-data-volume\") pod \"insights-runtime-extractor-558cz\" (UID: \"b24b7c4b-6257-4d81-8ce8-c2579532c74c\") " pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.279269 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.279250 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b24b7c4b-6257-4d81-8ce8-c2579532c74c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-558cz\" (UID: \"b24b7c4b-6257-4d81-8ce8-c2579532c74c\") " pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.280574 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.280558 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b24b7c4b-6257-4d81-8ce8-c2579532c74c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-558cz\" (UID: \"b24b7c4b-6257-4d81-8ce8-c2579532c74c\") " pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.287238 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.287215 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ctfb\" (UniqueName: \"kubernetes.io/projected/b24b7c4b-6257-4d81-8ce8-c2579532c74c-kube-api-access-2ctfb\") pod \"insights-runtime-extractor-558cz\" (UID: \"b24b7c4b-6257-4d81-8ce8-c2579532c74c\") " pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.354532 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.354484 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-558cz"
Apr 16 15:14:32.379969 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.379945 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f04e3364-1f42-4f56-83fb-a1a55799b48e-registry-tls\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.380072 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.379985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f04e3364-1f42-4f56-83fb-a1a55799b48e-bound-sa-token\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.380072 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.380014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnbg5\" (UniqueName: \"kubernetes.io/projected/f04e3364-1f42-4f56-83fb-a1a55799b48e-kube-api-access-rnbg5\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.380072 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.380040 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f04e3364-1f42-4f56-83fb-a1a55799b48e-trusted-ca\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.380220 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.380076 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f04e3364-1f42-4f56-83fb-a1a55799b48e-registry-certificates\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.380220 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.380111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f04e3364-1f42-4f56-83fb-a1a55799b48e-image-registry-private-configuration\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.380341 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.380303 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f04e3364-1f42-4f56-83fb-a1a55799b48e-ca-trust-extracted\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.380409 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.380348 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f04e3364-1f42-4f56-83fb-a1a55799b48e-installation-pull-secrets\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.380791 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.380742 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f04e3364-1f42-4f56-83fb-a1a55799b48e-ca-trust-extracted\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.381050 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.381026 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f04e3364-1f42-4f56-83fb-a1a55799b48e-registry-certificates\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.381134 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.381093 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f04e3364-1f42-4f56-83fb-a1a55799b48e-trusted-ca\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.382602 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.382583 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f04e3364-1f42-4f56-83fb-a1a55799b48e-image-registry-private-configuration\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.382713 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.382696 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f04e3364-1f42-4f56-83fb-a1a55799b48e-installation-pull-secrets\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.383001 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.382985 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f04e3364-1f42-4f56-83fb-a1a55799b48e-registry-tls\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.388948 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.388842 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f04e3364-1f42-4f56-83fb-a1a55799b48e-bound-sa-token\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.388948 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.388859 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnbg5\" (UniqueName: \"kubernetes.io/projected/f04e3364-1f42-4f56-83fb-a1a55799b48e-kube-api-access-rnbg5\") pod \"image-registry-7bd44bc866-ll72x\" (UID: \"f04e3364-1f42-4f56-83fb-a1a55799b48e\") " pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.411163 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.411140 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.478037 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.477609 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-558cz"]
Apr 16 15:14:32.480610 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:14:32.480580 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb24b7c4b_6257_4d81_8ce8_c2579532c74c.slice/crio-c9c22021089b34ccbc9c7626dee7d8c993e1ac641a7ddf8c6196b1f78ccd3fc5 WatchSource:0}: Error finding container c9c22021089b34ccbc9c7626dee7d8c993e1ac641a7ddf8c6196b1f78ccd3fc5: Status 404 returned error can't find the container with id c9c22021089b34ccbc9c7626dee7d8c993e1ac641a7ddf8c6196b1f78ccd3fc5
Apr 16 15:14:32.545489 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.545462 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bd44bc866-ll72x"]
Apr 16 15:14:32.548250 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:14:32.548225 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf04e3364_1f42_4f56_83fb_a1a55799b48e.slice/crio-33f95f3440efef159ba837276981f62a5ab06a1b8c8f62682e0d9a677c23eabf WatchSource:0}: Error finding container 33f95f3440efef159ba837276981f62a5ab06a1b8c8f62682e0d9a677c23eabf: Status 404 returned error can't find the container with id 33f95f3440efef159ba837276981f62a5ab06a1b8c8f62682e0d9a677c23eabf
Apr 16 15:14:32.703905 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.703869 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bd44bc866-ll72x" event={"ID":"f04e3364-1f42-4f56-83fb-a1a55799b48e","Type":"ContainerStarted","Data":"a84b464676027d47a9753fe2a4570f87b309406721e8787240ea96deaf6d1537"}
Apr 16 15:14:32.703905 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.703911 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bd44bc866-ll72x" event={"ID":"f04e3364-1f42-4f56-83fb-a1a55799b48e","Type":"ContainerStarted","Data":"33f95f3440efef159ba837276981f62a5ab06a1b8c8f62682e0d9a677c23eabf"}
Apr 16 15:14:32.704135 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.703950 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7bd44bc866-ll72x"
Apr 16 15:14:32.705154 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.705132 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-558cz" event={"ID":"b24b7c4b-6257-4d81-8ce8-c2579532c74c","Type":"ContainerStarted","Data":"e61cf5e90456e6a86d0a56cd60eab7129311a552a2afd57b5f704db505f1467f"}
Apr 16 15:14:32.705242 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.705159 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-558cz" event={"ID":"b24b7c4b-6257-4d81-8ce8-c2579532c74c","Type":"ContainerStarted","Data":"c9c22021089b34ccbc9c7626dee7d8c993e1ac641a7ddf8c6196b1f78ccd3fc5"}
Apr 16 15:14:32.723769 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:32.723729 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7bd44bc866-ll72x" podStartSLOduration=0.72371752 podStartE2EDuration="723.71752ms" podCreationTimestamp="2026-04-16 15:14:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:14:32.722593554 +0000 UTC m=+166.162134113" watchObservedRunningTime="2026-04-16 15:14:32.72371752 +0000 UTC m=+166.163258080"
Apr 16 15:14:33.710086 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:33.710049 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-558cz" event={"ID":"b24b7c4b-6257-4d81-8ce8-c2579532c74c","Type":"ContainerStarted","Data":"88a01dec3605eeed0221ddd2db39fe2aa2f7f7534aa5fe351dca1e17a2067a0a"}
Apr 16 15:14:35.150040 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:35.149968 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh"
Apr 16 15:14:35.716342 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:35.716311 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-558cz" event={"ID":"b24b7c4b-6257-4d81-8ce8-c2579532c74c","Type":"ContainerStarted","Data":"217fd4064b10520d95a7526885554060a168e8e7d9375a18b34d7e997c7ba82c"}
Apr 16 15:14:35.734957 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:35.734915 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-558cz" podStartSLOduration=1.441400891 podStartE2EDuration="3.734901566s" podCreationTimestamp="2026-04-16 15:14:32 +0000 UTC" firstStartedPulling="2026-04-16 15:14:32.5399872 +0000 UTC m=+165.979527737" lastFinishedPulling="2026-04-16 15:14:34.833487861 +0000 UTC m=+168.273028412" observedRunningTime="2026-04-16 15:14:35.734599299 +0000 UTC m=+169.174139860" watchObservedRunningTime="2026-04-16 15:14:35.734901566 +0000 UTC m=+169.174442125"
Apr 16 15:14:38.029931 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:38.029887 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jdrgf\" (UID: \"68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf"
Apr 16 15:14:38.032463 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:38.032414 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jdrgf\" (UID: \"68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf"
Apr 16 15:14:38.319110 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:38.319026 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf"
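The pod_startup_latency_tracker entries above carry two figures per pod: podStartSLOduration (a plain float in seconds; judging by the numbers here it excludes the image-pull window between firstStartedPulling and lastFinishedPulling) and podStartE2EDuration (a Go-style duration from podCreationTimestamp to observedRunningTime). A minimal Python sketch for pulling both out of such a line, using the dns-default entry above as the sample:

    import re

    # Extract pod name and both durations from an
    # "Observed pod startup duration" kubenswrapper line.
    LAT_RE = re.compile(
        r'pod="([^"]+)".*?podStartSLOduration=([\d.]+)'
        r'.*?podStartE2EDuration="([^"]+)"'
    )

    def startup_latency(line):
        m = LAT_RE.search(line)
        return (m.group(1), float(m.group(2)), m.group(3)) if m else None

    sample = ('pod="openshift-dns/dns-default-pm67f" '
              'podStartSLOduration=130.061863298 '
              'podStartE2EDuration="2m11.716283199s"')
    print(startup_latency(sample))
    # ('openshift-dns/dns-default-pm67f', 130.061863298, '2m11.716283199s')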
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf" Apr 16 15:14:38.432861 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:38.432825 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf"] Apr 16 15:14:38.435284 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:14:38.435252 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68dfc5e0_5a22_4b10_9af8_2235bd3e6c1e.slice/crio-e5262a73f532781171c2b4d4c189315cafbecca51e9013b1e63447fa4e31146b WatchSource:0}: Error finding container e5262a73f532781171c2b4d4c189315cafbecca51e9013b1e63447fa4e31146b: Status 404 returned error can't find the container with id e5262a73f532781171c2b4d4c189315cafbecca51e9013b1e63447fa4e31146b Apr 16 15:14:38.724286 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:38.724253 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf" event={"ID":"68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e","Type":"ContainerStarted","Data":"e5262a73f532781171c2b4d4c189315cafbecca51e9013b1e63447fa4e31146b"} Apr 16 15:14:40.701794 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:40.701767 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pm67f" Apr 16 15:14:40.730661 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:40.730636 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf" event={"ID":"68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e","Type":"ContainerStarted","Data":"5ccc43aa39c429beefbc8bd02cc328b3df1f5022c860c7f6d516718ac41d1486"} Apr 16 15:14:40.750258 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:40.750179 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jdrgf" podStartSLOduration=33.113811174 podStartE2EDuration="34.750161199s" podCreationTimestamp="2026-04-16 15:14:06 +0000 UTC" firstStartedPulling="2026-04-16 15:14:38.437091949 +0000 UTC m=+171.876632491" lastFinishedPulling="2026-04-16 15:14:40.073441967 +0000 UTC m=+173.512982516" observedRunningTime="2026-04-16 15:14:40.748584086 +0000 UTC m=+174.188124656" watchObservedRunningTime="2026-04-16 15:14:40.750161199 +0000 UTC m=+174.189701760" Apr 16 15:14:42.071269 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:42.071233 2575 patch_prober.go:28] interesting pod/image-registry-6b4dff87c-np7zj container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 15:14:42.071654 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:42.071286 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6b4dff87c-np7zj" podUID="33e69575-fca9-4bac-82da-e821111ee7d3" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:14:49.001144 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.001111 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-2kklt"] Apr 16 15:14:49.005731 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.005712 2575 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" Apr 16 15:14:49.008610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.008588 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-zt9c4\"" Apr 16 15:14:49.008767 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.008619 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 15:14:49.009769 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.009741 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 15:14:49.009859 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.009751 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 15:14:49.014791 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.014773 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-2kklt"] Apr 16 15:14:49.022106 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.022086 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dmgkg"] Apr 16 15:14:49.025655 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.025639 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.028655 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.028633 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 15:14:49.028775 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.028661 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 15:14:49.028963 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.028945 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 15:14:49.029040 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.028985 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8z8zc\"" Apr 16 15:14:49.043774 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.043754 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-6d72x"] Apr 16 15:14:49.046793 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.046778 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.049388 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.049370 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-svwk8\"" Apr 16 15:14:49.049506 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.049452 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 15:14:49.049577 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.049390 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 15:14:49.049577 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.049374 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 15:14:49.064398 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.064376 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-6d72x"] Apr 16 15:14:49.102060 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/808117b5-475a-4911-9e51-8d3a34537662-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-2kklt\" (UID: \"808117b5-475a-4911-9e51-8d3a34537662\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" Apr 16 15:14:49.102185 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102067 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/67d34f25-682d-44ce-b253-d768fba43c67-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.102185 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102083 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-node-exporter-tls\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.102185 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102116 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-node-exporter-accelerators-collector-config\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.102356 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/67d34f25-682d-44ce-b253-d768fba43c67-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") " 
pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.102356 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102237 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-sys\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.102356 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102261 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-root\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.102356 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102287 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d5vn\" (UniqueName: \"kubernetes.io/projected/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-kube-api-access-7d5vn\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.102356 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/67d34f25-682d-44ce-b253-d768fba43c67-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.102600 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102360 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/67d34f25-682d-44ce-b253-d768fba43c67-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.102600 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102386 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-node-exporter-wtmp\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.102600 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102415 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/67d34f25-682d-44ce-b253-d768fba43c67-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.102600 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102467 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dmgkg\" (UID: 
\"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.102600 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102514 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/808117b5-475a-4911-9e51-8d3a34537662-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-2kklt\" (UID: \"808117b5-475a-4911-9e51-8d3a34537662\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" Apr 16 15:14:49.102600 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102557 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg64t\" (UniqueName: \"kubernetes.io/projected/808117b5-475a-4911-9e51-8d3a34537662-kube-api-access-sg64t\") pod \"openshift-state-metrics-5669946b84-2kklt\" (UID: \"808117b5-475a-4911-9e51-8d3a34537662\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" Apr 16 15:14:49.102785 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102616 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/808117b5-475a-4911-9e51-8d3a34537662-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-2kklt\" (UID: \"808117b5-475a-4911-9e51-8d3a34537662\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" Apr 16 15:14:49.102785 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102646 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s984n\" (UniqueName: \"kubernetes.io/projected/67d34f25-682d-44ce-b253-d768fba43c67-kube-api-access-s984n\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.102785 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102680 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-node-exporter-textfile\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.102785 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.102704 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-metrics-client-ca\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.203703 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.203669 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/808117b5-475a-4911-9e51-8d3a34537662-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-2kklt\" (UID: \"808117b5-475a-4911-9e51-8d3a34537662\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" Apr 16 15:14:49.203877 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.203708 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s984n\" (UniqueName: 
\"kubernetes.io/projected/67d34f25-682d-44ce-b253-d768fba43c67-kube-api-access-s984n\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.203877 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.203737 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-node-exporter-textfile\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.204010 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.203986 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-metrics-client-ca\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.204068 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204035 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/808117b5-475a-4911-9e51-8d3a34537662-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-2kklt\" (UID: \"808117b5-475a-4911-9e51-8d3a34537662\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" Apr 16 15:14:49.204126 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204076 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/67d34f25-682d-44ce-b253-d768fba43c67-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.204126 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204094 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-node-exporter-textfile\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.204126 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-node-exporter-tls\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.204287 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204171 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-node-exporter-accelerators-collector-config\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.204287 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/67d34f25-682d-44ce-b253-d768fba43c67-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.204287 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204255 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-sys\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.204459 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204309 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-sys\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.204459 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204339 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-root\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.204459 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204369 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7d5vn\" (UniqueName: \"kubernetes.io/projected/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-kube-api-access-7d5vn\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.204459 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204406 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/67d34f25-682d-44ce-b253-d768fba43c67-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.204459 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204453 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/67d34f25-682d-44ce-b253-d768fba43c67-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.204709 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204482 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-node-exporter-wtmp\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.204709 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204514 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/67d34f25-682d-44ce-b253-d768fba43c67-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") 
" pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.204709 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.204709 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204582 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/808117b5-475a-4911-9e51-8d3a34537662-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-2kklt\" (UID: \"808117b5-475a-4911-9e51-8d3a34537662\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" Apr 16 15:14:49.204709 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204592 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/808117b5-475a-4911-9e51-8d3a34537662-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-2kklt\" (UID: \"808117b5-475a-4911-9e51-8d3a34537662\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" Apr 16 15:14:49.204709 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204585 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-metrics-client-ca\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.204709 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204621 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg64t\" (UniqueName: \"kubernetes.io/projected/808117b5-475a-4911-9e51-8d3a34537662-kube-api-access-sg64t\") pod \"openshift-state-metrics-5669946b84-2kklt\" (UID: \"808117b5-475a-4911-9e51-8d3a34537662\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" Apr 16 15:14:49.205034 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204747 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-node-exporter-accelerators-collector-config\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.205034 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.204912 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-node-exporter-wtmp\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.205034 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:49.204996 2575 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 15:14:49.205146 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:49.205061 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/808117b5-475a-4911-9e51-8d3a34537662-openshift-state-metrics-tls podName:808117b5-475a-4911-9e51-8d3a34537662 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:49.705041249 +0000 UTC m=+183.144581798 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/808117b5-475a-4911-9e51-8d3a34537662-openshift-state-metrics-tls") pod "openshift-state-metrics-5669946b84-2kklt" (UID: "808117b5-475a-4911-9e51-8d3a34537662") : secret "openshift-state-metrics-tls" not found Apr 16 15:14:49.205313 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.205289 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/67d34f25-682d-44ce-b253-d768fba43c67-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.205379 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.205363 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-root\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.205607 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.205589 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/67d34f25-682d-44ce-b253-d768fba43c67-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.205717 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.205696 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/67d34f25-682d-44ce-b253-d768fba43c67-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.206946 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.206919 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/67d34f25-682d-44ce-b253-d768fba43c67-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.207155 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.207131 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/808117b5-475a-4911-9e51-8d3a34537662-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-2kklt\" (UID: \"808117b5-475a-4911-9e51-8d3a34537662\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" Apr 16 15:14:49.207335 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.207313 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/67d34f25-682d-44ce-b253-d768fba43c67-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.207393 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.207321 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-node-exporter-tls\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.208601 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.208578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.223359 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.223331 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s984n\" (UniqueName: \"kubernetes.io/projected/67d34f25-682d-44ce-b253-d768fba43c67-kube-api-access-s984n\") pod \"kube-state-metrics-7479c89684-6d72x\" (UID: \"67d34f25-682d-44ce-b253-d768fba43c67\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.223543 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.223524 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg64t\" (UniqueName: \"kubernetes.io/projected/808117b5-475a-4911-9e51-8d3a34537662-kube-api-access-sg64t\") pod \"openshift-state-metrics-5669946b84-2kklt\" (UID: \"808117b5-475a-4911-9e51-8d3a34537662\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" Apr 16 15:14:49.223593 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.223554 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d5vn\" (UniqueName: \"kubernetes.io/projected/e24a16f5-1dcd-41d0-8b97-b13b1ecdb278-kube-api-access-7d5vn\") pod \"node-exporter-dmgkg\" (UID: \"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278\") " pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.335920 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.335855 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dmgkg" Apr 16 15:14:49.347450 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:14:49.347410 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode24a16f5_1dcd_41d0_8b97_b13b1ecdb278.slice/crio-b5985096af59ba8e5860a8a37d162579f04edb756fe134987a37e5f4ee0b8dbe WatchSource:0}: Error finding container b5985096af59ba8e5860a8a37d162579f04edb756fe134987a37e5f4ee0b8dbe: Status 404 returned error can't find the container with id b5985096af59ba8e5860a8a37d162579f04edb756fe134987a37e5f4ee0b8dbe Apr 16 15:14:49.355686 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.355668 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" Apr 16 15:14:49.481194 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.481145 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-6d72x"] Apr 16 15:14:49.484037 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:14:49.484011 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67d34f25_682d_44ce_b253_d768fba43c67.slice/crio-fcac21ae851697097f3b9d7e36b89f986daf797966996e2837bd06a8cc386da1 WatchSource:0}: Error finding container fcac21ae851697097f3b9d7e36b89f986daf797966996e2837bd06a8cc386da1: Status 404 returned error can't find the container with id fcac21ae851697097f3b9d7e36b89f986daf797966996e2837bd06a8cc386da1 Apr 16 15:14:49.709241 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.709215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/808117b5-475a-4911-9e51-8d3a34537662-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-2kklt\" (UID: \"808117b5-475a-4911-9e51-8d3a34537662\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" Apr 16 15:14:49.711486 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.711467 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/808117b5-475a-4911-9e51-8d3a34537662-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-2kklt\" (UID: \"808117b5-475a-4911-9e51-8d3a34537662\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" Apr 16 15:14:49.753498 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.753472 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" event={"ID":"67d34f25-682d-44ce-b253-d768fba43c67","Type":"ContainerStarted","Data":"fcac21ae851697097f3b9d7e36b89f986daf797966996e2837bd06a8cc386da1"} Apr 16 15:14:49.754583 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.754558 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dmgkg" event={"ID":"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278","Type":"ContainerStarted","Data":"b5985096af59ba8e5860a8a37d162579f04edb756fe134987a37e5f4ee0b8dbe"} Apr 16 15:14:49.917097 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:49.917066 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" Apr 16 15:14:50.087724 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:50.087674 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-2kklt"] Apr 16 15:14:50.260242 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:14:50.260105 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod808117b5_475a_4911_9e51_8d3a34537662.slice/crio-b371fd1405e75cc3e98c536e3ece82aa75e9cbe2482c4a8385d466fa4eceaabc WatchSource:0}: Error finding container b371fd1405e75cc3e98c536e3ece82aa75e9cbe2482c4a8385d466fa4eceaabc: Status 404 returned error can't find the container with id b371fd1405e75cc3e98c536e3ece82aa75e9cbe2482c4a8385d466fa4eceaabc Apr 16 15:14:50.758955 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:50.758888 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" event={"ID":"808117b5-475a-4911-9e51-8d3a34537662","Type":"ContainerStarted","Data":"960a3e34b58aec6003e941130567e5c12668290a738d1104088d3be7c5bd4868"} Apr 16 15:14:50.758955 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:50.758930 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" event={"ID":"808117b5-475a-4911-9e51-8d3a34537662","Type":"ContainerStarted","Data":"cdacea6dda7dc7e1d4ba383c5d74c725bdf4b7ba49a48015f8b962cfcea0c366"} Apr 16 15:14:50.758955 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:50.758943 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" event={"ID":"808117b5-475a-4911-9e51-8d3a34537662","Type":"ContainerStarted","Data":"b371fd1405e75cc3e98c536e3ece82aa75e9cbe2482c4a8385d466fa4eceaabc"} Apr 16 15:14:50.760354 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:50.760322 2575 generic.go:358] "Generic (PLEG): container finished" podID="e24a16f5-1dcd-41d0-8b97-b13b1ecdb278" containerID="af7bf60ff3bfda5b1a4bec5f29edd1034b22fdba7ff68bbd27328161b78e3e2b" exitCode=0 Apr 16 15:14:50.760470 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:50.760365 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dmgkg" event={"ID":"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278","Type":"ContainerDied","Data":"af7bf60ff3bfda5b1a4bec5f29edd1034b22fdba7ff68bbd27328161b78e3e2b"} Apr 16 15:14:51.766570 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:51.766538 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dmgkg" event={"ID":"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278","Type":"ContainerStarted","Data":"89f1ebd7c2414280feb5f79da23b683fb0283509ef570e10467de8304f1fd8aa"} Apr 16 15:14:51.766570 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:51.766574 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dmgkg" event={"ID":"e24a16f5-1dcd-41d0-8b97-b13b1ecdb278","Type":"ContainerStarted","Data":"201dc0d5b394c39114fbbaf95e997530b4a97bee238fc90f04307a694753ce4f"} Apr 16 15:14:51.768385 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:51.768360 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" 
event={"ID":"67d34f25-682d-44ce-b253-d768fba43c67","Type":"ContainerStarted","Data":"7b05c43649ea056795455531f432701524cfe2c34bd522d0524a2bc4675047fe"} Apr 16 15:14:51.768385 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:51.768387 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" event={"ID":"67d34f25-682d-44ce-b253-d768fba43c67","Type":"ContainerStarted","Data":"dba2100a233cd6e8178c18b5c560981f8fb13ef971f15e27bcc16a68eb296afb"} Apr 16 15:14:51.768547 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:51.768400 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" event={"ID":"67d34f25-682d-44ce-b253-d768fba43c67","Type":"ContainerStarted","Data":"df0b1a7d7c81440aaa4b8a6e605ef6bf07dfa2d531194fd36c18629c3f165813"} Apr 16 15:14:51.794857 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:51.794809 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dmgkg" podStartSLOduration=2.82594742 podStartE2EDuration="3.794796311s" podCreationTimestamp="2026-04-16 15:14:48 +0000 UTC" firstStartedPulling="2026-04-16 15:14:49.349876303 +0000 UTC m=+182.789416856" lastFinishedPulling="2026-04-16 15:14:50.318725191 +0000 UTC m=+183.758265747" observedRunningTime="2026-04-16 15:14:51.792784118 +0000 UTC m=+185.232324678" watchObservedRunningTime="2026-04-16 15:14:51.794796311 +0000 UTC m=+185.234337004" Apr 16 15:14:51.816926 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:51.816884 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-6d72x" podStartSLOduration=1.54906742 podStartE2EDuration="2.816872084s" podCreationTimestamp="2026-04-16 15:14:49 +0000 UTC" firstStartedPulling="2026-04-16 15:14:49.486029319 +0000 UTC m=+182.925569860" lastFinishedPulling="2026-04-16 15:14:50.753833978 +0000 UTC m=+184.193374524" observedRunningTime="2026-04-16 15:14:51.815665392 +0000 UTC m=+185.255205964" watchObservedRunningTime="2026-04-16 15:14:51.816872084 +0000 UTC m=+185.256412687" Apr 16 15:14:52.070225 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:52.070150 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6b4dff87c-np7zj" Apr 16 15:14:52.773271 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:52.773231 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" event={"ID":"808117b5-475a-4911-9e51-8d3a34537662","Type":"ContainerStarted","Data":"ae629b8797b8afd8c52770089cc982ea78d08db021455a9091905cbe7a31f7e4"} Apr 16 15:14:52.803116 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:52.803055 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-2kklt" podStartSLOduration=3.392832856 podStartE2EDuration="4.80303789s" podCreationTimestamp="2026-04-16 15:14:48 +0000 UTC" firstStartedPulling="2026-04-16 15:14:50.412662137 +0000 UTC m=+183.852202687" lastFinishedPulling="2026-04-16 15:14:51.822867182 +0000 UTC m=+185.262407721" observedRunningTime="2026-04-16 15:14:52.802176743 +0000 UTC m=+186.241717303" watchObservedRunningTime="2026-04-16 15:14:52.80303789 +0000 UTC m=+186.242578449" Apr 16 15:14:53.714065 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:53.714039 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-7bd44bc866-ll72x" Apr 16 15:14:53.792798 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:53.792770 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-85hsg"] Apr 16 15:14:53.795866 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:53.795848 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-85hsg" Apr 16 15:14:53.798758 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:53.798736 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-djv46\"" Apr 16 15:14:53.798758 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:53.798750 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 15:14:53.806875 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:53.806856 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-85hsg"] Apr 16 15:14:53.847052 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:53.847028 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/baaa7282-7e35-4700-988e-58aef4e5d8f4-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-85hsg\" (UID: \"baaa7282-7e35-4700-988e-58aef4e5d8f4\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-85hsg" Apr 16 15:14:53.948366 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:53.948332 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/baaa7282-7e35-4700-988e-58aef4e5d8f4-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-85hsg\" (UID: \"baaa7282-7e35-4700-988e-58aef4e5d8f4\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-85hsg" Apr 16 15:14:53.948614 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:53.948487 2575 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 15:14:53.948691 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:53.948675 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/baaa7282-7e35-4700-988e-58aef4e5d8f4-monitoring-plugin-cert podName:baaa7282-7e35-4700-988e-58aef4e5d8f4 nodeName:}" failed. No retries permitted until 2026-04-16 15:14:54.448652028 +0000 UTC m=+187.888192588 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/baaa7282-7e35-4700-988e-58aef4e5d8f4-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-85hsg" (UID: "baaa7282-7e35-4700-988e-58aef4e5d8f4") : secret "monitoring-plugin-cert" not found Apr 16 15:14:54.452372 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:54.452344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/baaa7282-7e35-4700-988e-58aef4e5d8f4-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-85hsg\" (UID: \"baaa7282-7e35-4700-988e-58aef4e5d8f4\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-85hsg" Apr 16 15:14:54.454893 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:54.454869 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/baaa7282-7e35-4700-988e-58aef4e5d8f4-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-85hsg\" (UID: \"baaa7282-7e35-4700-988e-58aef4e5d8f4\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-85hsg" Apr 16 15:14:54.704884 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:54.704797 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-85hsg" Apr 16 15:14:54.819092 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:54.819062 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-85hsg"] Apr 16 15:14:54.823711 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:14:54.823679 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaaa7282_7e35_4700_988e_58aef4e5d8f4.slice/crio-8ed01acc9a6278ca3a26112704018fce098edb507c68cd628b9f1a4773f24b03 WatchSource:0}: Error finding container 8ed01acc9a6278ca3a26112704018fce098edb507c68cd628b9f1a4773f24b03: Status 404 returned error can't find the container with id 8ed01acc9a6278ca3a26112704018fce098edb507c68cd628b9f1a4773f24b03 Apr 16 15:14:55.354153 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.354122 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 15:14:55.358568 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.358549 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.362545 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.362522 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 15:14:55.362641 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.362568 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 15:14:55.362641 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.362600 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 15:14:55.362949 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.362904 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 15:14:55.363089 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.363043 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 15:14:55.363220 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.363197 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 15:14:55.363689 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.363671 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 15:14:55.363779 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.363755 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 15:14:55.364070 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.364052 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-7s34jfjm6140i\"" Apr 16 15:14:55.364190 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.364172 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-d7rvz\"" Apr 16 15:14:55.364250 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.364208 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 15:14:55.364250 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.364116 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 15:14:55.364364 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.364058 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 15:14:55.365294 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.365149 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 15:14:55.367740 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.367724 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 15:14:55.387904 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.387877 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 15:14:55.460710 
Apr 16 15:14:55.460864 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.460733 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.460864 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.460760 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-web-config\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.460864 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.460789 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c81635d-7a26-49be-bc0a-5604361cadc1-config-out\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.460864 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.460823 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.460864 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.460856 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.461111 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.460882 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.461111 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.460915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.461111 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.460942 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.461111 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.461019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.461111 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.461046 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.461111 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.461071 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.461384 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.461112 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c81635d-7a26-49be-bc0a-5604361cadc1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.461384 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.461154 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.461384 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.461180 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.461384 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.461199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.461384 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.461248 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8582\" (UniqueName: \"kubernetes.io/projected/4c81635d-7a26-49be-bc0a-5604361cadc1-kube-api-access-j8582\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.461384 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.461273 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.562703 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.562667 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c81635d-7a26-49be-bc0a-5604361cadc1-config-out\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.562855 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.562718 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.562855 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.562753 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.562855 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.562780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.562855 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.562815 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.562855 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.562842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.563103 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.562874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.563103 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.562903 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.563103 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.562931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.563103 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.562956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c81635d-7a26-49be-bc0a-5604361cadc1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.563103 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.562987 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.563103 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.563013 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.563103 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.563046 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.563461 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.563122 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8582\" (UniqueName: \"kubernetes.io/projected/4c81635d-7a26-49be-bc0a-5604361cadc1-kube-api-access-j8582\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.563461 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.563147 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 15:14:55.563461 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.563189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-config\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0"
\"config\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-config\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.563461 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.563231 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.563461 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.563257 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-web-config\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.564536 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.564064 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.565060 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.565032 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.566666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.565934 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.566666 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.566286 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.569016 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.568991 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.570151 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.569979 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-web-config\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.570151 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.570029 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.570151 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.570123 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.570374 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.570226 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c81635d-7a26-49be-bc0a-5604361cadc1-config-out\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.570626 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.570605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.571053 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.571027 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.571850 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.571713 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.575750 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.575728 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.576234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.576216 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c81635d-7a26-49be-bc0a-5604361cadc1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.576365 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.576260 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.576365 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.576309 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.577946 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.577923 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-config\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.578271 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.578249 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8582\" (UniqueName: \"kubernetes.io/projected/4c81635d-7a26-49be-bc0a-5604361cadc1-kube-api-access-j8582\") pod \"prometheus-k8s-0\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.669711 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.669633 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:14:55.784652 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.784602 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-85hsg" event={"ID":"baaa7282-7e35-4700-988e-58aef4e5d8f4","Type":"ContainerStarted","Data":"8ed01acc9a6278ca3a26112704018fce098edb507c68cd628b9f1a4773f24b03"} Apr 16 15:14:55.814556 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:55.814530 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 15:14:56.067306 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:14:56.067268 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c81635d_7a26_49be_bc0a_5604361cadc1.slice/crio-68aab3ce3d5f38f74a937989c4cf32c471c446fab4185f4c2c314feed49ac5eb WatchSource:0}: Error finding container 68aab3ce3d5f38f74a937989c4cf32c471c446fab4185f4c2c314feed49ac5eb: Status 404 returned error can't find the container with id 68aab3ce3d5f38f74a937989c4cf32c471c446fab4185f4c2c314feed49ac5eb Apr 16 15:14:56.788889 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:56.788823 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-85hsg" event={"ID":"baaa7282-7e35-4700-988e-58aef4e5d8f4","Type":"ContainerStarted","Data":"5094c8cb8a33145da656e13d1365f2497e4980373f78c8dda065cb035cd5cb8c"} Apr 16 15:14:56.789161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:56.789129 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-85hsg" Apr 16 15:14:56.790237 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:56.790203 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c81635d-7a26-49be-bc0a-5604361cadc1","Type":"ContainerStarted","Data":"68aab3ce3d5f38f74a937989c4cf32c471c446fab4185f4c2c314feed49ac5eb"} Apr 16 15:14:56.794485 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:56.794450 2575 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-85hsg" Apr 16 15:14:56.813255 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:56.813215 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-85hsg" podStartSLOduration=2.527589593 podStartE2EDuration="3.81320089s" podCreationTimestamp="2026-04-16 15:14:53 +0000 UTC" firstStartedPulling="2026-04-16 15:14:54.825460236 +0000 UTC m=+188.265000773" lastFinishedPulling="2026-04-16 15:14:56.111071529 +0000 UTC m=+189.550612070" observedRunningTime="2026-04-16 15:14:56.811936949 +0000 UTC m=+190.251477510" watchObservedRunningTime="2026-04-16 15:14:56.81320089 +0000 UTC m=+190.252741452" Apr 16 15:14:57.084455 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.084342 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6b4dff87c-np7zj" podUID="33e69575-fca9-4bac-82da-e821111ee7d3" containerName="registry" containerID="cri-o://af67bde0a28c8273b11b055cf049981be82836c5374def7e0000ce14c03da61b" gracePeriod=30 Apr 16 15:14:57.348144 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.348121 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b4dff87c-np7zj" Apr 16 15:14:57.500599 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.500568 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33e69575-fca9-4bac-82da-e821111ee7d3-trusted-ca\") pod \"33e69575-fca9-4bac-82da-e821111ee7d3\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " Apr 16 15:14:57.500749 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.500632 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/33e69575-fca9-4bac-82da-e821111ee7d3-image-registry-private-configuration\") pod \"33e69575-fca9-4bac-82da-e821111ee7d3\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " Apr 16 15:14:57.500812 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.500798 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/33e69575-fca9-4bac-82da-e821111ee7d3-installation-pull-secrets\") pod \"33e69575-fca9-4bac-82da-e821111ee7d3\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " Apr 16 15:14:57.500898 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.500882 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/33e69575-fca9-4bac-82da-e821111ee7d3-registry-certificates\") pod \"33e69575-fca9-4bac-82da-e821111ee7d3\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " Apr 16 15:14:57.500958 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.500924 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/33e69575-fca9-4bac-82da-e821111ee7d3-ca-trust-extracted\") pod \"33e69575-fca9-4bac-82da-e821111ee7d3\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " Apr 16 15:14:57.500958 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.500950 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-bound-sa-token\") pod \"33e69575-fca9-4bac-82da-e821111ee7d3\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " Apr 16 15:14:57.501065 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.500961 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33e69575-fca9-4bac-82da-e821111ee7d3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "33e69575-fca9-4bac-82da-e821111ee7d3" (UID: "33e69575-fca9-4bac-82da-e821111ee7d3"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:14:57.501065 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.500987 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnpth\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-kube-api-access-qnpth\") pod \"33e69575-fca9-4bac-82da-e821111ee7d3\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " Apr 16 15:14:57.501065 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.501010 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls\") pod \"33e69575-fca9-4bac-82da-e821111ee7d3\" (UID: \"33e69575-fca9-4bac-82da-e821111ee7d3\") " Apr 16 15:14:57.501291 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.501260 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33e69575-fca9-4bac-82da-e821111ee7d3-trusted-ca\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:14:57.501519 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.501480 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33e69575-fca9-4bac-82da-e821111ee7d3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "33e69575-fca9-4bac-82da-e821111ee7d3" (UID: "33e69575-fca9-4bac-82da-e821111ee7d3"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:14:57.503583 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.503555 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e69575-fca9-4bac-82da-e821111ee7d3-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "33e69575-fca9-4bac-82da-e821111ee7d3" (UID: "33e69575-fca9-4bac-82da-e821111ee7d3"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:14:57.503668 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.503619 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e69575-fca9-4bac-82da-e821111ee7d3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "33e69575-fca9-4bac-82da-e821111ee7d3" (UID: "33e69575-fca9-4bac-82da-e821111ee7d3"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:14:57.503668 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.503621 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-kube-api-access-qnpth" (OuterVolumeSpecName: "kube-api-access-qnpth") pod "33e69575-fca9-4bac-82da-e821111ee7d3" (UID: "33e69575-fca9-4bac-82da-e821111ee7d3"). InnerVolumeSpecName "kube-api-access-qnpth". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:14:57.503668 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.503634 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "33e69575-fca9-4bac-82da-e821111ee7d3" (UID: "33e69575-fca9-4bac-82da-e821111ee7d3"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:14:57.503847 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.503829 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "33e69575-fca9-4bac-82da-e821111ee7d3" (UID: "33e69575-fca9-4bac-82da-e821111ee7d3"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:14:57.509779 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.509758 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33e69575-fca9-4bac-82da-e821111ee7d3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "33e69575-fca9-4bac-82da-e821111ee7d3" (UID: "33e69575-fca9-4bac-82da-e821111ee7d3"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:14:57.601678 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.601629 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/33e69575-fca9-4bac-82da-e821111ee7d3-image-registry-private-configuration\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:14:57.601678 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.601650 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/33e69575-fca9-4bac-82da-e821111ee7d3-installation-pull-secrets\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:14:57.601678 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.601662 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/33e69575-fca9-4bac-82da-e821111ee7d3-registry-certificates\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:14:57.601678 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.601672 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/33e69575-fca9-4bac-82da-e821111ee7d3-ca-trust-extracted\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:14:57.601845 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.601682 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-bound-sa-token\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:14:57.601845 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.601690 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qnpth\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-kube-api-access-qnpth\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:14:57.601845 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.601699 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33e69575-fca9-4bac-82da-e821111ee7d3-registry-tls\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:14:57.793533 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.793503 2575 generic.go:358] "Generic (PLEG): container finished" podID="33e69575-fca9-4bac-82da-e821111ee7d3" containerID="af67bde0a28c8273b11b055cf049981be82836c5374def7e0000ce14c03da61b" exitCode=0 Apr 16 15:14:57.793701 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.793562 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6b4dff87c-np7zj" Apr 16 15:14:57.793701 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.793576 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b4dff87c-np7zj" event={"ID":"33e69575-fca9-4bac-82da-e821111ee7d3","Type":"ContainerDied","Data":"af67bde0a28c8273b11b055cf049981be82836c5374def7e0000ce14c03da61b"} Apr 16 15:14:57.793701 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.793604 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b4dff87c-np7zj" event={"ID":"33e69575-fca9-4bac-82da-e821111ee7d3","Type":"ContainerDied","Data":"2cbf2382a9e5b3116e6cd2377b851b169e8b2e547e5a903e18ed462cb2eec37b"} Apr 16 15:14:57.793701 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.793625 2575 scope.go:117] "RemoveContainer" containerID="af67bde0a28c8273b11b055cf049981be82836c5374def7e0000ce14c03da61b" Apr 16 15:14:57.794970 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.794947 2575 generic.go:358] "Generic (PLEG): container finished" podID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerID="3b1c9cae6eb83e47dcb4fa54828de7ca5ab5c13c4a7028d16454a9e084e2cf51" exitCode=0 Apr 16 15:14:57.795077 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.795016 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c81635d-7a26-49be-bc0a-5604361cadc1","Type":"ContainerDied","Data":"3b1c9cae6eb83e47dcb4fa54828de7ca5ab5c13c4a7028d16454a9e084e2cf51"} Apr 16 15:14:57.806633 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.806615 2575 scope.go:117] "RemoveContainer" containerID="af67bde0a28c8273b11b055cf049981be82836c5374def7e0000ce14c03da61b" Apr 16 15:14:57.808523 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:14:57.808490 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af67bde0a28c8273b11b055cf049981be82836c5374def7e0000ce14c03da61b\": container with ID starting with af67bde0a28c8273b11b055cf049981be82836c5374def7e0000ce14c03da61b not found: ID does not exist" containerID="af67bde0a28c8273b11b055cf049981be82836c5374def7e0000ce14c03da61b" Apr 16 15:14:57.808618 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.808535 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af67bde0a28c8273b11b055cf049981be82836c5374def7e0000ce14c03da61b"} err="failed to get container status \"af67bde0a28c8273b11b055cf049981be82836c5374def7e0000ce14c03da61b\": rpc error: code = NotFound desc = could not find container \"af67bde0a28c8273b11b055cf049981be82836c5374def7e0000ce14c03da61b\": container with ID starting with af67bde0a28c8273b11b055cf049981be82836c5374def7e0000ce14c03da61b not found: ID does not exist" Apr 16 15:14:57.849289 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.849262 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6b4dff87c-np7zj"] Apr 16 15:14:57.858924 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:57.858864 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6b4dff87c-np7zj"] Apr 16 15:14:59.151291 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:14:59.151257 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33e69575-fca9-4bac-82da-e821111ee7d3" path="/var/lib/kubelet/pods/33e69575-fca9-4bac-82da-e821111ee7d3/volumes" Apr 
16 15:15:00.808020 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:00.807942 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c81635d-7a26-49be-bc0a-5604361cadc1","Type":"ContainerStarted","Data":"5b734a7df38e41a109220986c4d14ed118b99e132eb2a3ff6eddf39a6dc005fd"} Apr 16 15:15:00.808020 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:00.807975 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c81635d-7a26-49be-bc0a-5604361cadc1","Type":"ContainerStarted","Data":"6813175824afe27daf4ca3755c89a8b3b01d8bc7ff309a39af8439d6519af6b7"} Apr 16 15:15:02.817119 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:02.817085 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c81635d-7a26-49be-bc0a-5604361cadc1","Type":"ContainerStarted","Data":"1bc299580a1126dd103d36d2a5334ed2f27f44a858747420774f9d2ee844b53b"} Apr 16 15:15:02.817119 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:02.817121 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c81635d-7a26-49be-bc0a-5604361cadc1","Type":"ContainerStarted","Data":"915050eeae71552dc31b2141b4025a18fbb68e938df27b809a8dcca05c51b0f3"} Apr 16 15:15:02.817530 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:02.817132 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c81635d-7a26-49be-bc0a-5604361cadc1","Type":"ContainerStarted","Data":"6b31a865b483a81985ae45748de9d24b219a92cf64b7f8c1831986212721ff58"} Apr 16 15:15:02.817530 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:02.817141 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c81635d-7a26-49be-bc0a-5604361cadc1","Type":"ContainerStarted","Data":"2f050764615c00aaf3365cdd7caed88a943e9582ab97d57862c162a38f070bbb"} Apr 16 15:15:02.844412 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:02.844358 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.6638430400000002 podStartE2EDuration="7.844340997s" podCreationTimestamp="2026-04-16 15:14:55 +0000 UTC" firstStartedPulling="2026-04-16 15:14:56.069103316 +0000 UTC m=+189.508643855" lastFinishedPulling="2026-04-16 15:15:02.249601268 +0000 UTC m=+195.689141812" observedRunningTime="2026-04-16 15:15:02.842698654 +0000 UTC m=+196.282239214" watchObservedRunningTime="2026-04-16 15:15:02.844340997 +0000 UTC m=+196.283881558" Apr 16 15:15:05.670296 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:05.670266 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:15:20.865937 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:20.865904 2575 generic.go:358] "Generic (PLEG): container finished" podID="6267389f-eb5c-42ff-b628-bc0a9de3cc95" containerID="febf20f65131e6aad082ec321180c81025c33af708c856f3fd5b50495ade80df" exitCode=0 Apr 16 15:15:20.866501 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:20.865944 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn" event={"ID":"6267389f-eb5c-42ff-b628-bc0a9de3cc95","Type":"ContainerDied","Data":"febf20f65131e6aad082ec321180c81025c33af708c856f3fd5b50495ade80df"} Apr 16 15:15:20.866501 ip-10-0-129-254 kubenswrapper[2575]: I0416 
15:15:20.866200 2575 scope.go:117] "RemoveContainer" containerID="febf20f65131e6aad082ec321180c81025c33af708c856f3fd5b50495ade80df" Apr 16 15:15:21.870858 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:21.870824 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-6q5dn" event={"ID":"6267389f-eb5c-42ff-b628-bc0a9de3cc95","Type":"ContainerStarted","Data":"e09c51fd541c655145e4f9bf8890f9f5df986ec0b4b484bad4cddbcac5407777"} Apr 16 15:15:55.669812 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:55.669776 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:15:55.688459 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:55.688411 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:15:55.978449 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:55.978407 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:15:57.882819 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:57.882781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs\") pod \"network-metrics-daemon-whwdh\" (UID: \"9973bf97-babd-47b9-a129-38dbed119c77\") " pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:15:57.885027 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:57.885008 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9973bf97-babd-47b9-a129-38dbed119c77-metrics-certs\") pod \"network-metrics-daemon-whwdh\" (UID: \"9973bf97-babd-47b9-a129-38dbed119c77\") " pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:15:57.952940 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:57.952908 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5mf5n\"" Apr 16 15:15:57.961435 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:57.961407 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-whwdh" Apr 16 15:15:58.079818 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:58.079786 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-whwdh"] Apr 16 15:15:58.084066 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:15:58.084039 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9973bf97_babd_47b9_a129_38dbed119c77.slice/crio-de2b9e76c649c2ab7102d74bf495fc654ba2a58f42e79b7260c8565c114807c5 WatchSource:0}: Error finding container de2b9e76c649c2ab7102d74bf495fc654ba2a58f42e79b7260c8565c114807c5: Status 404 returned error can't find the container with id de2b9e76c649c2ab7102d74bf495fc654ba2a58f42e79b7260c8565c114807c5 Apr 16 15:15:58.977723 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:58.977678 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-whwdh" event={"ID":"9973bf97-babd-47b9-a129-38dbed119c77","Type":"ContainerStarted","Data":"de2b9e76c649c2ab7102d74bf495fc654ba2a58f42e79b7260c8565c114807c5"} Apr 16 15:15:59.982117 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:59.982079 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-whwdh" event={"ID":"9973bf97-babd-47b9-a129-38dbed119c77","Type":"ContainerStarted","Data":"97cac08dfd007d50670c5cbb6a3e9cfe219a1b0a88242d46697cb7a340ce07f0"} Apr 16 15:15:59.982507 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:59.982123 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-whwdh" event={"ID":"9973bf97-babd-47b9-a129-38dbed119c77","Type":"ContainerStarted","Data":"434cd1eabcfd41d42e9c132eb84a6c5a92de5f1ebd74d3927017b87be8328a6f"} Apr 16 15:15:59.999718 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:15:59.999669 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-whwdh" podStartSLOduration=251.961106893 podStartE2EDuration="4m12.999650455s" podCreationTimestamp="2026-04-16 15:11:47 +0000 UTC" firstStartedPulling="2026-04-16 15:15:58.086263085 +0000 UTC m=+251.525803625" lastFinishedPulling="2026-04-16 15:15:59.124806636 +0000 UTC m=+252.564347187" observedRunningTime="2026-04-16 15:15:59.998726223 +0000 UTC m=+253.438266790" watchObservedRunningTime="2026-04-16 15:15:59.999650455 +0000 UTC m=+253.439191018" Apr 16 15:16:13.791491 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:13.791397 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 15:16:13.791939 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:13.791809 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="prometheus" containerID="cri-o://6813175824afe27daf4ca3755c89a8b3b01d8bc7ff309a39af8439d6519af6b7" gracePeriod=600 Apr 16 15:16:13.791939 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:13.791836 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="kube-rbac-proxy" containerID="cri-o://915050eeae71552dc31b2141b4025a18fbb68e938df27b809a8dcca05c51b0f3" gracePeriod=600 Apr 16 15:16:13.791939 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:13.791851 2575 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="kube-rbac-proxy-thanos" containerID="cri-o://1bc299580a1126dd103d36d2a5334ed2f27f44a858747420774f9d2ee844b53b" gracePeriod=600 Apr 16 15:16:13.791939 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:13.791889 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="thanos-sidecar" containerID="cri-o://2f050764615c00aaf3365cdd7caed88a943e9582ab97d57862c162a38f070bbb" gracePeriod=600 Apr 16 15:16:13.791939 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:13.791898 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="kube-rbac-proxy-web" containerID="cri-o://6b31a865b483a81985ae45748de9d24b219a92cf64b7f8c1831986212721ff58" gracePeriod=600 Apr 16 15:16:13.792241 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:13.791913 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="config-reloader" containerID="cri-o://5b734a7df38e41a109220986c4d14ed118b99e132eb2a3ff6eddf39a6dc005fd" gracePeriod=600 Apr 16 15:16:14.023745 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.023607 2575 generic.go:358] "Generic (PLEG): container finished" podID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerID="1bc299580a1126dd103d36d2a5334ed2f27f44a858747420774f9d2ee844b53b" exitCode=0 Apr 16 15:16:14.023745 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.023628 2575 generic.go:358] "Generic (PLEG): container finished" podID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerID="915050eeae71552dc31b2141b4025a18fbb68e938df27b809a8dcca05c51b0f3" exitCode=0 Apr 16 15:16:14.023745 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.023634 2575 generic.go:358] "Generic (PLEG): container finished" podID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerID="6b31a865b483a81985ae45748de9d24b219a92cf64b7f8c1831986212721ff58" exitCode=0 Apr 16 15:16:14.023745 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.023640 2575 generic.go:358] "Generic (PLEG): container finished" podID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerID="2f050764615c00aaf3365cdd7caed88a943e9582ab97d57862c162a38f070bbb" exitCode=0 Apr 16 15:16:14.023745 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.023645 2575 generic.go:358] "Generic (PLEG): container finished" podID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerID="5b734a7df38e41a109220986c4d14ed118b99e132eb2a3ff6eddf39a6dc005fd" exitCode=0 Apr 16 15:16:14.023745 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.023654 2575 generic.go:358] "Generic (PLEG): container finished" podID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerID="6813175824afe27daf4ca3755c89a8b3b01d8bc7ff309a39af8439d6519af6b7" exitCode=0 Apr 16 15:16:14.023745 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.023675 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c81635d-7a26-49be-bc0a-5604361cadc1","Type":"ContainerDied","Data":"1bc299580a1126dd103d36d2a5334ed2f27f44a858747420774f9d2ee844b53b"} Apr 16 15:16:14.023745 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.023707 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c81635d-7a26-49be-bc0a-5604361cadc1","Type":"ContainerDied","Data":"915050eeae71552dc31b2141b4025a18fbb68e938df27b809a8dcca05c51b0f3"} Apr 16 15:16:14.023745 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.023721 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c81635d-7a26-49be-bc0a-5604361cadc1","Type":"ContainerDied","Data":"6b31a865b483a81985ae45748de9d24b219a92cf64b7f8c1831986212721ff58"} Apr 16 15:16:14.023745 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.023735 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c81635d-7a26-49be-bc0a-5604361cadc1","Type":"ContainerDied","Data":"2f050764615c00aaf3365cdd7caed88a943e9582ab97d57862c162a38f070bbb"} Apr 16 15:16:14.023745 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.023747 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c81635d-7a26-49be-bc0a-5604361cadc1","Type":"ContainerDied","Data":"5b734a7df38e41a109220986c4d14ed118b99e132eb2a3ff6eddf39a6dc005fd"} Apr 16 15:16:14.024096 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.023759 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c81635d-7a26-49be-bc0a-5604361cadc1","Type":"ContainerDied","Data":"6813175824afe27daf4ca3755c89a8b3b01d8bc7ff309a39af8439d6519af6b7"} Apr 16 15:16:14.029751 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.029732 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:14.205684 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.205656 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-k8s-rulefiles-0\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.205821 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.205693 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-k8s-db\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.205821 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.205715 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-serving-certs-ca-bundle\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.205821 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.205743 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-trusted-ca-bundle\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.205821 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.205776 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-metrics-client-certs\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.205821 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.205805 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.206070 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.205836 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-grpc-tls\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.206070 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.205866 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-web-config\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.206070 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.205892 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c81635d-7a26-49be-bc0a-5604361cadc1-config-out\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.206070 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.205916 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.206070 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.205953 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-kube-rbac-proxy\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.206070 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.205986 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-kubelet-serving-ca-bundle\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.206070 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.206036 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-tls\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.206070 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.206071 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-config\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.206516 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.206098 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-metrics-client-ca\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.206516 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.206128 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:16:14.206516 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.206140 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8582\" (UniqueName: \"kubernetes.io/projected/4c81635d-7a26-49be-bc0a-5604361cadc1-kube-api-access-j8582\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.206516 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.206214 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-thanos-prometheus-http-client-file\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.206516 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.206244 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c81635d-7a26-49be-bc0a-5604361cadc1-tls-assets\") pod \"4c81635d-7a26-49be-bc0a-5604361cadc1\" (UID: \"4c81635d-7a26-49be-bc0a-5604361cadc1\") " Apr 16 15:16:14.206768 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.206585 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:16:14.206768 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.206643 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:14.206768 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.206713 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:16:14.207610 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.207118 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:16:14.208661 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.208634 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c81635d-7a26-49be-bc0a-5604361cadc1-kube-api-access-j8582" (OuterVolumeSpecName: "kube-api-access-j8582") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "kube-api-access-j8582". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:16:14.208661 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.208649 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:14.208978 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.208944 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:14.209064 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.208976 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c81635d-7a26-49be-bc0a-5604361cadc1-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:16:14.209127 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.209080 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:14.209185 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.209105 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:14.209381 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.209355 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:16:14.209517 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.209495 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:16:14.209808 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.209783 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:14.209871 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.209854 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:14.210990 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.210961 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-config" (OuterVolumeSpecName: "config") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:14.210990 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.210965 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c81635d-7a26-49be-bc0a-5604361cadc1-config-out" (OuterVolumeSpecName: "config-out") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:16:14.211107 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.211020 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:14.220746 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.220725 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-web-config" (OuterVolumeSpecName: "web-config") pod "4c81635d-7a26-49be-bc0a-5604361cadc1" (UID: "4c81635d-7a26-49be-bc0a-5604361cadc1"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:16:14.307118 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.307098 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:14.307118 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.307118 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-k8s-db\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:14.307234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.307127 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:14.307234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.307136 2575 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-metrics-client-certs\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:14.307234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.307146 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:14.307234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.307157 2575 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-grpc-tls\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:14.307234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.307164 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-web-config\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:14.307234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.307173 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c81635d-7a26-49be-bc0a-5604361cadc1-config-out\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:14.307234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.307181 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:14.307234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.307190 2575 
reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-kube-rbac-proxy\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:14.307234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.307198 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:14.307234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.307208 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:14.307234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.307217 2575 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-config\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:14.307234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.307225 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c81635d-7a26-49be-bc0a-5604361cadc1-configmap-metrics-client-ca\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:14.307234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.307234 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j8582\" (UniqueName: \"kubernetes.io/projected/4c81635d-7a26-49be-bc0a-5604361cadc1-kube-api-access-j8582\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:14.307625 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.307242 2575 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4c81635d-7a26-49be-bc0a-5604361cadc1-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:14.307625 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:14.307251 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c81635d-7a26-49be-bc0a-5604361cadc1-tls-assets\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:16:15.028711 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.028675 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c81635d-7a26-49be-bc0a-5604361cadc1","Type":"ContainerDied","Data":"68aab3ce3d5f38f74a937989c4cf32c471c446fab4185f4c2c314feed49ac5eb"} Apr 16 15:16:15.029078 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.028725 2575 scope.go:117] "RemoveContainer" containerID="1bc299580a1126dd103d36d2a5334ed2f27f44a858747420774f9d2ee844b53b" Apr 16 15:16:15.029078 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.028800 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.036220 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.036204 2575 scope.go:117] "RemoveContainer" containerID="915050eeae71552dc31b2141b4025a18fbb68e938df27b809a8dcca05c51b0f3" Apr 16 15:16:15.042886 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.042869 2575 scope.go:117] "RemoveContainer" containerID="6b31a865b483a81985ae45748de9d24b219a92cf64b7f8c1831986212721ff58" Apr 16 15:16:15.049177 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.049163 2575 scope.go:117] "RemoveContainer" containerID="2f050764615c00aaf3365cdd7caed88a943e9582ab97d57862c162a38f070bbb" Apr 16 15:16:15.053707 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.053685 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 15:16:15.055443 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.055413 2575 scope.go:117] "RemoveContainer" containerID="5b734a7df38e41a109220986c4d14ed118b99e132eb2a3ff6eddf39a6dc005fd" Apr 16 15:16:15.059596 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.059549 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 15:16:15.063070 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.063054 2575 scope.go:117] "RemoveContainer" containerID="6813175824afe27daf4ca3755c89a8b3b01d8bc7ff309a39af8439d6519af6b7" Apr 16 15:16:15.069654 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.069639 2575 scope.go:117] "RemoveContainer" containerID="3b1c9cae6eb83e47dcb4fa54828de7ca5ab5c13c4a7028d16454a9e084e2cf51" Apr 16 15:16:15.085030 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085012 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 15:16:15.085285 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085273 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="prometheus" Apr 16 15:16:15.085327 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085287 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="prometheus" Apr 16 15:16:15.085327 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085298 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="config-reloader" Apr 16 15:16:15.085327 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085304 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="config-reloader" Apr 16 15:16:15.085327 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085311 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="kube-rbac-proxy" Apr 16 15:16:15.085327 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085316 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="kube-rbac-proxy" Apr 16 15:16:15.085529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085350 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="kube-rbac-proxy-web" Apr 16 15:16:15.085529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085356 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="kube-rbac-proxy-web" Apr 16 15:16:15.085529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085366 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="thanos-sidecar" Apr 16 15:16:15.085529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085371 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="thanos-sidecar" Apr 16 15:16:15.085529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085378 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="init-config-reloader" Apr 16 15:16:15.085529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085384 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="init-config-reloader" Apr 16 15:16:15.085529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085390 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="kube-rbac-proxy-thanos" Apr 16 15:16:15.085529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085396 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="kube-rbac-proxy-thanos" Apr 16 15:16:15.085529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085405 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33e69575-fca9-4bac-82da-e821111ee7d3" containerName="registry" Apr 16 15:16:15.085529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085410 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="33e69575-fca9-4bac-82da-e821111ee7d3" containerName="registry" Apr 16 15:16:15.085529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085473 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="33e69575-fca9-4bac-82da-e821111ee7d3" containerName="registry" Apr 16 15:16:15.085529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085483 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="thanos-sidecar" Apr 16 15:16:15.085529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085489 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="kube-rbac-proxy-web" Apr 16 15:16:15.085529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085497 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="config-reloader" Apr 16 15:16:15.085529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085503 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="kube-rbac-proxy" Apr 16 15:16:15.085529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085509 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="prometheus" Apr 16 15:16:15.085529 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.085515 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" containerName="kube-rbac-proxy-thanos" Apr 16 15:16:15.090652 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.090636 2575 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.093197 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.093176 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-7s34jfjm6140i\"" Apr 16 15:16:15.093381 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.093361 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 15:16:15.093481 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.093400 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 15:16:15.093712 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.093687 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 15:16:15.093830 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.093739 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 15:16:15.093830 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.093786 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 15:16:15.093994 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.093793 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 15:16:15.093994 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.093858 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 15:16:15.094102 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.093997 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 15:16:15.094102 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.094039 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 15:16:15.094102 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.094074 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 15:16:15.094246 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.094114 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 15:16:15.094246 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.094133 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-d7rvz\"" Apr 16 15:16:15.095941 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.095919 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 15:16:15.098369 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.098350 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 15:16:15.103124 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.103102 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 
16 15:16:15.150302 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.150272 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c81635d-7a26-49be-bc0a-5604361cadc1" path="/var/lib/kubelet/pods/4c81635d-7a26-49be-bc0a-5604361cadc1/volumes" Apr 16 15:16:15.212466 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f0e55f10-0500-48e1-b262-7e6164b733bd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.212571 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212510 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f0e55f10-0500-48e1-b262-7e6164b733bd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.212571 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212540 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.212571 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212566 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0e55f10-0500-48e1-b262-7e6164b733bd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.212705 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212592 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0e55f10-0500-48e1-b262-7e6164b733bd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.212705 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212613 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.212705 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212657 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f0e55f10-0500-48e1-b262-7e6164b733bd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.212705 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212691 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-web-config\") pod 
\"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.212844 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212717 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.212844 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212737 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.212844 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212759 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f0e55f10-0500-48e1-b262-7e6164b733bd-config-out\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.212844 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212783 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0e55f10-0500-48e1-b262-7e6164b733bd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.212844 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212807 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.212844 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212822 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0e55f10-0500-48e1-b262-7e6164b733bd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.213050 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-config\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.213050 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212888 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.213050 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212920 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.213050 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.212945 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qp2s\" (UniqueName: \"kubernetes.io/projected/f0e55f10-0500-48e1-b262-7e6164b733bd-kube-api-access-8qp2s\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.313863 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.313793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0e55f10-0500-48e1-b262-7e6164b733bd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.313863 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.313834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0e55f10-0500-48e1-b262-7e6164b733bd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314038 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.313948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314038 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.313977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f0e55f10-0500-48e1-b262-7e6164b733bd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314038 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.314000 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-web-config\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314038 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.314026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314239 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.314056 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314239 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.314216 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f0e55f10-0500-48e1-b262-7e6164b733bd-config-out\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314340 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.314258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0e55f10-0500-48e1-b262-7e6164b733bd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314340 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.314291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314340 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.314318 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0e55f10-0500-48e1-b262-7e6164b733bd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314523 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.314345 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-config\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314523 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.314376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314523 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.314405 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314523 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.314466 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qp2s\" (UniqueName: 
\"kubernetes.io/projected/f0e55f10-0500-48e1-b262-7e6164b733bd-kube-api-access-8qp2s\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314523 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.314500 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f0e55f10-0500-48e1-b262-7e6164b733bd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314769 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.314536 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f0e55f10-0500-48e1-b262-7e6164b733bd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314769 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.314563 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314769 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.314584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0e55f10-0500-48e1-b262-7e6164b733bd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.314909 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.314766 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0e55f10-0500-48e1-b262-7e6164b733bd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.315410 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.315388 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0e55f10-0500-48e1-b262-7e6164b733bd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.317191 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.317167 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-web-config\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.317284 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.317200 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-config\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.317284 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.317235 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.318098 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.317544 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.318098 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.317566 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.318098 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.317809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f0e55f10-0500-48e1-b262-7e6164b733bd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.318098 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.317955 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0e55f10-0500-48e1-b262-7e6164b733bd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.318342 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.318197 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f0e55f10-0500-48e1-b262-7e6164b733bd-config-out\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.318342 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.318304 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f0e55f10-0500-48e1-b262-7e6164b733bd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.318924 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.318899 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.319483 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.319457 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.320022 ip-10-0-129-254 kubenswrapper[2575]: 
I0416 15:16:15.319997 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f0e55f10-0500-48e1-b262-7e6164b733bd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.320123 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.320029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.320186 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.320131 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f0e55f10-0500-48e1-b262-7e6164b733bd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.325541 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.325522 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qp2s\" (UniqueName: \"kubernetes.io/projected/f0e55f10-0500-48e1-b262-7e6164b733bd-kube-api-access-8qp2s\") pod \"prometheus-k8s-0\" (UID: \"f0e55f10-0500-48e1-b262-7e6164b733bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.402786 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.402762 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:15.529581 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:15.529560 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 15:16:15.531866 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:16:15.531839 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0e55f10_0500_48e1_b262_7e6164b733bd.slice/crio-ae9385f438b46cc0ea3c629b31f31be583b51c75a0e3efd31a4bd289dac1a60e WatchSource:0}: Error finding container ae9385f438b46cc0ea3c629b31f31be583b51c75a0e3efd31a4bd289dac1a60e: Status 404 returned error can't find the container with id ae9385f438b46cc0ea3c629b31f31be583b51c75a0e3efd31a4bd289dac1a60e Apr 16 15:16:16.033589 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:16.033553 2575 generic.go:358] "Generic (PLEG): container finished" podID="f0e55f10-0500-48e1-b262-7e6164b733bd" containerID="904e442190a87bb05f05ba2f9e85d61ba6f2e4376b7a8aef49ee5f6afdb13574" exitCode=0 Apr 16 15:16:16.033945 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:16.033599 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f0e55f10-0500-48e1-b262-7e6164b733bd","Type":"ContainerDied","Data":"904e442190a87bb05f05ba2f9e85d61ba6f2e4376b7a8aef49ee5f6afdb13574"} Apr 16 15:16:16.033945 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:16.033625 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f0e55f10-0500-48e1-b262-7e6164b733bd","Type":"ContainerStarted","Data":"ae9385f438b46cc0ea3c629b31f31be583b51c75a0e3efd31a4bd289dac1a60e"} Apr 16 15:16:17.039145 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:17.039111 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f0e55f10-0500-48e1-b262-7e6164b733bd","Type":"ContainerStarted","Data":"0a7bf9fff950ba90111b9a7bd0e09ea486ce009dfc1d37e70f0370cf897a9c6a"} Apr 16 15:16:17.039145 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:17.039143 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f0e55f10-0500-48e1-b262-7e6164b733bd","Type":"ContainerStarted","Data":"7f92e06f33a2d053da2e2bc1cbee91964fa6cc6e2c8736568092d7aab93bf511"} Apr 16 15:16:17.039145 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:17.039153 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f0e55f10-0500-48e1-b262-7e6164b733bd","Type":"ContainerStarted","Data":"7264bf3a738212cfc536ac84da7129f4567605044607971b3f4b8e211d7ed70a"} Apr 16 15:16:17.039618 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:17.039161 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f0e55f10-0500-48e1-b262-7e6164b733bd","Type":"ContainerStarted","Data":"0654c227fbab7c90ab91a4638121c70e759029eaefdd8e829b7de7b5e1093bd7"} Apr 16 15:16:17.039618 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:17.039170 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f0e55f10-0500-48e1-b262-7e6164b733bd","Type":"ContainerStarted","Data":"49e2bb86683f0b520fa0faee214f8c0df3e4974b0627c0e980d168eb8ab21b50"} Apr 16 15:16:17.039618 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:17.039177 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f0e55f10-0500-48e1-b262-7e6164b733bd","Type":"ContainerStarted","Data":"aa18ba475cde0f8bf5113f260f0235cde2065a3925f3df8a53291e65cfb5890b"} Apr 16 15:16:17.073608 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:17.073554 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.073539389 podStartE2EDuration="2.073539389s" podCreationTimestamp="2026-04-16 15:16:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:16:17.070563236 +0000 UTC m=+270.510103795" watchObservedRunningTime="2026-04-16 15:16:17.073539389 +0000 UTC m=+270.513079952" Apr 16 15:16:20.403530 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:20.403490 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:16:47.034073 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:47.034046 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:16:47.034702 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:47.034404 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:16:47.040921 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:16:47.040900 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 15:17:15.403886 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:17:15.403848 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
15:17:15.419390 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:17:15.419367 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:17:16.218546 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:17:16.218519 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 15:19:40.473556 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:19:40.473517 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8tfb"] Apr 16 15:19:40.476624 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:19:40.476606 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8tfb" Apr 16 15:19:40.479966 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:19:40.479942 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 15:19:40.480068 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:19:40.479990 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-r87nf\"" Apr 16 15:19:40.480110 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:19:40.480080 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 15:19:40.487355 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:19:40.487333 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8tfb"] Apr 16 15:19:40.497129 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:19:40.497105 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4v6p\" (UniqueName: \"kubernetes.io/projected/50b12eb1-0349-4174-91ff-cec8fd00e9df-kube-api-access-p4v6p\") pod \"openshift-lws-operator-bfc7f696d-k8tfb\" (UID: \"50b12eb1-0349-4174-91ff-cec8fd00e9df\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8tfb" Apr 16 15:19:40.497238 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:19:40.497155 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50b12eb1-0349-4174-91ff-cec8fd00e9df-tmp\") pod \"openshift-lws-operator-bfc7f696d-k8tfb\" (UID: \"50b12eb1-0349-4174-91ff-cec8fd00e9df\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8tfb" Apr 16 15:19:40.597778 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:19:40.597734 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4v6p\" (UniqueName: \"kubernetes.io/projected/50b12eb1-0349-4174-91ff-cec8fd00e9df-kube-api-access-p4v6p\") pod \"openshift-lws-operator-bfc7f696d-k8tfb\" (UID: \"50b12eb1-0349-4174-91ff-cec8fd00e9df\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8tfb" Apr 16 15:19:40.597927 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:19:40.597798 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50b12eb1-0349-4174-91ff-cec8fd00e9df-tmp\") pod \"openshift-lws-operator-bfc7f696d-k8tfb\" (UID: \"50b12eb1-0349-4174-91ff-cec8fd00e9df\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8tfb" Apr 16 15:19:40.598130 ip-10-0-129-254 kubenswrapper[2575]: I0416 
15:19:40.598115 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50b12eb1-0349-4174-91ff-cec8fd00e9df-tmp\") pod \"openshift-lws-operator-bfc7f696d-k8tfb\" (UID: \"50b12eb1-0349-4174-91ff-cec8fd00e9df\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8tfb" Apr 16 15:19:40.612852 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:19:40.612824 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4v6p\" (UniqueName: \"kubernetes.io/projected/50b12eb1-0349-4174-91ff-cec8fd00e9df-kube-api-access-p4v6p\") pod \"openshift-lws-operator-bfc7f696d-k8tfb\" (UID: \"50b12eb1-0349-4174-91ff-cec8fd00e9df\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8tfb" Apr 16 15:19:40.798693 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:19:40.798607 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8tfb" Apr 16 15:19:40.915914 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:19:40.915893 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8tfb"] Apr 16 15:19:40.917652 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:19:40.917621 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50b12eb1_0349_4174_91ff_cec8fd00e9df.slice/crio-fc4ca6768a306b64d4505d15d667f8adfda650afd821b9a78e216e6bb7f336f3 WatchSource:0}: Error finding container fc4ca6768a306b64d4505d15d667f8adfda650afd821b9a78e216e6bb7f336f3: Status 404 returned error can't find the container with id fc4ca6768a306b64d4505d15d667f8adfda650afd821b9a78e216e6bb7f336f3 Apr 16 15:19:40.919011 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:19:40.918995 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:19:41.598702 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:19:41.598667 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8tfb" event={"ID":"50b12eb1-0349-4174-91ff-cec8fd00e9df","Type":"ContainerStarted","Data":"fc4ca6768a306b64d4505d15d667f8adfda650afd821b9a78e216e6bb7f336f3"} Apr 16 15:19:44.608601 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:19:44.608559 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8tfb" event={"ID":"50b12eb1-0349-4174-91ff-cec8fd00e9df","Type":"ContainerStarted","Data":"53b2e779e947b41c128b9c6b9268872f81506ac0393542fcc16966b40f0d8924"} Apr 16 15:19:44.627616 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:19:44.627571 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8tfb" podStartSLOduration=1.959784771 podStartE2EDuration="4.627552599s" podCreationTimestamp="2026-04-16 15:19:40 +0000 UTC" firstStartedPulling="2026-04-16 15:19:40.919116276 +0000 UTC m=+474.358656815" lastFinishedPulling="2026-04-16 15:19:43.58688409 +0000 UTC m=+477.026424643" observedRunningTime="2026-04-16 15:19:44.626084376 +0000 UTC m=+478.065624936" watchObservedRunningTime="2026-04-16 15:19:44.627552599 +0000 UTC m=+478.067093159" Apr 16 15:20:03.158570 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.158532 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6"] 
Apr 16 15:20:03.167726 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.167685 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6"
Apr 16 15:20:03.173151 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.173118 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 16 15:20:03.173287 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.173162 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-tfg64\""
Apr 16 15:20:03.173646 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.173627 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 16 15:20:03.173971 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.173956 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 16 15:20:03.176576 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.176554 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 16 15:20:03.177844 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.177824 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6"]
Apr 16 15:20:03.271954 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.271919 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nhtv\" (UniqueName: \"kubernetes.io/projected/a0decef0-2289-44d3-b69e-9006bab3f5ee-kube-api-access-9nhtv\") pod \"opendatahub-operator-controller-manager-68df4b58f7-khzn6\" (UID: \"a0decef0-2289-44d3-b69e-9006bab3f5ee\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6"
Apr 16 15:20:03.272117 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.271973 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a0decef0-2289-44d3-b69e-9006bab3f5ee-webhook-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-khzn6\" (UID: \"a0decef0-2289-44d3-b69e-9006bab3f5ee\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6"
Apr 16 15:20:03.272117 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.272002 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a0decef0-2289-44d3-b69e-9006bab3f5ee-apiservice-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-khzn6\" (UID: \"a0decef0-2289-44d3-b69e-9006bab3f5ee\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6"
Apr 16 15:20:03.372866 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.372833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nhtv\" (UniqueName: \"kubernetes.io/projected/a0decef0-2289-44d3-b69e-9006bab3f5ee-kube-api-access-9nhtv\") pod \"opendatahub-operator-controller-manager-68df4b58f7-khzn6\" (UID: \"a0decef0-2289-44d3-b69e-9006bab3f5ee\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6"
Apr 16 15:20:03.373068 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.372900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a0decef0-2289-44d3-b69e-9006bab3f5ee-webhook-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-khzn6\" (UID: \"a0decef0-2289-44d3-b69e-9006bab3f5ee\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6"
Apr 16 15:20:03.373068 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.372929 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a0decef0-2289-44d3-b69e-9006bab3f5ee-apiservice-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-khzn6\" (UID: \"a0decef0-2289-44d3-b69e-9006bab3f5ee\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6"
Apr 16 15:20:03.375405 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.375377 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a0decef0-2289-44d3-b69e-9006bab3f5ee-webhook-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-khzn6\" (UID: \"a0decef0-2289-44d3-b69e-9006bab3f5ee\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6"
Apr 16 15:20:03.375541 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.375523 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a0decef0-2289-44d3-b69e-9006bab3f5ee-apiservice-cert\") pod \"opendatahub-operator-controller-manager-68df4b58f7-khzn6\" (UID: \"a0decef0-2289-44d3-b69e-9006bab3f5ee\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6"
Apr 16 15:20:03.383033 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.383010 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nhtv\" (UniqueName: \"kubernetes.io/projected/a0decef0-2289-44d3-b69e-9006bab3f5ee-kube-api-access-9nhtv\") pod \"opendatahub-operator-controller-manager-68df4b58f7-khzn6\" (UID: \"a0decef0-2289-44d3-b69e-9006bab3f5ee\") " pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6"
Apr 16 15:20:03.478692 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.478663 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6"
Apr 16 15:20:03.637043 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.636888 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6"]
Apr 16 15:20:03.640635 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:20:03.640599 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0decef0_2289_44d3_b69e_9006bab3f5ee.slice/crio-9d8a8e44c9cdf372e62cd8982e76c6aa2960034f06cedc0f7f7bfc2d12915142 WatchSource:0}: Error finding container 9d8a8e44c9cdf372e62cd8982e76c6aa2960034f06cedc0f7f7bfc2d12915142: Status 404 returned error can't find the container with id 9d8a8e44c9cdf372e62cd8982e76c6aa2960034f06cedc0f7f7bfc2d12915142
Apr 16 15:20:03.662260 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.662216 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6" event={"ID":"a0decef0-2289-44d3-b69e-9006bab3f5ee","Type":"ContainerStarted","Data":"9d8a8e44c9cdf372e62cd8982e76c6aa2960034f06cedc0f7f7bfc2d12915142"}
Apr 16 15:20:03.812656 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.812548 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"]
Apr 16 15:20:03.815728 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.815706 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"
Apr 16 15:20:03.818780 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.818752 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 15:20:03.818903 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.818780 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-gqln7\""
Apr 16 15:20:03.818903 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.818827 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 15:20:03.819030 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.819017 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 15:20:03.832777 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.832750 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"]
Apr 16 15:20:03.877910 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.877889 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07-manager-config\") pod \"lws-controller-manager-6fc585dfcd-fdfmc\" (UID: \"f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"
Apr 16 15:20:03.878033 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.877919 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6hhp\" (UniqueName: \"kubernetes.io/projected/f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07-kube-api-access-d6hhp\") pod \"lws-controller-manager-6fc585dfcd-fdfmc\" (UID: \"f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"
Apr 16 15:20:03.878033 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.877943 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07-cert\") pod \"lws-controller-manager-6fc585dfcd-fdfmc\" (UID: \"f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"
Apr 16 15:20:03.878033 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.878005 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07-metrics-cert\") pod \"lws-controller-manager-6fc585dfcd-fdfmc\" (UID: \"f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"
Apr 16 15:20:03.979176 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.979146 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07-manager-config\") pod \"lws-controller-manager-6fc585dfcd-fdfmc\" (UID: \"f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"
Apr 16 15:20:03.979312 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.979182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6hhp\" (UniqueName: \"kubernetes.io/projected/f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07-kube-api-access-d6hhp\") pod \"lws-controller-manager-6fc585dfcd-fdfmc\" (UID: \"f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"
Apr 16 15:20:03.979312 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.979206 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07-cert\") pod \"lws-controller-manager-6fc585dfcd-fdfmc\" (UID: \"f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"
Apr 16 15:20:03.979312 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.979235 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07-metrics-cert\") pod \"lws-controller-manager-6fc585dfcd-fdfmc\" (UID: \"f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"
Apr 16 15:20:03.979931 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.979911 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07-manager-config\") pod \"lws-controller-manager-6fc585dfcd-fdfmc\" (UID: \"f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"
Apr 16 15:20:03.981677 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.981658 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07-cert\") pod \"lws-controller-manager-6fc585dfcd-fdfmc\" (UID: \"f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"
Apr 16 15:20:03.981739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.981686 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07-metrics-cert\") pod \"lws-controller-manager-6fc585dfcd-fdfmc\" (UID: \"f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"
Apr 16 15:20:03.991063 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:03.991044 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6hhp\" (UniqueName: \"kubernetes.io/projected/f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07-kube-api-access-d6hhp\") pod \"lws-controller-manager-6fc585dfcd-fdfmc\" (UID: \"f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07\") " pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"
Apr 16 15:20:04.124772 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:04.124713 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"
Apr 16 15:20:04.260451 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:04.260391 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"]
Apr 16 15:20:04.262974 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:20:04.262929 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9bbda6a_86d0_46aa_bf73_a4e09cb4ea07.slice/crio-fc46d8bff95673cec273062782bf065f46478d801b5401e148ac384159633ae8 WatchSource:0}: Error finding container fc46d8bff95673cec273062782bf065f46478d801b5401e148ac384159633ae8: Status 404 returned error can't find the container with id fc46d8bff95673cec273062782bf065f46478d801b5401e148ac384159633ae8
Apr 16 15:20:04.667110 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:04.667072 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc" event={"ID":"f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07","Type":"ContainerStarted","Data":"fc46d8bff95673cec273062782bf065f46478d801b5401e148ac384159633ae8"}
Apr 16 15:20:06.679450 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:06.679340 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6" event={"ID":"a0decef0-2289-44d3-b69e-9006bab3f5ee","Type":"ContainerStarted","Data":"d401d1ea8fc63fb34b779e28765bd2c41b0c9bc3c7a47ea47158cd4619b478dc"}
Apr 16 15:20:06.709234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:06.709181 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6" podStartSLOduration=1.108378634 podStartE2EDuration="3.709164513s" podCreationTimestamp="2026-04-16 15:20:03 +0000 UTC" firstStartedPulling="2026-04-16 15:20:03.642579616 +0000 UTC m=+497.082120155" lastFinishedPulling="2026-04-16 15:20:06.243365496 +0000 UTC m=+499.682906034" observedRunningTime="2026-04-16 15:20:06.708484798 +0000 UTC m=+500.148025360" watchObservedRunningTime="2026-04-16 15:20:06.709164513 +0000 UTC m=+500.148705074"
Apr 16 15:20:07.684116 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:07.684077 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc" event={"ID":"f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07","Type":"ContainerStarted","Data":"fb68941d5d63ecc8f3c533e7b835512f53f03ebbf7153bf7d37e96273e0dde28"}
Apr 16 15:20:07.684518 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:07.684151 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6"
Apr 16 15:20:07.684518 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:07.684172 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"
Apr 16 15:20:07.706915 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:07.706869 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc" podStartSLOduration=1.911184169 podStartE2EDuration="4.706854863s" podCreationTimestamp="2026-04-16 15:20:03 +0000 UTC" firstStartedPulling="2026-04-16 15:20:04.265258831 +0000 UTC m=+497.704799373" lastFinishedPulling="2026-04-16 15:20:07.060929519 +0000 UTC m=+500.500470067" observedRunningTime="2026-04-16 15:20:07.705876683 +0000 UTC m=+501.145417242" watchObservedRunningTime="2026-04-16 15:20:07.706854863 +0000 UTC m=+501.146395423"
Apr 16 15:20:18.689487 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:18.689455 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6fc585dfcd-fdfmc"
Apr 16 15:20:18.689855 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:18.689521 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-68df4b58f7-khzn6"
Apr 16 15:20:58.469097 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.469025 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"]
Apr 16 15:20:58.478472 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.478447 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.481500 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.481468 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-fsdrx\""
Apr 16 15:20:58.481500 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.481486 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 15:20:58.481761 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.481744 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 15:20:58.481819 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.481792 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 15:20:58.487811 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.487789 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"]
Apr 16 15:20:58.507361 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.507337 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/045c7f45-5e91-4566-9224-a16f19f43f4d-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.507491 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.507372 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/045c7f45-5e91-4566-9224-a16f19f43f4d-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.507491 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.507393 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/045c7f45-5e91-4566-9224-a16f19f43f4d-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.507491 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.507415 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/045c7f45-5e91-4566-9224-a16f19f43f4d-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.507609 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.507500 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/045c7f45-5e91-4566-9224-a16f19f43f4d-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.507609 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.507534 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/045c7f45-5e91-4566-9224-a16f19f43f4d-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.507609 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.507558 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/045c7f45-5e91-4566-9224-a16f19f43f4d-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.507609 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.507602 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/045c7f45-5e91-4566-9224-a16f19f43f4d-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.507729 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.507631 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f92mp\" (UniqueName: \"kubernetes.io/projected/045c7f45-5e91-4566-9224-a16f19f43f4d-kube-api-access-f92mp\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.608564 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.608541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/045c7f45-5e91-4566-9224-a16f19f43f4d-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.608696 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.608570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/045c7f45-5e91-4566-9224-a16f19f43f4d-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.608696 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.608596 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/045c7f45-5e91-4566-9224-a16f19f43f4d-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.608696 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.608615 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/045c7f45-5e91-4566-9224-a16f19f43f4d-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.608696 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.608656 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/045c7f45-5e91-4566-9224-a16f19f43f4d-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.608696 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.608686 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/045c7f45-5e91-4566-9224-a16f19f43f4d-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.608947 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.608717 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/045c7f45-5e91-4566-9224-a16f19f43f4d-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.608947 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.608766 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/045c7f45-5e91-4566-9224-a16f19f43f4d-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.608947 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.608805 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f92mp\" (UniqueName: \"kubernetes.io/projected/045c7f45-5e91-4566-9224-a16f19f43f4d-kube-api-access-f92mp\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.609100 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.608981 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/045c7f45-5e91-4566-9224-a16f19f43f4d-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.609100 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.609087 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/045c7f45-5e91-4566-9224-a16f19f43f4d-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.609179 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.609158 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/045c7f45-5e91-4566-9224-a16f19f43f4d-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.609402 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.609369 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/045c7f45-5e91-4566-9224-a16f19f43f4d-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.609705 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.609685 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/045c7f45-5e91-4566-9224-a16f19f43f4d-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.611081 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.611058 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/045c7f45-5e91-4566-9224-a16f19f43f4d-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.611448 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.611410 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/045c7f45-5e91-4566-9224-a16f19f43f4d-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.616488 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.616466 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/045c7f45-5e91-4566-9224-a16f19f43f4d-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"
Apr 16 15:20:58.616592 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.616575 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-f92mp\" (UniqueName: \"kubernetes.io/projected/045c7f45-5e91-4566-9224-a16f19f43f4d-kube-api-access-f92mp\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg\" (UID: \"045c7f45-5e91-4566-9224-a16f19f43f4d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg" Apr 16 15:20:58.791520 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.791435 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg" Apr 16 15:20:58.917162 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:58.917128 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg"] Apr 16 15:20:58.920876 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:20:58.920838 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod045c7f45_5e91_4566_9224_a16f19f43f4d.slice/crio-9f090a3cf8fbda308d05c0575e98e0279e7c3bd5d6bab05197f35c21c6f08929 WatchSource:0}: Error finding container 9f090a3cf8fbda308d05c0575e98e0279e7c3bd5d6bab05197f35c21c6f08929: Status 404 returned error can't find the container with id 9f090a3cf8fbda308d05c0575e98e0279e7c3bd5d6bab05197f35c21c6f08929 Apr 16 15:20:59.848940 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:20:59.848899 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg" event={"ID":"045c7f45-5e91-4566-9224-a16f19f43f4d","Type":"ContainerStarted","Data":"9f090a3cf8fbda308d05c0575e98e0279e7c3bd5d6bab05197f35c21c6f08929"} Apr 16 15:21:01.533686 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:01.533636 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 15:21:01.533975 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:01.533717 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 15:21:01.533975 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:01.533753 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 15:21:01.856997 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:01.856908 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg" event={"ID":"045c7f45-5e91-4566-9224-a16f19f43f4d","Type":"ContainerStarted","Data":"5d7cb7f747e85dda872acc3845fbce941f95c4a774002a793e1d324b3aea9860"} Apr 16 15:21:01.879200 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:01.879146 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg" podStartSLOduration=1.268460403 podStartE2EDuration="3.879129175s" podCreationTimestamp="2026-04-16 15:20:58 +0000 UTC" firstStartedPulling="2026-04-16 15:20:58.922698098 +0000 UTC m=+552.362238636" lastFinishedPulling="2026-04-16 15:21:01.533366866 +0000 UTC m=+554.972907408" observedRunningTime="2026-04-16 15:21:01.87714855 +0000 UTC m=+555.316689110" 
watchObservedRunningTime="2026-04-16 15:21:01.879129175 +0000 UTC m=+555.318669736" Apr 16 15:21:02.792008 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:02.791974 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg" Apr 16 15:21:02.796448 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:02.796399 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg" Apr 16 15:21:02.860195 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:02.860169 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg" Apr 16 15:21:02.861087 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:02.861068 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg" Apr 16 15:21:21.786201 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:21.786167 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-cx228"] Apr 16 15:21:21.789609 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:21.789593 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-cx228" Apr 16 15:21:21.792528 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:21.792506 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 15:21:21.792682 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:21.792663 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-b8xd2\"" Apr 16 15:21:21.793747 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:21.793732 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 15:21:21.797562 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:21.797540 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-cx228"] Apr 16 15:21:21.902272 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:21.902239 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrqtt\" (UniqueName: \"kubernetes.io/projected/1a001c62-8cbf-4beb-9c3a-73aef69265ab-kube-api-access-jrqtt\") pod \"kuadrant-operator-catalog-cx228\" (UID: \"1a001c62-8cbf-4beb-9c3a-73aef69265ab\") " pod="kuadrant-system/kuadrant-operator-catalog-cx228" Apr 16 15:21:22.003181 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:22.003145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrqtt\" (UniqueName: \"kubernetes.io/projected/1a001c62-8cbf-4beb-9c3a-73aef69265ab-kube-api-access-jrqtt\") pod \"kuadrant-operator-catalog-cx228\" (UID: \"1a001c62-8cbf-4beb-9c3a-73aef69265ab\") " pod="kuadrant-system/kuadrant-operator-catalog-cx228" Apr 16 15:21:22.010917 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:22.010884 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrqtt\" (UniqueName: \"kubernetes.io/projected/1a001c62-8cbf-4beb-9c3a-73aef69265ab-kube-api-access-jrqtt\") pod \"kuadrant-operator-catalog-cx228\" (UID: 
\"1a001c62-8cbf-4beb-9c3a-73aef69265ab\") " pod="kuadrant-system/kuadrant-operator-catalog-cx228" Apr 16 15:21:22.099973 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:22.099882 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-cx228" Apr 16 15:21:22.142281 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:22.142253 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-cx228"] Apr 16 15:21:22.217027 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:22.216999 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-cx228"] Apr 16 15:21:22.219407 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:21:22.219380 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a001c62_8cbf_4beb_9c3a_73aef69265ab.slice/crio-bb24ea9e94bc759b05348bce4009995a0dd23b1aa6c5371928ba4b1f5b1d6ba2 WatchSource:0}: Error finding container bb24ea9e94bc759b05348bce4009995a0dd23b1aa6c5371928ba4b1f5b1d6ba2: Status 404 returned error can't find the container with id bb24ea9e94bc759b05348bce4009995a0dd23b1aa6c5371928ba4b1f5b1d6ba2 Apr 16 15:21:22.353100 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:22.353026 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8hsds"] Apr 16 15:21:22.357686 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:22.357667 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-8hsds" Apr 16 15:21:22.364372 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:22.364332 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8hsds"] Apr 16 15:21:22.507986 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:22.507957 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktnbh\" (UniqueName: \"kubernetes.io/projected/081a4915-16f4-4cb3-9eb4-9ac457748dcc-kube-api-access-ktnbh\") pod \"kuadrant-operator-catalog-8hsds\" (UID: \"081a4915-16f4-4cb3-9eb4-9ac457748dcc\") " pod="kuadrant-system/kuadrant-operator-catalog-8hsds" Apr 16 15:21:22.608703 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:22.608626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktnbh\" (UniqueName: \"kubernetes.io/projected/081a4915-16f4-4cb3-9eb4-9ac457748dcc-kube-api-access-ktnbh\") pod \"kuadrant-operator-catalog-8hsds\" (UID: \"081a4915-16f4-4cb3-9eb4-9ac457748dcc\") " pod="kuadrant-system/kuadrant-operator-catalog-8hsds" Apr 16 15:21:22.616850 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:22.616825 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktnbh\" (UniqueName: \"kubernetes.io/projected/081a4915-16f4-4cb3-9eb4-9ac457748dcc-kube-api-access-ktnbh\") pod \"kuadrant-operator-catalog-8hsds\" (UID: \"081a4915-16f4-4cb3-9eb4-9ac457748dcc\") " pod="kuadrant-system/kuadrant-operator-catalog-8hsds" Apr 16 15:21:22.668913 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:22.668889 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-8hsds" Apr 16 15:21:22.796546 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:22.796516 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8hsds"] Apr 16 15:21:22.815168 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:21:22.815124 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod081a4915_16f4_4cb3_9eb4_9ac457748dcc.slice/crio-d2679ebb73584f57b4a5dfe2a0b1c6db21a58b921e4f58b88fdd226396f70f68 WatchSource:0}: Error finding container d2679ebb73584f57b4a5dfe2a0b1c6db21a58b921e4f58b88fdd226396f70f68: Status 404 returned error can't find the container with id d2679ebb73584f57b4a5dfe2a0b1c6db21a58b921e4f58b88fdd226396f70f68 Apr 16 15:21:22.927523 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:22.927448 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-cx228" event={"ID":"1a001c62-8cbf-4beb-9c3a-73aef69265ab","Type":"ContainerStarted","Data":"bb24ea9e94bc759b05348bce4009995a0dd23b1aa6c5371928ba4b1f5b1d6ba2"} Apr 16 15:21:22.928644 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:22.928619 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-8hsds" event={"ID":"081a4915-16f4-4cb3-9eb4-9ac457748dcc","Type":"ContainerStarted","Data":"d2679ebb73584f57b4a5dfe2a0b1c6db21a58b921e4f58b88fdd226396f70f68"} Apr 16 15:21:24.937165 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:24.937123 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-cx228" event={"ID":"1a001c62-8cbf-4beb-9c3a-73aef69265ab","Type":"ContainerStarted","Data":"8a4a3b6815d523e2d77594596859eb5ccec205a68670ab2050bb233a45176d31"} Apr 16 15:21:24.937599 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:24.937213 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-cx228" podUID="1a001c62-8cbf-4beb-9c3a-73aef69265ab" containerName="registry-server" containerID="cri-o://8a4a3b6815d523e2d77594596859eb5ccec205a68670ab2050bb233a45176d31" gracePeriod=2 Apr 16 15:21:24.938449 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:24.938404 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-8hsds" event={"ID":"081a4915-16f4-4cb3-9eb4-9ac457748dcc","Type":"ContainerStarted","Data":"98f96fe2e0d237c9cd93df64950708a654875f2c06fc3575970ab6cb8407cc43"} Apr 16 15:21:24.953658 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:24.953617 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-cx228" podStartSLOduration=1.903543404 podStartE2EDuration="3.953605715s" podCreationTimestamp="2026-04-16 15:21:21 +0000 UTC" firstStartedPulling="2026-04-16 15:21:22.221106731 +0000 UTC m=+575.660647274" lastFinishedPulling="2026-04-16 15:21:24.271169038 +0000 UTC m=+577.710709585" observedRunningTime="2026-04-16 15:21:24.95189873 +0000 UTC m=+578.391439327" watchObservedRunningTime="2026-04-16 15:21:24.953605715 +0000 UTC m=+578.393146274" Apr 16 15:21:24.967578 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:24.967541 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-8hsds" podStartSLOduration=1.509845079 podStartE2EDuration="2.967529651s" podCreationTimestamp="2026-04-16 
15:21:22 +0000 UTC" firstStartedPulling="2026-04-16 15:21:22.816491204 +0000 UTC m=+576.256031741" lastFinishedPulling="2026-04-16 15:21:24.27417577 +0000 UTC m=+577.713716313" observedRunningTime="2026-04-16 15:21:24.966626455 +0000 UTC m=+578.406167037" watchObservedRunningTime="2026-04-16 15:21:24.967529651 +0000 UTC m=+578.407070210" Apr 16 15:21:25.176800 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:25.176779 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-cx228" Apr 16 15:21:25.333181 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:25.333092 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrqtt\" (UniqueName: \"kubernetes.io/projected/1a001c62-8cbf-4beb-9c3a-73aef69265ab-kube-api-access-jrqtt\") pod \"1a001c62-8cbf-4beb-9c3a-73aef69265ab\" (UID: \"1a001c62-8cbf-4beb-9c3a-73aef69265ab\") " Apr 16 15:21:25.335271 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:25.335246 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a001c62-8cbf-4beb-9c3a-73aef69265ab-kube-api-access-jrqtt" (OuterVolumeSpecName: "kube-api-access-jrqtt") pod "1a001c62-8cbf-4beb-9c3a-73aef69265ab" (UID: "1a001c62-8cbf-4beb-9c3a-73aef69265ab"). InnerVolumeSpecName "kube-api-access-jrqtt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:21:25.433808 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:25.433771 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jrqtt\" (UniqueName: \"kubernetes.io/projected/1a001c62-8cbf-4beb-9c3a-73aef69265ab-kube-api-access-jrqtt\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:21:25.943163 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:25.943131 2575 generic.go:358] "Generic (PLEG): container finished" podID="1a001c62-8cbf-4beb-9c3a-73aef69265ab" containerID="8a4a3b6815d523e2d77594596859eb5ccec205a68670ab2050bb233a45176d31" exitCode=0 Apr 16 15:21:25.943558 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:25.943190 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-cx228" Apr 16 15:21:25.943558 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:25.943209 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-cx228" event={"ID":"1a001c62-8cbf-4beb-9c3a-73aef69265ab","Type":"ContainerDied","Data":"8a4a3b6815d523e2d77594596859eb5ccec205a68670ab2050bb233a45176d31"} Apr 16 15:21:25.943558 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:25.943243 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-cx228" event={"ID":"1a001c62-8cbf-4beb-9c3a-73aef69265ab","Type":"ContainerDied","Data":"bb24ea9e94bc759b05348bce4009995a0dd23b1aa6c5371928ba4b1f5b1d6ba2"} Apr 16 15:21:25.943558 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:25.943257 2575 scope.go:117] "RemoveContainer" containerID="8a4a3b6815d523e2d77594596859eb5ccec205a68670ab2050bb233a45176d31" Apr 16 15:21:25.951912 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:25.951897 2575 scope.go:117] "RemoveContainer" containerID="8a4a3b6815d523e2d77594596859eb5ccec205a68670ab2050bb233a45176d31" Apr 16 15:21:25.952139 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:21:25.952124 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a4a3b6815d523e2d77594596859eb5ccec205a68670ab2050bb233a45176d31\": container with ID starting with 8a4a3b6815d523e2d77594596859eb5ccec205a68670ab2050bb233a45176d31 not found: ID does not exist" containerID="8a4a3b6815d523e2d77594596859eb5ccec205a68670ab2050bb233a45176d31" Apr 16 15:21:25.952173 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:25.952146 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a4a3b6815d523e2d77594596859eb5ccec205a68670ab2050bb233a45176d31"} err="failed to get container status \"8a4a3b6815d523e2d77594596859eb5ccec205a68670ab2050bb233a45176d31\": rpc error: code = NotFound desc = could not find container \"8a4a3b6815d523e2d77594596859eb5ccec205a68670ab2050bb233a45176d31\": container with ID starting with 8a4a3b6815d523e2d77594596859eb5ccec205a68670ab2050bb233a45176d31 not found: ID does not exist" Apr 16 15:21:25.962983 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:25.962964 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-cx228"] Apr 16 15:21:25.967057 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:25.967036 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-cx228"] Apr 16 15:21:27.151669 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:27.151637 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a001c62-8cbf-4beb-9c3a-73aef69265ab" path="/var/lib/kubelet/pods/1a001c62-8cbf-4beb-9c3a-73aef69265ab/volumes" Apr 16 15:21:32.669231 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:32.669197 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-8hsds" Apr 16 15:21:32.669778 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:32.669273 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-8hsds" Apr 16 15:21:32.690275 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:32.690251 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-8hsds" Apr 16 
15:21:32.988198 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:32.988174 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-8hsds" Apr 16 15:21:47.058324 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:47.058291 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:21:47.058825 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:47.058473 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:21:53.265030 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:53.264994 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-t5d2r"] Apr 16 15:21:53.265392 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:53.265372 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a001c62-8cbf-4beb-9c3a-73aef69265ab" containerName="registry-server" Apr 16 15:21:53.265392 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:53.265384 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a001c62-8cbf-4beb-9c3a-73aef69265ab" containerName="registry-server" Apr 16 15:21:53.265498 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:53.265456 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a001c62-8cbf-4beb-9c3a-73aef69265ab" containerName="registry-server" Apr 16 15:21:53.267596 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:53.267580 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t5d2r" Apr 16 15:21:53.279126 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:53.279105 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-dr579\"" Apr 16 15:21:53.279230 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:53.279105 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 15:21:53.302128 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:53.302101 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-t5d2r"] Apr 16 15:21:53.361934 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:53.361909 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxwcb\" (UniqueName: \"kubernetes.io/projected/6f81a8e1-37e0-42a5-9fa8-be64e757d67c-kube-api-access-gxwcb\") pod \"dns-operator-controller-manager-648d5c98bc-t5d2r\" (UID: \"6f81a8e1-37e0-42a5-9fa8-be64e757d67c\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t5d2r" Apr 16 15:21:53.462592 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:53.462567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxwcb\" (UniqueName: \"kubernetes.io/projected/6f81a8e1-37e0-42a5-9fa8-be64e757d67c-kube-api-access-gxwcb\") pod \"dns-operator-controller-manager-648d5c98bc-t5d2r\" (UID: \"6f81a8e1-37e0-42a5-9fa8-be64e757d67c\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t5d2r" Apr 16 15:21:53.476104 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:53.476077 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gxwcb\" (UniqueName: \"kubernetes.io/projected/6f81a8e1-37e0-42a5-9fa8-be64e757d67c-kube-api-access-gxwcb\") pod \"dns-operator-controller-manager-648d5c98bc-t5d2r\" (UID: \"6f81a8e1-37e0-42a5-9fa8-be64e757d67c\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t5d2r" Apr 16 15:21:53.577243 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:53.577178 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t5d2r" Apr 16 15:21:53.716896 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:53.716862 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-t5d2r"] Apr 16 15:21:53.720515 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:21:53.720488 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f81a8e1_37e0_42a5_9fa8_be64e757d67c.slice/crio-4aa68a5eb6108c1bd7f189c6acdbfeec8f3c460359c9e8b433d52eebb2b5c578 WatchSource:0}: Error finding container 4aa68a5eb6108c1bd7f189c6acdbfeec8f3c460359c9e8b433d52eebb2b5c578: Status 404 returned error can't find the container with id 4aa68a5eb6108c1bd7f189c6acdbfeec8f3c460359c9e8b433d52eebb2b5c578 Apr 16 15:21:54.033750 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:54.033706 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t5d2r" event={"ID":"6f81a8e1-37e0-42a5-9fa8-be64e757d67c","Type":"ContainerStarted","Data":"4aa68a5eb6108c1bd7f189c6acdbfeec8f3c460359c9e8b433d52eebb2b5c578"} Apr 16 15:21:56.715430 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:56.715391 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr"] Apr 16 15:21:56.719412 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:56.719396 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr" Apr 16 15:21:56.722362 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:56.722343 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-hwkkt\"" Apr 16 15:21:56.734384 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:56.734362 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr"] Apr 16 15:21:56.792625 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:56.792599 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndtdt\" (UniqueName: \"kubernetes.io/projected/0dad1ffb-fd48-4864-a018-ae81473d92ca-kube-api-access-ndtdt\") pod \"limitador-operator-controller-manager-85c4996f8c-s77cr\" (UID: \"0dad1ffb-fd48-4864-a018-ae81473d92ca\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr" Apr 16 15:21:56.893101 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:56.893065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndtdt\" (UniqueName: \"kubernetes.io/projected/0dad1ffb-fd48-4864-a018-ae81473d92ca-kube-api-access-ndtdt\") pod \"limitador-operator-controller-manager-85c4996f8c-s77cr\" (UID: \"0dad1ffb-fd48-4864-a018-ae81473d92ca\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr" Apr 16 15:21:56.902274 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:56.902245 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndtdt\" (UniqueName: \"kubernetes.io/projected/0dad1ffb-fd48-4864-a018-ae81473d92ca-kube-api-access-ndtdt\") pod \"limitador-operator-controller-manager-85c4996f8c-s77cr\" (UID: \"0dad1ffb-fd48-4864-a018-ae81473d92ca\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr" Apr 16 15:21:57.029475 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:57.029370 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr" Apr 16 15:21:57.045058 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:57.045028 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t5d2r" event={"ID":"6f81a8e1-37e0-42a5-9fa8-be64e757d67c","Type":"ContainerStarted","Data":"b0e8def424b0f8b228e97d83bd6478e1285ce463edc24b61c6eda8425e3ac7e9"} Apr 16 15:21:57.045220 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:57.045200 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t5d2r" Apr 16 15:21:57.089946 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:57.089892 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t5d2r" podStartSLOduration=1.732788599 podStartE2EDuration="4.089872678s" podCreationTimestamp="2026-04-16 15:21:53 +0000 UTC" firstStartedPulling="2026-04-16 15:21:53.723022171 +0000 UTC m=+607.162562708" lastFinishedPulling="2026-04-16 15:21:56.080106235 +0000 UTC m=+609.519646787" observedRunningTime="2026-04-16 15:21:57.087022849 +0000 UTC m=+610.526563447" watchObservedRunningTime="2026-04-16 15:21:57.089872678 +0000 UTC m=+610.529413274" Apr 16 15:21:57.170851 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:57.170800 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr"] Apr 16 15:21:57.174675 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:21:57.174642 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dad1ffb_fd48_4864_a018_ae81473d92ca.slice/crio-cbfaf03134433046d4ab14006880800b11cc4b5f35990bb59e8888abca0dbeec WatchSource:0}: Error finding container cbfaf03134433046d4ab14006880800b11cc4b5f35990bb59e8888abca0dbeec: Status 404 returned error can't find the container with id cbfaf03134433046d4ab14006880800b11cc4b5f35990bb59e8888abca0dbeec Apr 16 15:21:58.049142 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:21:58.049110 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr" event={"ID":"0dad1ffb-fd48-4864-a018-ae81473d92ca","Type":"ContainerStarted","Data":"cbfaf03134433046d4ab14006880800b11cc4b5f35990bb59e8888abca0dbeec"} Apr 16 15:22:00.072813 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:00.072772 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr" event={"ID":"0dad1ffb-fd48-4864-a018-ae81473d92ca","Type":"ContainerStarted","Data":"771fcee1267c17dbc64cbbda5e21cdf17a0651f87a0c8ae02f4d592ddefa83fc"} Apr 16 15:22:00.073248 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:00.072906 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr" Apr 16 15:22:00.094297 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:00.094241 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr" podStartSLOduration=2.002345461 podStartE2EDuration="4.094226s" podCreationTimestamp="2026-04-16 15:21:56 +0000 UTC" firstStartedPulling="2026-04-16 15:21:57.17651618 +0000 UTC m=+610.616056718" 
lastFinishedPulling="2026-04-16 15:21:59.268396716 +0000 UTC m=+612.707937257" observedRunningTime="2026-04-16 15:22:00.092583254 +0000 UTC m=+613.532123814" watchObservedRunningTime="2026-04-16 15:22:00.094226 +0000 UTC m=+613.533766560" Apr 16 15:22:08.051162 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:08.051084 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t5d2r" Apr 16 15:22:08.126630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:08.126597 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7"] Apr 16 15:22:08.129982 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:08.129951 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" Apr 16 15:22:08.133049 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:08.133022 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-v7gzl\"" Apr 16 15:22:08.134629 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:08.134602 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7"] Apr 16 15:22:08.186825 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:08.186802 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/23b7fddf-7292-4eff-b971-0f425dc689d0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9rll7\" (UID: \"23b7fddf-7292-4eff-b971-0f425dc689d0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" Apr 16 15:22:08.186942 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:08.186861 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4czc\" (UniqueName: \"kubernetes.io/projected/23b7fddf-7292-4eff-b971-0f425dc689d0-kube-api-access-t4czc\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9rll7\" (UID: \"23b7fddf-7292-4eff-b971-0f425dc689d0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" Apr 16 15:22:08.287566 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:08.287532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4czc\" (UniqueName: \"kubernetes.io/projected/23b7fddf-7292-4eff-b971-0f425dc689d0-kube-api-access-t4czc\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9rll7\" (UID: \"23b7fddf-7292-4eff-b971-0f425dc689d0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" Apr 16 15:22:08.287713 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:08.287590 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/23b7fddf-7292-4eff-b971-0f425dc689d0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9rll7\" (UID: \"23b7fddf-7292-4eff-b971-0f425dc689d0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" Apr 16 15:22:08.287947 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:08.287932 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/23b7fddf-7292-4eff-b971-0f425dc689d0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9rll7\" (UID: \"23b7fddf-7292-4eff-b971-0f425dc689d0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" Apr 16 15:22:08.297586 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:08.297560 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4czc\" (UniqueName: \"kubernetes.io/projected/23b7fddf-7292-4eff-b971-0f425dc689d0-kube-api-access-t4czc\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9rll7\" (UID: \"23b7fddf-7292-4eff-b971-0f425dc689d0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" Apr 16 15:22:08.441164 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:08.441132 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" Apr 16 15:22:08.574695 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:08.574662 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7"] Apr 16 15:22:08.576553 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:22:08.576526 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23b7fddf_7292_4eff_b971_0f425dc689d0.slice/crio-1d2dae7ef04c6af7d01a5e9b9faf44664cf2ea60911ae79db39ea4ea9d16f314 WatchSource:0}: Error finding container 1d2dae7ef04c6af7d01a5e9b9faf44664cf2ea60911ae79db39ea4ea9d16f314: Status 404 returned error can't find the container with id 1d2dae7ef04c6af7d01a5e9b9faf44664cf2ea60911ae79db39ea4ea9d16f314 Apr 16 15:22:09.103570 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:09.103537 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" event={"ID":"23b7fddf-7292-4eff-b971-0f425dc689d0","Type":"ContainerStarted","Data":"1d2dae7ef04c6af7d01a5e9b9faf44664cf2ea60911ae79db39ea4ea9d16f314"} Apr 16 15:22:11.078173 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:11.078145 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr" Apr 16 15:22:15.126565 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:15.126533 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" event={"ID":"23b7fddf-7292-4eff-b971-0f425dc689d0","Type":"ContainerStarted","Data":"5d977e2edd68c5c6d00b972b626736fd446d9b117532a9294afa7d7d3b5d79d1"} Apr 16 15:22:15.126944 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:15.126589 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" Apr 16 15:22:15.148302 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:15.148229 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" podStartSLOduration=1.242805115 podStartE2EDuration="7.148213285s" podCreationTimestamp="2026-04-16 15:22:08 +0000 UTC" firstStartedPulling="2026-04-16 15:22:08.581915254 +0000 UTC m=+622.021455806" lastFinishedPulling="2026-04-16 15:22:14.487323438 +0000 UTC m=+627.926863976" observedRunningTime="2026-04-16 15:22:15.146733033 +0000 UTC m=+628.586273595" 
watchObservedRunningTime="2026-04-16 15:22:15.148213285 +0000 UTC m=+628.587753846" Apr 16 15:22:26.132155 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:26.132125 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" Apr 16 15:22:27.088674 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.088639 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7"] Apr 16 15:22:27.088881 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.088861 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" podUID="23b7fddf-7292-4eff-b971-0f425dc689d0" containerName="manager" containerID="cri-o://5d977e2edd68c5c6d00b972b626736fd446d9b117532a9294afa7d7d3b5d79d1" gracePeriod=2 Apr 16 15:22:27.101178 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.101151 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7"] Apr 16 15:22:27.113650 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.113624 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr"] Apr 16 15:22:27.113906 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.113861 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr" podUID="0dad1ffb-fd48-4864-a018-ae81473d92ca" containerName="manager" containerID="cri-o://771fcee1267c17dbc64cbbda5e21cdf17a0651f87a0c8ae02f4d592ddefa83fc" gracePeriod=2 Apr 16 15:22:27.123810 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.123775 2575 status_manager.go:895] "Failed to get status for pod" podUID="23b7fddf-7292-4eff-b971-0f425dc689d0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-9rll7\" is forbidden: User \"system:node:ip-10-0-129-254.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-254.ec2.internal' and this object" Apr 16 15:22:27.124010 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.123987 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr"] Apr 16 15:22:27.133564 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.133543 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx"] Apr 16 15:22:27.134023 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.134007 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0dad1ffb-fd48-4864-a018-ae81473d92ca" containerName="manager" Apr 16 15:22:27.134023 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.134024 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dad1ffb-fd48-4864-a018-ae81473d92ca" containerName="manager" Apr 16 15:22:27.134140 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.134050 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23b7fddf-7292-4eff-b971-0f425dc689d0" containerName="manager" Apr 16 15:22:27.134140 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.134056 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="23b7fddf-7292-4eff-b971-0f425dc689d0" containerName="manager" Apr 16 15:22:27.134140 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.134107 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="23b7fddf-7292-4eff-b971-0f425dc689d0" containerName="manager" Apr 16 15:22:27.134140 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.134121 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0dad1ffb-fd48-4864-a018-ae81473d92ca" containerName="manager" Apr 16 15:22:27.135842 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.135821 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx" Apr 16 15:22:27.138061 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.138036 2575 status_manager.go:895] "Failed to get status for pod" podUID="23b7fddf-7292-4eff-b971-0f425dc689d0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-9rll7\" is forbidden: User \"system:node:ip-10-0-129-254.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-254.ec2.internal' and this object" Apr 16 15:22:27.142842 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.142818 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-jmk6z"] Apr 16 15:22:27.145105 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.145087 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-jmk6z" Apr 16 15:22:27.157705 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.157675 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx"] Apr 16 15:22:27.161008 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.160985 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-jmk6z"] Apr 16 15:22:27.243555 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.243529 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f8f39adb-80b4-4846-a5ee-c5effb63e096-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cs4wx\" (UID: \"f8f39adb-80b4-4846-a5ee-c5effb63e096\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx" Apr 16 15:22:27.243713 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.243570 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8phj\" (UniqueName: \"kubernetes.io/projected/d3e39f88-26ca-46bd-85e2-ab2cfe69e539-kube-api-access-g8phj\") pod \"limitador-operator-controller-manager-85c4996f8c-jmk6z\" (UID: \"d3e39f88-26ca-46bd-85e2-ab2cfe69e539\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-jmk6z" Apr 16 15:22:27.243796 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.243774 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dklgb\" (UniqueName: \"kubernetes.io/projected/f8f39adb-80b4-4846-a5ee-c5effb63e096-kube-api-access-dklgb\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cs4wx\" (UID: 
\"f8f39adb-80b4-4846-a5ee-c5effb63e096\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx" Apr 16 15:22:27.344344 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.344322 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dklgb\" (UniqueName: \"kubernetes.io/projected/f8f39adb-80b4-4846-a5ee-c5effb63e096-kube-api-access-dklgb\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cs4wx\" (UID: \"f8f39adb-80b4-4846-a5ee-c5effb63e096\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx" Apr 16 15:22:27.344490 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.344375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f8f39adb-80b4-4846-a5ee-c5effb63e096-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cs4wx\" (UID: \"f8f39adb-80b4-4846-a5ee-c5effb63e096\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx" Apr 16 15:22:27.344490 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.344389 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" Apr 16 15:22:27.344490 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.344410 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8phj\" (UniqueName: \"kubernetes.io/projected/d3e39f88-26ca-46bd-85e2-ab2cfe69e539-kube-api-access-g8phj\") pod \"limitador-operator-controller-manager-85c4996f8c-jmk6z\" (UID: \"d3e39f88-26ca-46bd-85e2-ab2cfe69e539\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-jmk6z" Apr 16 15:22:27.344786 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.344766 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f8f39adb-80b4-4846-a5ee-c5effb63e096-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cs4wx\" (UID: \"f8f39adb-80b4-4846-a5ee-c5effb63e096\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx" Apr 16 15:22:27.346913 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.346884 2575 status_manager.go:895] "Failed to get status for pod" podUID="23b7fddf-7292-4eff-b971-0f425dc689d0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-9rll7\" is forbidden: User \"system:node:ip-10-0-129-254.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-254.ec2.internal' and this object" Apr 16 15:22:27.347455 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.347442 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr"
Apr 16 15:22:27.349381 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.349362 2575 status_manager.go:895] "Failed to get status for pod" podUID="23b7fddf-7292-4eff-b971-0f425dc689d0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-9rll7\" is forbidden: User \"system:node:ip-10-0-129-254.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-254.ec2.internal' and this object"
Apr 16 15:22:27.351187 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.351160 2575 status_manager.go:895] "Failed to get status for pod" podUID="0dad1ffb-fd48-4864-a018-ae81473d92ca" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr" err="pods \"limitador-operator-controller-manager-85c4996f8c-s77cr\" is forbidden: User \"system:node:ip-10-0-129-254.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-254.ec2.internal' and this object"
Apr 16 15:22:27.352413 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.352394 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dklgb\" (UniqueName: \"kubernetes.io/projected/f8f39adb-80b4-4846-a5ee-c5effb63e096-kube-api-access-dklgb\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cs4wx\" (UID: \"f8f39adb-80b4-4846-a5ee-c5effb63e096\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx"
Apr 16 15:22:27.352510 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.352481 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8phj\" (UniqueName: \"kubernetes.io/projected/d3e39f88-26ca-46bd-85e2-ab2cfe69e539-kube-api-access-g8phj\") pod \"limitador-operator-controller-manager-85c4996f8c-jmk6z\" (UID: \"d3e39f88-26ca-46bd-85e2-ab2cfe69e539\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-jmk6z"
Apr 16 15:22:27.445624 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.445601 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndtdt\" (UniqueName: \"kubernetes.io/projected/0dad1ffb-fd48-4864-a018-ae81473d92ca-kube-api-access-ndtdt\") pod \"0dad1ffb-fd48-4864-a018-ae81473d92ca\" (UID: \"0dad1ffb-fd48-4864-a018-ae81473d92ca\") "
Apr 16 15:22:27.445720 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.445673 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4czc\" (UniqueName: \"kubernetes.io/projected/23b7fddf-7292-4eff-b971-0f425dc689d0-kube-api-access-t4czc\") pod \"23b7fddf-7292-4eff-b971-0f425dc689d0\" (UID: \"23b7fddf-7292-4eff-b971-0f425dc689d0\") "
Apr 16 15:22:27.445766 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.445745 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/23b7fddf-7292-4eff-b971-0f425dc689d0-extensions-socket-volume\") pod \"23b7fddf-7292-4eff-b971-0f425dc689d0\" (UID: \"23b7fddf-7292-4eff-b971-0f425dc689d0\") "
Apr 16 15:22:27.446197 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.446173 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23b7fddf-7292-4eff-b971-0f425dc689d0-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "23b7fddf-7292-4eff-b971-0f425dc689d0" (UID: "23b7fddf-7292-4eff-b971-0f425dc689d0"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:22:27.447655 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.447635 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dad1ffb-fd48-4864-a018-ae81473d92ca-kube-api-access-ndtdt" (OuterVolumeSpecName: "kube-api-access-ndtdt") pod "0dad1ffb-fd48-4864-a018-ae81473d92ca" (UID: "0dad1ffb-fd48-4864-a018-ae81473d92ca"). InnerVolumeSpecName "kube-api-access-ndtdt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:22:27.447710 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.447650 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b7fddf-7292-4eff-b971-0f425dc689d0-kube-api-access-t4czc" (OuterVolumeSpecName: "kube-api-access-t4czc") pod "23b7fddf-7292-4eff-b971-0f425dc689d0" (UID: "23b7fddf-7292-4eff-b971-0f425dc689d0"). InnerVolumeSpecName "kube-api-access-t4czc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:22:27.534960 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.534923 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx"
Apr 16 15:22:27.543716 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.543695 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-jmk6z"
Apr 16 15:22:27.546837 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.546808 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t4czc\" (UniqueName: \"kubernetes.io/projected/23b7fddf-7292-4eff-b971-0f425dc689d0-kube-api-access-t4czc\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\""
Apr 16 15:22:27.546917 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.546849 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/23b7fddf-7292-4eff-b971-0f425dc689d0-extensions-socket-volume\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\""
Apr 16 15:22:27.546917 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.546871 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ndtdt\" (UniqueName: \"kubernetes.io/projected/0dad1ffb-fd48-4864-a018-ae81473d92ca-kube-api-access-ndtdt\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\""
Apr 16 15:22:27.664650 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.664597 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx"]
Apr 16 15:22:27.668214 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:22:27.668191 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8f39adb_80b4_4846_a5ee_c5effb63e096.slice/crio-f478c137805bd719453276af656e868f5130e3d59569aceafcc7927950b6715c WatchSource:0}: Error finding container f478c137805bd719453276af656e868f5130e3d59569aceafcc7927950b6715c: Status 404 returned error can't find the container with id f478c137805bd719453276af656e868f5130e3d59569aceafcc7927950b6715c
Apr 16 15:22:27.695733 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.695701 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-jmk6z"]
Apr 16 15:22:27.698344 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:22:27.698322 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3e39f88_26ca_46bd_85e2_ab2cfe69e539.slice/crio-767e1fa4d98c3525a25510fd98b24e90c2b54b467a02ac640968522c943c9d45 WatchSource:0}: Error finding container 767e1fa4d98c3525a25510fd98b24e90c2b54b467a02ac640968522c943c9d45: Status 404 returned error can't find the container with id 767e1fa4d98c3525a25510fd98b24e90c2b54b467a02ac640968522c943c9d45
Apr 16 15:22:27.720693 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.720670 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk"]
Apr 16 15:22:27.734534 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.734511 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk"
Apr 16 15:22:27.735045 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.735012 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk"]
Apr 16 15:22:27.737448 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.737297 2575 status_manager.go:895] "Failed to get status for pod" podUID="23b7fddf-7292-4eff-b971-0f425dc689d0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-9rll7\" is forbidden: User \"system:node:ip-10-0-129-254.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-254.ec2.internal' and this object"
Apr 16 15:22:27.741007 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.740975 2575 status_manager.go:895] "Failed to get status for pod" podUID="0dad1ffb-fd48-4864-a018-ae81473d92ca" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr" err="pods \"limitador-operator-controller-manager-85c4996f8c-s77cr\" is forbidden: User \"system:node:ip-10-0-129-254.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-254.ec2.internal' and this object"
Apr 16 15:22:27.850308 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.850217 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bfe3846a-6b49-41e7-93e5-7dd464eca9c3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk\" (UID: \"bfe3846a-6b49-41e7-93e5-7dd464eca9c3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk"
Apr 16 15:22:27.850480 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.850320 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgknq\" (UniqueName: \"kubernetes.io/projected/bfe3846a-6b49-41e7-93e5-7dd464eca9c3-kube-api-access-hgknq\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk\" (UID: \"bfe3846a-6b49-41e7-93e5-7dd464eca9c3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk"
Apr 16 15:22:27.951753 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.951724 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgknq\" (UniqueName: \"kubernetes.io/projected/bfe3846a-6b49-41e7-93e5-7dd464eca9c3-kube-api-access-hgknq\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk\" (UID: \"bfe3846a-6b49-41e7-93e5-7dd464eca9c3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk"
Apr 16 15:22:27.951896 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.951800 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bfe3846a-6b49-41e7-93e5-7dd464eca9c3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk\" (UID: \"bfe3846a-6b49-41e7-93e5-7dd464eca9c3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk"
Apr 16 15:22:27.952098 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.952080 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bfe3846a-6b49-41e7-93e5-7dd464eca9c3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk\" (UID: \"bfe3846a-6b49-41e7-93e5-7dd464eca9c3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk"
Apr 16 15:22:27.962265 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:27.962235 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgknq\" (UniqueName: \"kubernetes.io/projected/bfe3846a-6b49-41e7-93e5-7dd464eca9c3-kube-api-access-hgknq\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk\" (UID: \"bfe3846a-6b49-41e7-93e5-7dd464eca9c3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk"
Apr 16 15:22:28.051050 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.051019 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk"
Apr 16 15:22:28.169103 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.169069 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-jmk6z" event={"ID":"d3e39f88-26ca-46bd-85e2-ab2cfe69e539","Type":"ContainerStarted","Data":"8820417f88a9f630d7e2280cd63b5db1b43edbf2a65d4f6c2558630db8451912"}
Apr 16 15:22:28.169103 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.169107 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-jmk6z" event={"ID":"d3e39f88-26ca-46bd-85e2-ab2cfe69e539","Type":"ContainerStarted","Data":"767e1fa4d98c3525a25510fd98b24e90c2b54b467a02ac640968522c943c9d45"}
Apr 16 15:22:28.169597 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.169185 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-jmk6z"
Apr 16 15:22:28.170206 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.170187 2575 generic.go:358] "Generic (PLEG): container finished" podID="23b7fddf-7292-4eff-b971-0f425dc689d0" containerID="5d977e2edd68c5c6d00b972b626736fd446d9b117532a9294afa7d7d3b5d79d1" exitCode=0
Apr 16 15:22:28.170318 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.170228 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7"
Apr 16 15:22:28.170318 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.170248 2575 scope.go:117] "RemoveContainer" containerID="5d977e2edd68c5c6d00b972b626736fd446d9b117532a9294afa7d7d3b5d79d1"
Apr 16 15:22:28.171486 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.171459 2575 status_manager.go:895] "Failed to get status for pod" podUID="23b7fddf-7292-4eff-b971-0f425dc689d0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-9rll7\" is forbidden: User \"system:node:ip-10-0-129-254.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-254.ec2.internal' and this object"
Apr 16 15:22:28.171902 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.171870 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx" event={"ID":"f8f39adb-80b4-4846-a5ee-c5effb63e096","Type":"ContainerStarted","Data":"cb0571dc43781e117a43a0568e8faa9b0c5fe52afc9eae746c622c213f9e056f"}
Apr 16 15:22:28.171902 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.171895 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx" event={"ID":"f8f39adb-80b4-4846-a5ee-c5effb63e096","Type":"ContainerStarted","Data":"f478c137805bd719453276af656e868f5130e3d59569aceafcc7927950b6715c"}
Apr 16 15:22:28.172031 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.171976 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx"
Apr 16 15:22:28.173107 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.173085 2575 generic.go:358] "Generic (PLEG): container finished" podID="0dad1ffb-fd48-4864-a018-ae81473d92ca" containerID="771fcee1267c17dbc64cbbda5e21cdf17a0651f87a0c8ae02f4d592ddefa83fc" exitCode=0
Apr 16 15:22:28.173211 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.173114 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr"
Apr 16 15:22:28.173484 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.173462 2575 status_manager.go:895] "Failed to get status for pod" podUID="0dad1ffb-fd48-4864-a018-ae81473d92ca" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr" err="pods \"limitador-operator-controller-manager-85c4996f8c-s77cr\" is forbidden: User \"system:node:ip-10-0-129-254.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-254.ec2.internal' and this object"
Apr 16 15:22:28.179962 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.179764 2575 scope.go:117] "RemoveContainer" containerID="5d977e2edd68c5c6d00b972b626736fd446d9b117532a9294afa7d7d3b5d79d1"
Apr 16 15:22:28.181176 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:22:28.180330 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d977e2edd68c5c6d00b972b626736fd446d9b117532a9294afa7d7d3b5d79d1\": container with ID starting with 5d977e2edd68c5c6d00b972b626736fd446d9b117532a9294afa7d7d3b5d79d1 not found: ID does not exist" containerID="5d977e2edd68c5c6d00b972b626736fd446d9b117532a9294afa7d7d3b5d79d1"
Apr 16 15:22:28.181176 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.180353 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d977e2edd68c5c6d00b972b626736fd446d9b117532a9294afa7d7d3b5d79d1"} err="failed to get container status \"5d977e2edd68c5c6d00b972b626736fd446d9b117532a9294afa7d7d3b5d79d1\": rpc error: code = NotFound desc = could not find container \"5d977e2edd68c5c6d00b972b626736fd446d9b117532a9294afa7d7d3b5d79d1\": container with ID starting with 5d977e2edd68c5c6d00b972b626736fd446d9b117532a9294afa7d7d3b5d79d1 not found: ID does not exist"
Apr 16 15:22:28.181176 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.180369 2575 scope.go:117] "RemoveContainer" containerID="771fcee1267c17dbc64cbbda5e21cdf17a0651f87a0c8ae02f4d592ddefa83fc"
Apr 16 15:22:28.181826 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.181808 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk"]
Apr 16 15:22:28.182688 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:22:28.182664 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfe3846a_6b49_41e7_93e5_7dd464eca9c3.slice/crio-bbf9764a6cc09bcd5feba10f8eb25f727b93867a4b2bfc40461ae8406905eb04 WatchSource:0}: Error finding container bbf9764a6cc09bcd5feba10f8eb25f727b93867a4b2bfc40461ae8406905eb04: Status 404 returned error can't find the container with id bbf9764a6cc09bcd5feba10f8eb25f727b93867a4b2bfc40461ae8406905eb04
Apr 16 15:22:28.196448 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.195457 2575 status_manager.go:895] "Failed to get status for pod" podUID="0dad1ffb-fd48-4864-a018-ae81473d92ca" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s77cr" err="pods \"limitador-operator-controller-manager-85c4996f8c-s77cr\" is forbidden: User \"system:node:ip-10-0-129-254.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-254.ec2.internal' and this object"
Apr 16 15:22:28.196448 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.196243 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-jmk6z" podStartSLOduration=1.196228009 podStartE2EDuration="1.196228009s" podCreationTimestamp="2026-04-16 15:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:22:28.19297788 +0000 UTC m=+641.632518440" watchObservedRunningTime="2026-04-16 15:22:28.196228009 +0000 UTC m=+641.635768571"
Apr 16 15:22:28.201914 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.201897 2575 scope.go:117] "RemoveContainer" containerID="771fcee1267c17dbc64cbbda5e21cdf17a0651f87a0c8ae02f4d592ddefa83fc"
Apr 16 15:22:28.202197 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:22:28.202179 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"771fcee1267c17dbc64cbbda5e21cdf17a0651f87a0c8ae02f4d592ddefa83fc\": container with ID starting with 771fcee1267c17dbc64cbbda5e21cdf17a0651f87a0c8ae02f4d592ddefa83fc not found: ID does not exist" containerID="771fcee1267c17dbc64cbbda5e21cdf17a0651f87a0c8ae02f4d592ddefa83fc"
Apr 16 15:22:28.202257 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.202209 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"771fcee1267c17dbc64cbbda5e21cdf17a0651f87a0c8ae02f4d592ddefa83fc"} err="failed to get container status \"771fcee1267c17dbc64cbbda5e21cdf17a0651f87a0c8ae02f4d592ddefa83fc\": rpc error: code = NotFound desc = could not find container \"771fcee1267c17dbc64cbbda5e21cdf17a0651f87a0c8ae02f4d592ddefa83fc\": container with ID starting with 771fcee1267c17dbc64cbbda5e21cdf17a0651f87a0c8ae02f4d592ddefa83fc not found: ID does not exist"
Apr 16 15:22:28.218652 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.218615 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx" podStartSLOduration=1.218601857 podStartE2EDuration="1.218601857s" podCreationTimestamp="2026-04-16 15:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:22:28.217984297 +0000 UTC m=+641.657524851" watchObservedRunningTime="2026-04-16 15:22:28.218601857 +0000 UTC m=+641.658142420"
Apr 16 15:22:28.219954 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:28.219930 2575 status_manager.go:895] "Failed to get status for pod" podUID="23b7fddf-7292-4eff-b971-0f425dc689d0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9rll7" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-9rll7\" is forbidden: User \"system:node:ip-10-0-129-254.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-254.ec2.internal' and this object"
Apr 16 15:22:29.152201 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:29.152166 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dad1ffb-fd48-4864-a018-ae81473d92ca" path="/var/lib/kubelet/pods/0dad1ffb-fd48-4864-a018-ae81473d92ca/volumes"
Apr 16 15:22:29.152558 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:29.152544 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b7fddf-7292-4eff-b971-0f425dc689d0" path="/var/lib/kubelet/pods/23b7fddf-7292-4eff-b971-0f425dc689d0/volumes"
Apr 16 15:22:29.177949 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:29.177910 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk" event={"ID":"bfe3846a-6b49-41e7-93e5-7dd464eca9c3","Type":"ContainerStarted","Data":"85e406b39ac135806fe472957e6c120a9f498876a45b1e7100db0f78c0c48e5c"}
Apr 16 15:22:29.178317 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:29.177952 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk" event={"ID":"bfe3846a-6b49-41e7-93e5-7dd464eca9c3","Type":"ContainerStarted","Data":"bbf9764a6cc09bcd5feba10f8eb25f727b93867a4b2bfc40461ae8406905eb04"}
Apr 16 15:22:29.178317 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:29.178152 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk"
Apr 16 15:22:29.201787 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:29.201744 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk" podStartSLOduration=2.201731761 podStartE2EDuration="2.201731761s" podCreationTimestamp="2026-04-16 15:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:22:29.198915406 +0000 UTC m=+642.638455957" watchObservedRunningTime="2026-04-16 15:22:29.201731761 +0000 UTC m=+642.641272321"
Apr 16 15:22:39.181235 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:39.181201 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-jmk6z"
Apr 16 15:22:39.181661 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:39.181261 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx"
Apr 16 15:22:40.184364 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:40.184332 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk"
Apr 16 15:22:40.243133 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:40.243099 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx"]
Apr 16 15:22:40.243327 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:40.243305 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx" podUID="f8f39adb-80b4-4846-a5ee-c5effb63e096" containerName="manager" containerID="cri-o://cb0571dc43781e117a43a0568e8faa9b0c5fe52afc9eae746c622c213f9e056f" gracePeriod=10
Apr 16 15:22:40.484469 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:40.484448 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx"
Apr 16 15:22:40.654947 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:40.654915 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f8f39adb-80b4-4846-a5ee-c5effb63e096-extensions-socket-volume\") pod \"f8f39adb-80b4-4846-a5ee-c5effb63e096\" (UID: \"f8f39adb-80b4-4846-a5ee-c5effb63e096\") "
Apr 16 15:22:40.654947 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:40.654952 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dklgb\" (UniqueName: \"kubernetes.io/projected/f8f39adb-80b4-4846-a5ee-c5effb63e096-kube-api-access-dklgb\") pod \"f8f39adb-80b4-4846-a5ee-c5effb63e096\" (UID: \"f8f39adb-80b4-4846-a5ee-c5effb63e096\") "
Apr 16 15:22:40.655370 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:40.655349 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8f39adb-80b4-4846-a5ee-c5effb63e096-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "f8f39adb-80b4-4846-a5ee-c5effb63e096" (UID: "f8f39adb-80b4-4846-a5ee-c5effb63e096"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:22:40.656948 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:40.656926 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8f39adb-80b4-4846-a5ee-c5effb63e096-kube-api-access-dklgb" (OuterVolumeSpecName: "kube-api-access-dklgb") pod "f8f39adb-80b4-4846-a5ee-c5effb63e096" (UID: "f8f39adb-80b4-4846-a5ee-c5effb63e096"). InnerVolumeSpecName "kube-api-access-dklgb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:22:40.756253 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:40.756197 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f8f39adb-80b4-4846-a5ee-c5effb63e096-extensions-socket-volume\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\""
Apr 16 15:22:40.756253 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:40.756220 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dklgb\" (UniqueName: \"kubernetes.io/projected/f8f39adb-80b4-4846-a5ee-c5effb63e096-kube-api-access-dklgb\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\""
Apr 16 15:22:41.215804 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:41.215768 2575 generic.go:358] "Generic (PLEG): container finished" podID="f8f39adb-80b4-4846-a5ee-c5effb63e096" containerID="cb0571dc43781e117a43a0568e8faa9b0c5fe52afc9eae746c622c213f9e056f" exitCode=0
Apr 16 15:22:41.216215 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:41.215842 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx"
Apr 16 15:22:41.216215 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:41.215852 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx" event={"ID":"f8f39adb-80b4-4846-a5ee-c5effb63e096","Type":"ContainerDied","Data":"cb0571dc43781e117a43a0568e8faa9b0c5fe52afc9eae746c622c213f9e056f"}
Apr 16 15:22:41.216215 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:41.215897 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx" event={"ID":"f8f39adb-80b4-4846-a5ee-c5effb63e096","Type":"ContainerDied","Data":"f478c137805bd719453276af656e868f5130e3d59569aceafcc7927950b6715c"}
Apr 16 15:22:41.216215 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:41.215923 2575 scope.go:117] "RemoveContainer" containerID="cb0571dc43781e117a43a0568e8faa9b0c5fe52afc9eae746c622c213f9e056f"
Apr 16 15:22:41.224028 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:41.224016 2575 scope.go:117] "RemoveContainer" containerID="cb0571dc43781e117a43a0568e8faa9b0c5fe52afc9eae746c622c213f9e056f"
Apr 16 15:22:41.224293 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:22:41.224276 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb0571dc43781e117a43a0568e8faa9b0c5fe52afc9eae746c622c213f9e056f\": container with ID starting with cb0571dc43781e117a43a0568e8faa9b0c5fe52afc9eae746c622c213f9e056f not found: ID does not exist" containerID="cb0571dc43781e117a43a0568e8faa9b0c5fe52afc9eae746c622c213f9e056f"
Apr 16 15:22:41.224354 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:41.224304 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0571dc43781e117a43a0568e8faa9b0c5fe52afc9eae746c622c213f9e056f"} err="failed to get container status \"cb0571dc43781e117a43a0568e8faa9b0c5fe52afc9eae746c622c213f9e056f\": rpc error: code = NotFound desc = could not find container \"cb0571dc43781e117a43a0568e8faa9b0c5fe52afc9eae746c622c213f9e056f\": container with ID starting with cb0571dc43781e117a43a0568e8faa9b0c5fe52afc9eae746c622c213f9e056f not found: ID does not exist"
Apr 16 15:22:41.234963 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:41.234942 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx"]
Apr 16 15:22:41.243662 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:41.243642 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cs4wx"]
Apr 16 15:22:43.151950 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:43.151917 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8f39adb-80b4-4846-a5ee-c5effb63e096" path="/var/lib/kubelet/pods/f8f39adb-80b4-4846-a5ee-c5effb63e096/volumes"
Apr 16 15:22:56.557719 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.557687 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"]
Apr 16 15:22:56.558093 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.558018 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8f39adb-80b4-4846-a5ee-c5effb63e096" containerName="manager"
Apr 16 15:22:56.558093 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.558028 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f39adb-80b4-4846-a5ee-c5effb63e096" containerName="manager"
Apr 16 15:22:56.558093 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.558092 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8f39adb-80b4-4846-a5ee-c5effb63e096" containerName="manager"
Apr 16 15:22:56.562589 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.562566 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.565487 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.565462 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-2hgs2\""
Apr 16 15:22:56.574073 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.574012 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"]
Apr 16 15:22:56.577178 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.577150 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/dfff54cd-d68a-49be-9f06-9272513ee4e3-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.577278 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.577199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/dfff54cd-d68a-49be-9f06-9272513ee4e3-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.577278 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.577227 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/dfff54cd-d68a-49be-9f06-9272513ee4e3-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.577278 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.577252 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/dfff54cd-d68a-49be-9f06-9272513ee4e3-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.577442 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.577322 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnjzp\" (UniqueName: \"kubernetes.io/projected/dfff54cd-d68a-49be-9f06-9272513ee4e3-kube-api-access-fnjzp\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.577442 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.577399 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/dfff54cd-d68a-49be-9f06-9272513ee4e3-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.577523 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.577473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/dfff54cd-d68a-49be-9f06-9272513ee4e3-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.577523 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.577497 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/dfff54cd-d68a-49be-9f06-9272513ee4e3-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.577619 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.577527 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/dfff54cd-d68a-49be-9f06-9272513ee4e3-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.678339 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.678304 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/dfff54cd-d68a-49be-9f06-9272513ee4e3-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.678516 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.678388 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/dfff54cd-d68a-49be-9f06-9272513ee4e3-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.678516 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.678442 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/dfff54cd-d68a-49be-9f06-9272513ee4e3-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.678516 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.678471 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/dfff54cd-d68a-49be-9f06-9272513ee4e3-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.678516 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.678496 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/dfff54cd-d68a-49be-9f06-9272513ee4e3-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.678750 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.678524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnjzp\" (UniqueName: \"kubernetes.io/projected/dfff54cd-d68a-49be-9f06-9272513ee4e3-kube-api-access-fnjzp\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.678750 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.678567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/dfff54cd-d68a-49be-9f06-9272513ee4e3-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.678750 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.678630 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/dfff54cd-d68a-49be-9f06-9272513ee4e3-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.678750 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.678672 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/dfff54cd-d68a-49be-9f06-9272513ee4e3-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.679209 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.679182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/dfff54cd-d68a-49be-9f06-9272513ee4e3-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.679343 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.679250 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/dfff54cd-d68a-49be-9f06-9272513ee4e3-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.679343 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.679271 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/dfff54cd-d68a-49be-9f06-9272513ee4e3-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.679343 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.679337 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/dfff54cd-d68a-49be-9f06-9272513ee4e3-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.679534 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.679450 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/dfff54cd-d68a-49be-9f06-9272513ee4e3-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.680718 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.680692 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/dfff54cd-d68a-49be-9f06-9272513ee4e3-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.681007 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.680990 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/dfff54cd-d68a-49be-9f06-9272513ee4e3-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.686589 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.686567 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnjzp\" (UniqueName: \"kubernetes.io/projected/dfff54cd-d68a-49be-9f06-9272513ee4e3-kube-api-access-fnjzp\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.686682 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.686573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/dfff54cd-d68a-49be-9f06-9272513ee4e3-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-2mhs4\" (UID: \"dfff54cd-d68a-49be-9f06-9272513ee4e3\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.876255 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.876176 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:56.999282 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:56.999257 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"]
Apr 16 15:22:57.001953 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:22:57.001925 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfff54cd_d68a_49be_9f06_9272513ee4e3.slice/crio-8b4a0e252e3c0197b961635c7851f2b08acff52de53a43271381bda1ace9bf04 WatchSource:0}: Error finding container 8b4a0e252e3c0197b961635c7851f2b08acff52de53a43271381bda1ace9bf04: Status 404 returned error can't find the container with id 8b4a0e252e3c0197b961635c7851f2b08acff52de53a43271381bda1ace9bf04
Apr 16 15:22:57.004092 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:57.004053 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 15:22:57.004162 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:57.004113 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 15:22:57.004162 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:57.004139 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 15:22:57.270311 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:57.270278 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4" event={"ID":"dfff54cd-d68a-49be-9f06-9272513ee4e3","Type":"ContainerStarted","Data":"12ef4649ce26aff4bae7a209a7d9cdc7852f7d7ba8aec8c6b01e96c579f15cd0"}
Apr 16 15:22:57.270311 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:57.270312 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4" event={"ID":"dfff54cd-d68a-49be-9f06-9272513ee4e3","Type":"ContainerStarted","Data":"8b4a0e252e3c0197b961635c7851f2b08acff52de53a43271381bda1ace9bf04"}
Apr 16 15:22:57.292411 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:57.292228 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4" podStartSLOduration=1.2922100460000001 podStartE2EDuration="1.292210046s" podCreationTimestamp="2026-04-16 15:22:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:22:57.290752735 +0000 UTC m=+670.730293297" watchObservedRunningTime="2026-04-16 15:22:57.292210046 +0000 UTC m=+670.731750607"
Apr 16 15:22:57.876506 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:57.876470 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:58.881093 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:58.881065 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:59.276138 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:59.276103 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:22:59.277058 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:22:59.277038 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2mhs4"
Apr 16 15:23:02.378016 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:02.377984 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jd42w"]
Apr 16 15:23:02.382254 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:02.382226 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jd42w"
Apr 16 15:23:02.385560 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:02.385538 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-fssxm\""
Apr 16 15:23:02.391532 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:02.391508 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jd42w"]
Apr 16 15:23:02.427209 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:02.427155 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ph4d\" (UniqueName: \"kubernetes.io/projected/97942371-1155-4500-9bd9-113746baa91b-kube-api-access-4ph4d\") pod \"authorino-f99f4b5cd-jd42w\" (UID: \"97942371-1155-4500-9bd9-113746baa91b\") " pod="kuadrant-system/authorino-f99f4b5cd-jd42w"
Apr 16 15:23:02.528311 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:02.528278 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ph4d\" (UniqueName: \"kubernetes.io/projected/97942371-1155-4500-9bd9-113746baa91b-kube-api-access-4ph4d\") pod \"authorino-f99f4b5cd-jd42w\" (UID: \"97942371-1155-4500-9bd9-113746baa91b\") " pod="kuadrant-system/authorino-f99f4b5cd-jd42w"
Apr 16 15:23:02.536630 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:02.536605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ph4d\" (UniqueName: \"kubernetes.io/projected/97942371-1155-4500-9bd9-113746baa91b-kube-api-access-4ph4d\") pod \"authorino-f99f4b5cd-jd42w\" (UID: \"97942371-1155-4500-9bd9-113746baa91b\") " pod="kuadrant-system/authorino-f99f4b5cd-jd42w"
Apr 16 15:23:02.694882 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:02.694856 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jd42w"
Apr 16 15:23:02.815938 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:02.815874 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jd42w"]
Apr 16 15:23:02.818400 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:23:02.818371 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97942371_1155_4500_9bd9_113746baa91b.slice/crio-1c120383225dde895bee305c30480d557da1d0569054449ae4ab3184d7899395 WatchSource:0}: Error finding container 1c120383225dde895bee305c30480d557da1d0569054449ae4ab3184d7899395: Status 404 returned error can't find the container with id 1c120383225dde895bee305c30480d557da1d0569054449ae4ab3184d7899395
Apr 16 15:23:03.289556 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:03.289522 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jd42w" event={"ID":"97942371-1155-4500-9bd9-113746baa91b","Type":"ContainerStarted","Data":"1c120383225dde895bee305c30480d557da1d0569054449ae4ab3184d7899395"}
Apr 16 15:23:06.307410 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:06.307375 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jd42w" event={"ID":"97942371-1155-4500-9bd9-113746baa91b","Type":"ContainerStarted","Data":"b1f8817ddc1114893cc2834c4d22bde2128c0f32d6623c9d699475ab98558aac"}
Apr 16 15:23:06.323218 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:06.323171 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-jd42w" podStartSLOduration=1.387963197 podStartE2EDuration="4.323157897s" podCreationTimestamp="2026-04-16 15:23:02 +0000 UTC" firstStartedPulling="2026-04-16 15:23:02.819668618 +0000 UTC m=+676.259209157" lastFinishedPulling="2026-04-16 15:23:05.75486332 +0000 UTC m=+679.194403857" observedRunningTime="2026-04-16 15:23:06.32238187 +0000 UTC m=+679.761922463" watchObservedRunningTime="2026-04-16 15:23:06.323157897 +0000 UTC m=+679.762698458"
Apr 16 15:23:06.399648 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:06.399614 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jd42w"]
Apr 16 15:23:08.314006 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:08.313970 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-jd42w" podUID="97942371-1155-4500-9bd9-113746baa91b" containerName="authorino" containerID="cri-o://b1f8817ddc1114893cc2834c4d22bde2128c0f32d6623c9d699475ab98558aac" gracePeriod=30
Apr 16 15:23:08.549119 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:08.549097 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jd42w"
Apr 16 15:23:08.582474 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:08.582395 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ph4d\" (UniqueName: \"kubernetes.io/projected/97942371-1155-4500-9bd9-113746baa91b-kube-api-access-4ph4d\") pod \"97942371-1155-4500-9bd9-113746baa91b\" (UID: \"97942371-1155-4500-9bd9-113746baa91b\") "
Apr 16 15:23:08.584542 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:08.584512 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97942371-1155-4500-9bd9-113746baa91b-kube-api-access-4ph4d" (OuterVolumeSpecName: "kube-api-access-4ph4d") pod "97942371-1155-4500-9bd9-113746baa91b" (UID: "97942371-1155-4500-9bd9-113746baa91b"). InnerVolumeSpecName "kube-api-access-4ph4d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:23:08.682985 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:08.682955 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4ph4d\" (UniqueName: \"kubernetes.io/projected/97942371-1155-4500-9bd9-113746baa91b-kube-api-access-4ph4d\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\""
Apr 16 15:23:09.319007 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:09.318971 2575 generic.go:358] "Generic (PLEG): container finished" podID="97942371-1155-4500-9bd9-113746baa91b" containerID="b1f8817ddc1114893cc2834c4d22bde2128c0f32d6623c9d699475ab98558aac" exitCode=0
Apr 16 15:23:09.319481 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:09.319031 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jd42w"
Apr 16 15:23:09.319481 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:09.319059 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jd42w" event={"ID":"97942371-1155-4500-9bd9-113746baa91b","Type":"ContainerDied","Data":"b1f8817ddc1114893cc2834c4d22bde2128c0f32d6623c9d699475ab98558aac"}
Apr 16 15:23:09.319481 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:09.319099 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jd42w" event={"ID":"97942371-1155-4500-9bd9-113746baa91b","Type":"ContainerDied","Data":"1c120383225dde895bee305c30480d557da1d0569054449ae4ab3184d7899395"}
Apr 16 15:23:09.319481 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:09.319115 2575 scope.go:117] "RemoveContainer" containerID="b1f8817ddc1114893cc2834c4d22bde2128c0f32d6623c9d699475ab98558aac"
Apr 16 15:23:09.327138 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:09.327120 2575 scope.go:117] "RemoveContainer" containerID="b1f8817ddc1114893cc2834c4d22bde2128c0f32d6623c9d699475ab98558aac"
Apr 16 15:23:09.327377 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:23:09.327361 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1f8817ddc1114893cc2834c4d22bde2128c0f32d6623c9d699475ab98558aac\": container with ID starting with b1f8817ddc1114893cc2834c4d22bde2128c0f32d6623c9d699475ab98558aac not found: ID does not exist" containerID="b1f8817ddc1114893cc2834c4d22bde2128c0f32d6623c9d699475ab98558aac"
Apr 16 15:23:09.327436 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:09.327385 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1f8817ddc1114893cc2834c4d22bde2128c0f32d6623c9d699475ab98558aac"} err="failed to get container status \"b1f8817ddc1114893cc2834c4d22bde2128c0f32d6623c9d699475ab98558aac\": rpc error: code = NotFound desc = could not find container \"b1f8817ddc1114893cc2834c4d22bde2128c0f32d6623c9d699475ab98558aac\": container with ID starting with b1f8817ddc1114893cc2834c4d22bde2128c0f32d6623c9d699475ab98558aac not found: ID does not exist"
Apr 16 15:23:09.335301 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:09.335281 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jd42w"]
Apr 16 15:23:09.339198 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:09.339179 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jd42w"]
Apr 16 15:23:11.151674 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:11.151643 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97942371-1155-4500-9bd9-113746baa91b" path="/var/lib/kubelet/pods/97942371-1155-4500-9bd9-113746baa91b/volumes"
Apr 16 15:23:31.290972 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.290940 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7d68b85d6b-65d9z"]
Apr 16 15:23:31.291397 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.291276 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97942371-1155-4500-9bd9-113746baa91b" containerName="authorino"
Apr 16 15:23:31.291397 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.291288 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="97942371-1155-4500-9bd9-113746baa91b" containerName="authorino"
Apr 16 15:23:31.291397 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.291342 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="97942371-1155-4500-9bd9-113746baa91b" containerName="authorino"
Apr 16 15:23:31.298491 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.298470 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7d68b85d6b-65d9z"
Apr 16 15:23:31.301179 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.301154 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 16 15:23:31.301355 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.301339 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 16 15:23:31.301461 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.301402 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-bdctg\""
Apr 16 15:23:31.308446 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.308404 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7d68b85d6b-65d9z"]
Apr 16 15:23:31.314233 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.314212 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-bff744c7f-zf8qp"]
Apr 16 15:23:31.317507 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.317491 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-bff744c7f-zf8qp"
Apr 16 15:23:31.319971 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.319953 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-zzh56\""
Apr 16 15:23:31.332975 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.332953 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-bff744c7f-zf8qp"]
Apr 16 15:23:31.369806 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.369787 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqkzd\" (UniqueName: \"kubernetes.io/projected/8729cace-3c0c-491b-b62c-b38b7df4cb44-kube-api-access-gqkzd\") pod \"maas-api-7d68b85d6b-65d9z\" (UID: \"8729cace-3c0c-491b-b62c-b38b7df4cb44\") " pod="opendatahub/maas-api-7d68b85d6b-65d9z"
Apr 16 15:23:31.369904 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.369822 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8729cace-3c0c-491b-b62c-b38b7df4cb44-maas-api-tls\") pod \"maas-api-7d68b85d6b-65d9z\" (UID: \"8729cace-3c0c-491b-b62c-b38b7df4cb44\") " pod="opendatahub/maas-api-7d68b85d6b-65d9z"
Apr 16 15:23:31.471068 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.471038 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g8gf\" (UniqueName: \"kubernetes.io/projected/a6d2fb44-6bc4-4634-be46-b7ffa11b840e-kube-api-access-5g8gf\") pod \"maas-controller-bff744c7f-zf8qp\" (UID: \"a6d2fb44-6bc4-4634-be46-b7ffa11b840e\") " pod="opendatahub/maas-controller-bff744c7f-zf8qp"
Apr 16 15:23:31.471234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.471120 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqkzd\" (UniqueName: \"kubernetes.io/projected/8729cace-3c0c-491b-b62c-b38b7df4cb44-kube-api-access-gqkzd\") pod \"maas-api-7d68b85d6b-65d9z\" (UID: \"8729cace-3c0c-491b-b62c-b38b7df4cb44\") " pod="opendatahub/maas-api-7d68b85d6b-65d9z"
Apr 16 15:23:31.471234 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.471166 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8729cace-3c0c-491b-b62c-b38b7df4cb44-maas-api-tls\") pod \"maas-api-7d68b85d6b-65d9z\" (UID: \"8729cace-3c0c-491b-b62c-b38b7df4cb44\") " pod="opendatahub/maas-api-7d68b85d6b-65d9z"
Apr 16 15:23:31.473662 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.473636 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8729cace-3c0c-491b-b62c-b38b7df4cb44-maas-api-tls\") pod \"maas-api-7d68b85d6b-65d9z\" (UID: \"8729cace-3c0c-491b-b62c-b38b7df4cb44\") " pod="opendatahub/maas-api-7d68b85d6b-65d9z"
Apr 16 15:23:31.481236 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.481215 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqkzd\" (UniqueName: \"kubernetes.io/projected/8729cace-3c0c-491b-b62c-b38b7df4cb44-kube-api-access-gqkzd\") pod \"maas-api-7d68b85d6b-65d9z\" (UID: \"8729cace-3c0c-491b-b62c-b38b7df4cb44\") " pod="opendatahub/maas-api-7d68b85d6b-65d9z"
Apr 16 15:23:31.572255 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.572178 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5g8gf\" (UniqueName: \"kubernetes.io/projected/a6d2fb44-6bc4-4634-be46-b7ffa11b840e-kube-api-access-5g8gf\") pod \"maas-controller-bff744c7f-zf8qp\" (UID: \"a6d2fb44-6bc4-4634-be46-b7ffa11b840e\") " pod="opendatahub/maas-controller-bff744c7f-zf8qp"
Apr 16 15:23:31.583516 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.583485 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g8gf\" (UniqueName: \"kubernetes.io/projected/a6d2fb44-6bc4-4634-be46-b7ffa11b840e-kube-api-access-5g8gf\") pod \"maas-controller-bff744c7f-zf8qp\" (UID: \"a6d2fb44-6bc4-4634-be46-b7ffa11b840e\") " pod="opendatahub/maas-controller-bff744c7f-zf8qp"
Apr 16 15:23:31.609473 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.609446 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7d68b85d6b-65d9z"
Apr 16 15:23:31.627130 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.627107 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-bff744c7f-zf8qp"
Apr 16 15:23:31.953472 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.953447 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7d68b85d6b-65d9z"]
Apr 16 15:23:31.955648 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:23:31.955621 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8729cace_3c0c_491b_b62c_b38b7df4cb44.slice/crio-ff581d2efce33cef4f3cb6394a0bac531219d52dae496a7d5b75397a19bac3a5 WatchSource:0}: Error finding container ff581d2efce33cef4f3cb6394a0bac531219d52dae496a7d5b75397a19bac3a5: Status 404 returned error can't find the container with id ff581d2efce33cef4f3cb6394a0bac531219d52dae496a7d5b75397a19bac3a5
Apr 16 15:23:31.983405 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:23:31.983380 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6d2fb44_6bc4_4634_be46_b7ffa11b840e.slice/crio-2976273b36abd9acda6e45ae1a5256f29e452b7a6415ede80eed4eb807068279 WatchSource:0}: Error finding container 2976273b36abd9acda6e45ae1a5256f29e452b7a6415ede80eed4eb807068279: Status 404 returned error can't find the container with id 2976273b36abd9acda6e45ae1a5256f29e452b7a6415ede80eed4eb807068279
Apr 16 15:23:31.987248 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:31.987226 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-bff744c7f-zf8qp"]
Apr 16 15:23:32.229675 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.229599 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-689b5cb958-t4tkl"]
Apr 16 15:23:32.234696 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.234676 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-689b5cb958-t4tkl"
Apr 16 15:23:32.241063 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.241038 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-689b5cb958-t4tkl"]
Apr 16 15:23:32.381040 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.381004 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b57qc\" (UniqueName: \"kubernetes.io/projected/9568a9a5-9a0b-4604-bb9d-27103e414127-kube-api-access-b57qc\") pod \"maas-api-689b5cb958-t4tkl\" (UID: \"9568a9a5-9a0b-4604-bb9d-27103e414127\") " pod="opendatahub/maas-api-689b5cb958-t4tkl"
Apr 16 15:23:32.381407 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.381189 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/9568a9a5-9a0b-4604-bb9d-27103e414127-maas-api-tls\") pod \"maas-api-689b5cb958-t4tkl\" (UID: \"9568a9a5-9a0b-4604-bb9d-27103e414127\") " pod="opendatahub/maas-api-689b5cb958-t4tkl"
Apr 16 15:23:32.397138 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.397095 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-bff744c7f-zf8qp" event={"ID":"a6d2fb44-6bc4-4634-be46-b7ffa11b840e","Type":"ContainerStarted","Data":"2976273b36abd9acda6e45ae1a5256f29e452b7a6415ede80eed4eb807068279"}
Apr 16 15:23:32.398346 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.398320 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7d68b85d6b-65d9z" event={"ID":"8729cace-3c0c-491b-b62c-b38b7df4cb44","Type":"ContainerStarted","Data":"ff581d2efce33cef4f3cb6394a0bac531219d52dae496a7d5b75397a19bac3a5"}
Apr 16 15:23:32.479866 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.479782 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5d5bd7956c-wzqk5"]
Apr 16 15:23:32.481836 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.481804 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b57qc\" (UniqueName: \"kubernetes.io/projected/9568a9a5-9a0b-4604-bb9d-27103e414127-kube-api-access-b57qc\") pod \"maas-api-689b5cb958-t4tkl\" (UID: \"9568a9a5-9a0b-4604-bb9d-27103e414127\") " pod="opendatahub/maas-api-689b5cb958-t4tkl"
Apr 16 15:23:32.481987 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.481904 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/9568a9a5-9a0b-4604-bb9d-27103e414127-maas-api-tls\") pod \"maas-api-689b5cb958-t4tkl\" (UID: \"9568a9a5-9a0b-4604-bb9d-27103e414127\") " pod="opendatahub/maas-api-689b5cb958-t4tkl"
Apr 16 15:23:32.484018 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.483626 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/authorino-5d5bd7956c-wzqk5" Apr 16 15:23:32.485115 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.485064 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/9568a9a5-9a0b-4604-bb9d-27103e414127-maas-api-tls\") pod \"maas-api-689b5cb958-t4tkl\" (UID: \"9568a9a5-9a0b-4604-bb9d-27103e414127\") " pod="opendatahub/maas-api-689b5cb958-t4tkl" Apr 16 15:23:32.488862 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.488839 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 15:23:32.489069 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.488839 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-fssxm\"" Apr 16 15:23:32.491909 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.491886 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5d5bd7956c-wzqk5"] Apr 16 15:23:32.496396 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.496350 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b57qc\" (UniqueName: \"kubernetes.io/projected/9568a9a5-9a0b-4604-bb9d-27103e414127-kube-api-access-b57qc\") pod \"maas-api-689b5cb958-t4tkl\" (UID: \"9568a9a5-9a0b-4604-bb9d-27103e414127\") " pod="opendatahub/maas-api-689b5cb958-t4tkl" Apr 16 15:23:32.552609 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.552568 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-689b5cb958-t4tkl" Apr 16 15:23:32.583397 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.583109 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/609e3bf9-4998-4a77-bf7d-47690b9d146b-tls-cert\") pod \"authorino-5d5bd7956c-wzqk5\" (UID: \"609e3bf9-4998-4a77-bf7d-47690b9d146b\") " pod="kuadrant-system/authorino-5d5bd7956c-wzqk5" Apr 16 15:23:32.583397 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.583247 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8z5d\" (UniqueName: \"kubernetes.io/projected/609e3bf9-4998-4a77-bf7d-47690b9d146b-kube-api-access-t8z5d\") pod \"authorino-5d5bd7956c-wzqk5\" (UID: \"609e3bf9-4998-4a77-bf7d-47690b9d146b\") " pod="kuadrant-system/authorino-5d5bd7956c-wzqk5" Apr 16 15:23:32.685227 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.684646 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/609e3bf9-4998-4a77-bf7d-47690b9d146b-tls-cert\") pod \"authorino-5d5bd7956c-wzqk5\" (UID: \"609e3bf9-4998-4a77-bf7d-47690b9d146b\") " pod="kuadrant-system/authorino-5d5bd7956c-wzqk5" Apr 16 15:23:32.685227 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.684743 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8z5d\" (UniqueName: \"kubernetes.io/projected/609e3bf9-4998-4a77-bf7d-47690b9d146b-kube-api-access-t8z5d\") pod \"authorino-5d5bd7956c-wzqk5\" (UID: \"609e3bf9-4998-4a77-bf7d-47690b9d146b\") " pod="kuadrant-system/authorino-5d5bd7956c-wzqk5" Apr 16 15:23:32.688980 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.688789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: 
\"kubernetes.io/secret/609e3bf9-4998-4a77-bf7d-47690b9d146b-tls-cert\") pod \"authorino-5d5bd7956c-wzqk5\" (UID: \"609e3bf9-4998-4a77-bf7d-47690b9d146b\") " pod="kuadrant-system/authorino-5d5bd7956c-wzqk5" Apr 16 15:23:32.696080 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.696056 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8z5d\" (UniqueName: \"kubernetes.io/projected/609e3bf9-4998-4a77-bf7d-47690b9d146b-kube-api-access-t8z5d\") pod \"authorino-5d5bd7956c-wzqk5\" (UID: \"609e3bf9-4998-4a77-bf7d-47690b9d146b\") " pod="kuadrant-system/authorino-5d5bd7956c-wzqk5" Apr 16 15:23:32.768698 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.761327 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-689b5cb958-t4tkl"] Apr 16 15:23:32.771707 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:23:32.771665 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9568a9a5_9a0b_4604_bb9d_27103e414127.slice/crio-9104f9ff8d7bc33ef7fbc5b9a66737955848df28799e7f28326e92c5d12a7a8f WatchSource:0}: Error finding container 9104f9ff8d7bc33ef7fbc5b9a66737955848df28799e7f28326e92c5d12a7a8f: Status 404 returned error can't find the container with id 9104f9ff8d7bc33ef7fbc5b9a66737955848df28799e7f28326e92c5d12a7a8f Apr 16 15:23:32.819221 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:32.819188 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5d5bd7956c-wzqk5" Apr 16 15:23:33.122439 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:33.122356 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5d5bd7956c-wzqk5"] Apr 16 15:23:33.415558 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:33.415321 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-689b5cb958-t4tkl" event={"ID":"9568a9a5-9a0b-4604-bb9d-27103e414127","Type":"ContainerStarted","Data":"9104f9ff8d7bc33ef7fbc5b9a66737955848df28799e7f28326e92c5d12a7a8f"} Apr 16 15:23:33.424746 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:33.424707 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5d5bd7956c-wzqk5" event={"ID":"609e3bf9-4998-4a77-bf7d-47690b9d146b","Type":"ContainerStarted","Data":"70de2f3178b203470038e8cb14830581f9a09886779f75e7dc856561dd950bf6"} Apr 16 15:23:34.430955 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:34.430907 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5d5bd7956c-wzqk5" event={"ID":"609e3bf9-4998-4a77-bf7d-47690b9d146b","Type":"ContainerStarted","Data":"dde8b26f36bc6bd78189a0892a979c1e00f5a8ef1c41b916e1da7fd70a83024d"} Apr 16 15:23:36.439449 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:36.439398 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-bff744c7f-zf8qp" event={"ID":"a6d2fb44-6bc4-4634-be46-b7ffa11b840e","Type":"ContainerStarted","Data":"f95aac8443c7af0b658f9190c7c3dc987b189149dbb766fbd7bfd19aabb4ff15"} Apr 16 15:23:36.439913 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:36.439511 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-bff744c7f-zf8qp" Apr 16 15:23:36.440934 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:36.440910 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7d68b85d6b-65d9z" 
event={"ID":"8729cace-3c0c-491b-b62c-b38b7df4cb44","Type":"ContainerStarted","Data":"19d6cb7dd8fd436a57cb8169f4e93da4287dcd7485167302a8c4593566c5b135"} Apr 16 15:23:36.441054 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:36.441024 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-7d68b85d6b-65d9z" Apr 16 15:23:36.442334 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:36.442313 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-689b5cb958-t4tkl" event={"ID":"9568a9a5-9a0b-4604-bb9d-27103e414127","Type":"ContainerStarted","Data":"c5387f3fd6aa96e4d4eef590a7bc4b445d70230f5dc9db8a44a27a3767e20389"} Apr 16 15:23:36.442457 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:36.442353 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-689b5cb958-t4tkl" Apr 16 15:23:36.458512 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:36.458459 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-bff744c7f-zf8qp" podStartSLOduration=1.4629032149999999 podStartE2EDuration="5.458444774s" podCreationTimestamp="2026-04-16 15:23:31 +0000 UTC" firstStartedPulling="2026-04-16 15:23:31.984640348 +0000 UTC m=+705.424180886" lastFinishedPulling="2026-04-16 15:23:35.98018189 +0000 UTC m=+709.419722445" observedRunningTime="2026-04-16 15:23:36.457302902 +0000 UTC m=+709.896843463" watchObservedRunningTime="2026-04-16 15:23:36.458444774 +0000 UTC m=+709.897985332" Apr 16 15:23:36.459404 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:36.459373 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5d5bd7956c-wzqk5" podStartSLOduration=4.046650704 podStartE2EDuration="4.459366533s" podCreationTimestamp="2026-04-16 15:23:32 +0000 UTC" firstStartedPulling="2026-04-16 15:23:33.178265373 +0000 UTC m=+706.617805923" lastFinishedPulling="2026-04-16 15:23:33.590981212 +0000 UTC m=+707.030521752" observedRunningTime="2026-04-16 15:23:34.456737654 +0000 UTC m=+707.896278214" watchObservedRunningTime="2026-04-16 15:23:36.459366533 +0000 UTC m=+709.898907092" Apr 16 15:23:36.494282 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:36.494242 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-7d68b85d6b-65d9z" podStartSLOduration=1.470964498 podStartE2EDuration="5.494229869s" podCreationTimestamp="2026-04-16 15:23:31 +0000 UTC" firstStartedPulling="2026-04-16 15:23:31.957203066 +0000 UTC m=+705.396743604" lastFinishedPulling="2026-04-16 15:23:35.980468432 +0000 UTC m=+709.420008975" observedRunningTime="2026-04-16 15:23:36.49259334 +0000 UTC m=+709.932133903" watchObservedRunningTime="2026-04-16 15:23:36.494229869 +0000 UTC m=+709.933770429" Apr 16 15:23:36.494453 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:36.494404 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-689b5cb958-t4tkl" podStartSLOduration=1.281311474 podStartE2EDuration="4.494399055s" podCreationTimestamp="2026-04-16 15:23:32 +0000 UTC" firstStartedPulling="2026-04-16 15:23:32.773731444 +0000 UTC m=+706.213271994" lastFinishedPulling="2026-04-16 15:23:35.986819025 +0000 UTC m=+709.426359575" observedRunningTime="2026-04-16 15:23:36.475763754 +0000 UTC m=+709.915304315" watchObservedRunningTime="2026-04-16 15:23:36.494399055 +0000 UTC m=+709.933939615" Apr 16 15:23:42.451378 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:42.451350 
2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-689b5cb958-t4tkl" Apr 16 15:23:42.451900 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:42.451880 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7d68b85d6b-65d9z" Apr 16 15:23:42.513002 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:42.512969 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7d68b85d6b-65d9z"] Apr 16 15:23:42.513233 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:42.513176 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-7d68b85d6b-65d9z" podUID="8729cace-3c0c-491b-b62c-b38b7df4cb44" containerName="maas-api" containerID="cri-o://19d6cb7dd8fd436a57cb8169f4e93da4287dcd7485167302a8c4593566c5b135" gracePeriod=30 Apr 16 15:23:42.755459 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:42.755438 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7d68b85d6b-65d9z" Apr 16 15:23:42.881159 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:42.881129 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8729cace-3c0c-491b-b62c-b38b7df4cb44-maas-api-tls\") pod \"8729cace-3c0c-491b-b62c-b38b7df4cb44\" (UID: \"8729cace-3c0c-491b-b62c-b38b7df4cb44\") " Apr 16 15:23:42.881303 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:42.881191 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqkzd\" (UniqueName: \"kubernetes.io/projected/8729cace-3c0c-491b-b62c-b38b7df4cb44-kube-api-access-gqkzd\") pod \"8729cace-3c0c-491b-b62c-b38b7df4cb44\" (UID: \"8729cace-3c0c-491b-b62c-b38b7df4cb44\") " Apr 16 15:23:42.883354 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:42.883327 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8729cace-3c0c-491b-b62c-b38b7df4cb44-kube-api-access-gqkzd" (OuterVolumeSpecName: "kube-api-access-gqkzd") pod "8729cace-3c0c-491b-b62c-b38b7df4cb44" (UID: "8729cace-3c0c-491b-b62c-b38b7df4cb44"). InnerVolumeSpecName "kube-api-access-gqkzd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:23:42.883474 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:42.883357 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8729cace-3c0c-491b-b62c-b38b7df4cb44-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "8729cace-3c0c-491b-b62c-b38b7df4cb44" (UID: "8729cace-3c0c-491b-b62c-b38b7df4cb44"). InnerVolumeSpecName "maas-api-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:23:42.982480 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:42.982455 2575 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8729cace-3c0c-491b-b62c-b38b7df4cb44-maas-api-tls\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:23:42.982480 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:42.982478 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gqkzd\" (UniqueName: \"kubernetes.io/projected/8729cace-3c0c-491b-b62c-b38b7df4cb44-kube-api-access-gqkzd\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:23:43.466013 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:43.465983 2575 generic.go:358] "Generic (PLEG): container finished" podID="8729cace-3c0c-491b-b62c-b38b7df4cb44" containerID="19d6cb7dd8fd436a57cb8169f4e93da4287dcd7485167302a8c4593566c5b135" exitCode=0 Apr 16 15:23:43.466387 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:43.466047 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7d68b85d6b-65d9z" Apr 16 15:23:43.466387 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:43.466058 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7d68b85d6b-65d9z" event={"ID":"8729cace-3c0c-491b-b62c-b38b7df4cb44","Type":"ContainerDied","Data":"19d6cb7dd8fd436a57cb8169f4e93da4287dcd7485167302a8c4593566c5b135"} Apr 16 15:23:43.466387 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:43.466097 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7d68b85d6b-65d9z" event={"ID":"8729cace-3c0c-491b-b62c-b38b7df4cb44","Type":"ContainerDied","Data":"ff581d2efce33cef4f3cb6394a0bac531219d52dae496a7d5b75397a19bac3a5"} Apr 16 15:23:43.466387 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:43.466112 2575 scope.go:117] "RemoveContainer" containerID="19d6cb7dd8fd436a57cb8169f4e93da4287dcd7485167302a8c4593566c5b135" Apr 16 15:23:43.474049 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:43.474031 2575 scope.go:117] "RemoveContainer" containerID="19d6cb7dd8fd436a57cb8169f4e93da4287dcd7485167302a8c4593566c5b135" Apr 16 15:23:43.474288 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:23:43.474266 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19d6cb7dd8fd436a57cb8169f4e93da4287dcd7485167302a8c4593566c5b135\": container with ID starting with 19d6cb7dd8fd436a57cb8169f4e93da4287dcd7485167302a8c4593566c5b135 not found: ID does not exist" containerID="19d6cb7dd8fd436a57cb8169f4e93da4287dcd7485167302a8c4593566c5b135" Apr 16 15:23:43.474350 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:43.474295 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19d6cb7dd8fd436a57cb8169f4e93da4287dcd7485167302a8c4593566c5b135"} err="failed to get container status \"19d6cb7dd8fd436a57cb8169f4e93da4287dcd7485167302a8c4593566c5b135\": rpc error: code = NotFound desc = could not find container \"19d6cb7dd8fd436a57cb8169f4e93da4287dcd7485167302a8c4593566c5b135\": container with ID starting with 19d6cb7dd8fd436a57cb8169f4e93da4287dcd7485167302a8c4593566c5b135 not found: ID does not exist" Apr 16 15:23:43.486251 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:43.486231 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7d68b85d6b-65d9z"] Apr 16 15:23:43.492226 ip-10-0-129-254 
kubenswrapper[2575]: I0416 15:23:43.492199 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-7d68b85d6b-65d9z"] Apr 16 15:23:45.151363 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:45.151331 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8729cace-3c0c-491b-b62c-b38b7df4cb44" path="/var/lib/kubelet/pods/8729cace-3c0c-491b-b62c-b38b7df4cb44/volumes" Apr 16 15:23:47.451393 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:23:47.451364 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-bff744c7f-zf8qp" Apr 16 15:24:01.613355 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:01.613326 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-bff744c7f-zf8qp"] Apr 16 15:24:01.613869 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:01.613576 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-bff744c7f-zf8qp" podUID="a6d2fb44-6bc4-4634-be46-b7ffa11b840e" containerName="manager" containerID="cri-o://f95aac8443c7af0b658f9190c7c3dc987b189149dbb766fbd7bfd19aabb4ff15" gracePeriod=10 Apr 16 15:24:01.856377 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:01.856356 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-bff744c7f-zf8qp" Apr 16 15:24:02.034428 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:02.034401 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g8gf\" (UniqueName: \"kubernetes.io/projected/a6d2fb44-6bc4-4634-be46-b7ffa11b840e-kube-api-access-5g8gf\") pod \"a6d2fb44-6bc4-4634-be46-b7ffa11b840e\" (UID: \"a6d2fb44-6bc4-4634-be46-b7ffa11b840e\") " Apr 16 15:24:02.036349 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:02.036324 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6d2fb44-6bc4-4634-be46-b7ffa11b840e-kube-api-access-5g8gf" (OuterVolumeSpecName: "kube-api-access-5g8gf") pod "a6d2fb44-6bc4-4634-be46-b7ffa11b840e" (UID: "a6d2fb44-6bc4-4634-be46-b7ffa11b840e"). InnerVolumeSpecName "kube-api-access-5g8gf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:24:02.135806 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:02.135779 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5g8gf\" (UniqueName: \"kubernetes.io/projected/a6d2fb44-6bc4-4634-be46-b7ffa11b840e-kube-api-access-5g8gf\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:24:02.535490 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:02.535457 2575 generic.go:358] "Generic (PLEG): container finished" podID="a6d2fb44-6bc4-4634-be46-b7ffa11b840e" containerID="f95aac8443c7af0b658f9190c7c3dc987b189149dbb766fbd7bfd19aabb4ff15" exitCode=0 Apr 16 15:24:02.535652 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:02.535519 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-bff744c7f-zf8qp" Apr 16 15:24:02.535652 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:02.535513 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-bff744c7f-zf8qp" event={"ID":"a6d2fb44-6bc4-4634-be46-b7ffa11b840e","Type":"ContainerDied","Data":"f95aac8443c7af0b658f9190c7c3dc987b189149dbb766fbd7bfd19aabb4ff15"} Apr 16 15:24:02.535652 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:02.535624 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-bff744c7f-zf8qp" event={"ID":"a6d2fb44-6bc4-4634-be46-b7ffa11b840e","Type":"ContainerDied","Data":"2976273b36abd9acda6e45ae1a5256f29e452b7a6415ede80eed4eb807068279"} Apr 16 15:24:02.535652 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:02.535641 2575 scope.go:117] "RemoveContainer" containerID="f95aac8443c7af0b658f9190c7c3dc987b189149dbb766fbd7bfd19aabb4ff15" Apr 16 15:24:02.544177 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:02.544161 2575 scope.go:117] "RemoveContainer" containerID="f95aac8443c7af0b658f9190c7c3dc987b189149dbb766fbd7bfd19aabb4ff15" Apr 16 15:24:02.544412 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:24:02.544396 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f95aac8443c7af0b658f9190c7c3dc987b189149dbb766fbd7bfd19aabb4ff15\": container with ID starting with f95aac8443c7af0b658f9190c7c3dc987b189149dbb766fbd7bfd19aabb4ff15 not found: ID does not exist" containerID="f95aac8443c7af0b658f9190c7c3dc987b189149dbb766fbd7bfd19aabb4ff15" Apr 16 15:24:02.544494 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:02.544434 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f95aac8443c7af0b658f9190c7c3dc987b189149dbb766fbd7bfd19aabb4ff15"} err="failed to get container status \"f95aac8443c7af0b658f9190c7c3dc987b189149dbb766fbd7bfd19aabb4ff15\": rpc error: code = NotFound desc = could not find container \"f95aac8443c7af0b658f9190c7c3dc987b189149dbb766fbd7bfd19aabb4ff15\": container with ID starting with f95aac8443c7af0b658f9190c7c3dc987b189149dbb766fbd7bfd19aabb4ff15 not found: ID does not exist" Apr 16 15:24:02.572520 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:02.572498 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-bff744c7f-zf8qp"] Apr 16 15:24:02.576739 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:02.576716 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-bff744c7f-zf8qp"] Apr 16 15:24:03.151226 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:03.151191 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6d2fb44-6bc4-4634-be46-b7ffa11b840e" path="/var/lib/kubelet/pods/a6d2fb44-6bc4-4634-be46-b7ffa11b840e/volumes" Apr 16 15:24:16.783491 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.783454 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5"] Apr 16 15:24:16.784012 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.783994 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8729cace-3c0c-491b-b62c-b38b7df4cb44" containerName="maas-api" Apr 16 15:24:16.784056 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.784018 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8729cace-3c0c-491b-b62c-b38b7df4cb44" containerName="maas-api" Apr 16 15:24:16.784056 ip-10-0-129-254 
kubenswrapper[2575]: I0416 15:24:16.784037 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6d2fb44-6bc4-4634-be46-b7ffa11b840e" containerName="manager" Apr 16 15:24:16.784056 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.784047 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d2fb44-6bc4-4634-be46-b7ffa11b840e" containerName="manager" Apr 16 15:24:16.784150 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.784145 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8729cace-3c0c-491b-b62c-b38b7df4cb44" containerName="maas-api" Apr 16 15:24:16.784187 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.784154 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6d2fb44-6bc4-4634-be46-b7ffa11b840e" containerName="manager" Apr 16 15:24:16.787621 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.787595 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.790214 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.790190 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 16 15:24:16.791363 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.791339 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 16 15:24:16.791534 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.791344 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-wnsmp\"" Apr 16 15:24:16.791534 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.791345 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 16 15:24:16.794937 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.794911 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5"] Apr 16 15:24:16.859823 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.859790 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgj4m\" (UniqueName: \"kubernetes.io/projected/0cc2f6d3-b4e8-481e-a698-3a016e652b71-kube-api-access-jgj4m\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.859989 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.859877 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0cc2f6d3-b4e8-481e-a698-3a016e652b71-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.859989 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.859924 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0cc2f6d3-b4e8-481e-a698-3a016e652b71-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.860112 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.859978 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc2f6d3-b4e8-481e-a698-3a016e652b71-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.860112 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.860053 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cc2f6d3-b4e8-481e-a698-3a016e652b71-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.860112 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.860089 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0cc2f6d3-b4e8-481e-a698-3a016e652b71-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.961045 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.961007 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0cc2f6d3-b4e8-481e-a698-3a016e652b71-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.961229 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.961087 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0cc2f6d3-b4e8-481e-a698-3a016e652b71-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.961229 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.961124 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc2f6d3-b4e8-481e-a698-3a016e652b71-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.961229 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.961165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cc2f6d3-b4e8-481e-a698-3a016e652b71-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.961229 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.961189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0cc2f6d3-b4e8-481e-a698-3a016e652b71-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.961483 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.961254 2575 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-jgj4m\" (UniqueName: \"kubernetes.io/projected/0cc2f6d3-b4e8-481e-a698-3a016e652b71-kube-api-access-jgj4m\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.961625 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.961583 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0cc2f6d3-b4e8-481e-a698-3a016e652b71-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.961965 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.961941 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0cc2f6d3-b4e8-481e-a698-3a016e652b71-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.962119 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.962038 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cc2f6d3-b4e8-481e-a698-3a016e652b71-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.963975 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.963947 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0cc2f6d3-b4e8-481e-a698-3a016e652b71-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.964250 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.964231 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc2f6d3-b4e8-481e-a698-3a016e652b71-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:16.969186 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:16.969166 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgj4m\" (UniqueName: \"kubernetes.io/projected/0cc2f6d3-b4e8-481e-a698-3a016e652b71-kube-api-access-jgj4m\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-wwhj5\" (UID: \"0cc2f6d3-b4e8-481e-a698-3a016e652b71\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:17.100741 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:17.100661 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:17.232018 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:17.231954 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5"] Apr 16 15:24:17.234528 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:24:17.234498 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cc2f6d3_b4e8_481e_a698_3a016e652b71.slice/crio-e33454a55926a4e2b1526f96a3b7ff3eeff1441ab972dd7dddfe35b0868414fd WatchSource:0}: Error finding container e33454a55926a4e2b1526f96a3b7ff3eeff1441ab972dd7dddfe35b0868414fd: Status 404 returned error can't find the container with id e33454a55926a4e2b1526f96a3b7ff3eeff1441ab972dd7dddfe35b0868414fd Apr 16 15:24:17.593897 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:17.593856 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" event={"ID":"0cc2f6d3-b4e8-481e-a698-3a016e652b71","Type":"ContainerStarted","Data":"e33454a55926a4e2b1526f96a3b7ff3eeff1441ab972dd7dddfe35b0868414fd"} Apr 16 15:24:22.616303 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:22.616265 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" event={"ID":"0cc2f6d3-b4e8-481e-a698-3a016e652b71","Type":"ContainerStarted","Data":"1506f20f1f90db06bab04151a2a988f8d540ac8765cfee38ddc50d31bcec849f"} Apr 16 15:24:27.637442 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:27.637386 2575 generic.go:358] "Generic (PLEG): container finished" podID="0cc2f6d3-b4e8-481e-a698-3a016e652b71" containerID="1506f20f1f90db06bab04151a2a988f8d540ac8765cfee38ddc50d31bcec849f" exitCode=0 Apr 16 15:24:27.637876 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:27.637465 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" event={"ID":"0cc2f6d3-b4e8-481e-a698-3a016e652b71","Type":"ContainerDied","Data":"1506f20f1f90db06bab04151a2a988f8d540ac8765cfee38ddc50d31bcec849f"} Apr 16 15:24:31.655721 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:31.655680 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" event={"ID":"0cc2f6d3-b4e8-481e-a698-3a016e652b71","Type":"ContainerStarted","Data":"af6d6f1171333d0dca362141f702ef2169cf73ab3c931edb23d3e9ce58aed7d0"} Apr 16 15:24:31.656113 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:31.655893 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:24:31.675590 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:31.675543 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" podStartSLOduration=1.483971851 podStartE2EDuration="15.675526077s" podCreationTimestamp="2026-04-16 15:24:16 +0000 UTC" firstStartedPulling="2026-04-16 15:24:17.23618359 +0000 UTC m=+750.675724132" lastFinishedPulling="2026-04-16 15:24:31.427737816 +0000 UTC m=+764.867278358" observedRunningTime="2026-04-16 15:24:31.674744707 +0000 UTC m=+765.114285266" watchObservedRunningTime="2026-04-16 15:24:31.675526077 +0000 UTC m=+765.115066636" Apr 16 15:24:42.672926 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:24:42.672895 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-wwhj5" Apr 16 15:25:48.075054 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.075018 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5d5bd7956c-wzqk5"] Apr 16 15:25:48.075557 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.075215 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-5d5bd7956c-wzqk5" podUID="609e3bf9-4998-4a77-bf7d-47690b9d146b" containerName="authorino" containerID="cri-o://dde8b26f36bc6bd78189a0892a979c1e00f5a8ef1c41b916e1da7fd70a83024d" gracePeriod=30 Apr 16 15:25:48.319336 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.319314 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5d5bd7956c-wzqk5" Apr 16 15:25:48.435362 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.435342 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8z5d\" (UniqueName: \"kubernetes.io/projected/609e3bf9-4998-4a77-bf7d-47690b9d146b-kube-api-access-t8z5d\") pod \"609e3bf9-4998-4a77-bf7d-47690b9d146b\" (UID: \"609e3bf9-4998-4a77-bf7d-47690b9d146b\") " Apr 16 15:25:48.435523 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.435373 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/609e3bf9-4998-4a77-bf7d-47690b9d146b-tls-cert\") pod \"609e3bf9-4998-4a77-bf7d-47690b9d146b\" (UID: \"609e3bf9-4998-4a77-bf7d-47690b9d146b\") " Apr 16 15:25:48.437476 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.437443 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/609e3bf9-4998-4a77-bf7d-47690b9d146b-kube-api-access-t8z5d" (OuterVolumeSpecName: "kube-api-access-t8z5d") pod "609e3bf9-4998-4a77-bf7d-47690b9d146b" (UID: "609e3bf9-4998-4a77-bf7d-47690b9d146b"). InnerVolumeSpecName "kube-api-access-t8z5d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:25:48.445065 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.445038 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/609e3bf9-4998-4a77-bf7d-47690b9d146b-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "609e3bf9-4998-4a77-bf7d-47690b9d146b" (UID: "609e3bf9-4998-4a77-bf7d-47690b9d146b"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:25:48.536778 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.536750 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t8z5d\" (UniqueName: \"kubernetes.io/projected/609e3bf9-4998-4a77-bf7d-47690b9d146b-kube-api-access-t8z5d\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:25:48.536880 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.536781 2575 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/609e3bf9-4998-4a77-bf7d-47690b9d146b-tls-cert\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:25:48.931787 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.931697 2575 generic.go:358] "Generic (PLEG): container finished" podID="609e3bf9-4998-4a77-bf7d-47690b9d146b" containerID="dde8b26f36bc6bd78189a0892a979c1e00f5a8ef1c41b916e1da7fd70a83024d" exitCode=0 Apr 16 15:25:48.931787 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.931748 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5d5bd7956c-wzqk5" Apr 16 15:25:48.931988 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.931781 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5d5bd7956c-wzqk5" event={"ID":"609e3bf9-4998-4a77-bf7d-47690b9d146b","Type":"ContainerDied","Data":"dde8b26f36bc6bd78189a0892a979c1e00f5a8ef1c41b916e1da7fd70a83024d"} Apr 16 15:25:48.931988 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.931819 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5d5bd7956c-wzqk5" event={"ID":"609e3bf9-4998-4a77-bf7d-47690b9d146b","Type":"ContainerDied","Data":"70de2f3178b203470038e8cb14830581f9a09886779f75e7dc856561dd950bf6"} Apr 16 15:25:48.931988 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.931836 2575 scope.go:117] "RemoveContainer" containerID="dde8b26f36bc6bd78189a0892a979c1e00f5a8ef1c41b916e1da7fd70a83024d" Apr 16 15:25:48.941374 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.941352 2575 scope.go:117] "RemoveContainer" containerID="dde8b26f36bc6bd78189a0892a979c1e00f5a8ef1c41b916e1da7fd70a83024d" Apr 16 15:25:48.941640 ip-10-0-129-254 kubenswrapper[2575]: E0416 15:25:48.941623 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde8b26f36bc6bd78189a0892a979c1e00f5a8ef1c41b916e1da7fd70a83024d\": container with ID starting with dde8b26f36bc6bd78189a0892a979c1e00f5a8ef1c41b916e1da7fd70a83024d not found: ID does not exist" containerID="dde8b26f36bc6bd78189a0892a979c1e00f5a8ef1c41b916e1da7fd70a83024d" Apr 16 15:25:48.941716 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.941653 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde8b26f36bc6bd78189a0892a979c1e00f5a8ef1c41b916e1da7fd70a83024d"} err="failed to get container status \"dde8b26f36bc6bd78189a0892a979c1e00f5a8ef1c41b916e1da7fd70a83024d\": rpc error: code = NotFound desc = could not find container \"dde8b26f36bc6bd78189a0892a979c1e00f5a8ef1c41b916e1da7fd70a83024d\": container with ID starting with dde8b26f36bc6bd78189a0892a979c1e00f5a8ef1c41b916e1da7fd70a83024d not found: ID does not exist" Apr 16 15:25:48.959894 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.959871 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5d5bd7956c-wzqk5"] Apr 16 15:25:48.965005 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:48.964986 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-5d5bd7956c-wzqk5"] Apr 16 15:25:49.150976 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:25:49.150947 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="609e3bf9-4998-4a77-bf7d-47690b9d146b" path="/var/lib/kubelet/pods/609e3bf9-4998-4a77-bf7d-47690b9d146b/volumes" Apr 16 15:26:47.088454 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:26:47.088357 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:26:47.089638 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:26:47.089616 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:27:13.127757 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:13.127724 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["opendatahub/maas-controller-68fcd65c4f-6r8jr"] Apr 16 15:27:13.128181 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:13.128106 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="609e3bf9-4998-4a77-bf7d-47690b9d146b" containerName="authorino" Apr 16 15:27:13.128181 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:13.128119 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="609e3bf9-4998-4a77-bf7d-47690b9d146b" containerName="authorino" Apr 16 15:27:13.128181 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:13.128180 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="609e3bf9-4998-4a77-bf7d-47690b9d146b" containerName="authorino" Apr 16 15:27:13.131160 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:13.131140 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-68fcd65c4f-6r8jr" Apr 16 15:27:13.133775 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:13.133753 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-zzh56\"" Apr 16 15:27:13.139805 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:13.139782 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-68fcd65c4f-6r8jr"] Apr 16 15:27:13.246000 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:13.245968 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgrdx\" (UniqueName: \"kubernetes.io/projected/ece567a0-c607-4b75-9d5f-dd49b1eac224-kube-api-access-xgrdx\") pod \"maas-controller-68fcd65c4f-6r8jr\" (UID: \"ece567a0-c607-4b75-9d5f-dd49b1eac224\") " pod="opendatahub/maas-controller-68fcd65c4f-6r8jr" Apr 16 15:27:13.347012 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:13.346979 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgrdx\" (UniqueName: \"kubernetes.io/projected/ece567a0-c607-4b75-9d5f-dd49b1eac224-kube-api-access-xgrdx\") pod \"maas-controller-68fcd65c4f-6r8jr\" (UID: \"ece567a0-c607-4b75-9d5f-dd49b1eac224\") " pod="opendatahub/maas-controller-68fcd65c4f-6r8jr" Apr 16 15:27:13.355786 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:13.355760 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgrdx\" (UniqueName: \"kubernetes.io/projected/ece567a0-c607-4b75-9d5f-dd49b1eac224-kube-api-access-xgrdx\") pod \"maas-controller-68fcd65c4f-6r8jr\" (UID: \"ece567a0-c607-4b75-9d5f-dd49b1eac224\") " pod="opendatahub/maas-controller-68fcd65c4f-6r8jr" Apr 16 15:27:13.441596 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:13.441574 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-68fcd65c4f-6r8jr" Apr 16 15:27:13.564791 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:13.564766 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-68fcd65c4f-6r8jr"] Apr 16 15:27:13.566950 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:27:13.566922 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podece567a0_c607_4b75_9d5f_dd49b1eac224.slice/crio-ca31b66ae06ad4639d9c17dea7bdecaf2850be14e463025b027e55d50102cba8 WatchSource:0}: Error finding container ca31b66ae06ad4639d9c17dea7bdecaf2850be14e463025b027e55d50102cba8: Status 404 returned error can't find the container with id ca31b66ae06ad4639d9c17dea7bdecaf2850be14e463025b027e55d50102cba8 Apr 16 15:27:13.568100 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:13.568080 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:27:14.221440 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:14.221384 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-68fcd65c4f-6r8jr" event={"ID":"ece567a0-c607-4b75-9d5f-dd49b1eac224","Type":"ContainerStarted","Data":"5d5e017cafa514c21ed2e39dfac081677fa934d437fec8c271fea6822de28738"} Apr 16 15:27:14.221798 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:14.221450 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-68fcd65c4f-6r8jr" event={"ID":"ece567a0-c607-4b75-9d5f-dd49b1eac224","Type":"ContainerStarted","Data":"ca31b66ae06ad4639d9c17dea7bdecaf2850be14e463025b027e55d50102cba8"} Apr 16 15:27:14.221798 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:14.221475 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-68fcd65c4f-6r8jr" Apr 16 15:27:14.240642 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:14.240554 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-68fcd65c4f-6r8jr" podStartSLOduration=0.829687599 podStartE2EDuration="1.240540779s" podCreationTimestamp="2026-04-16 15:27:13 +0000 UTC" firstStartedPulling="2026-04-16 15:27:13.568238702 +0000 UTC m=+927.007779241" lastFinishedPulling="2026-04-16 15:27:13.979091879 +0000 UTC m=+927.418632421" observedRunningTime="2026-04-16 15:27:14.239447717 +0000 UTC m=+927.678988278" watchObservedRunningTime="2026-04-16 15:27:14.240540779 +0000 UTC m=+927.680081338" Apr 16 15:27:25.230236 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:27:25.230203 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-68fcd65c4f-6r8jr" Apr 16 15:31:47.114693 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:31:47.114662 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:31:47.116374 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:31:47.116350 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:36:47.140171 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:36:47.140144 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:36:47.142521 ip-10-0-129-254 
kubenswrapper[2575]: I0416 15:36:47.142497 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:37:22.386831 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:37:22.386802 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk"] Apr 16 15:37:22.387331 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:37:22.387040 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk" podUID="bfe3846a-6b49-41e7-93e5-7dd464eca9c3" containerName="manager" containerID="cri-o://85e406b39ac135806fe472957e6c120a9f498876a45b1e7100db0f78c0c48e5c" gracePeriod=10 Apr 16 15:37:23.307934 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:37:23.307905 2575 generic.go:358] "Generic (PLEG): container finished" podID="bfe3846a-6b49-41e7-93e5-7dd464eca9c3" containerID="85e406b39ac135806fe472957e6c120a9f498876a45b1e7100db0f78c0c48e5c" exitCode=0 Apr 16 15:37:23.308066 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:37:23.307975 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk" event={"ID":"bfe3846a-6b49-41e7-93e5-7dd464eca9c3","Type":"ContainerDied","Data":"85e406b39ac135806fe472957e6c120a9f498876a45b1e7100db0f78c0c48e5c"} Apr 16 15:37:23.333760 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:37:23.333740 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk" Apr 16 15:37:23.518868 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:37:23.518837 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bfe3846a-6b49-41e7-93e5-7dd464eca9c3-extensions-socket-volume\") pod \"bfe3846a-6b49-41e7-93e5-7dd464eca9c3\" (UID: \"bfe3846a-6b49-41e7-93e5-7dd464eca9c3\") " Apr 16 15:37:23.519222 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:37:23.518891 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgknq\" (UniqueName: \"kubernetes.io/projected/bfe3846a-6b49-41e7-93e5-7dd464eca9c3-kube-api-access-hgknq\") pod \"bfe3846a-6b49-41e7-93e5-7dd464eca9c3\" (UID: \"bfe3846a-6b49-41e7-93e5-7dd464eca9c3\") " Apr 16 15:37:23.519222 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:37:23.519184 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfe3846a-6b49-41e7-93e5-7dd464eca9c3-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "bfe3846a-6b49-41e7-93e5-7dd464eca9c3" (UID: "bfe3846a-6b49-41e7-93e5-7dd464eca9c3"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:37:23.520789 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:37:23.520762 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfe3846a-6b49-41e7-93e5-7dd464eca9c3-kube-api-access-hgknq" (OuterVolumeSpecName: "kube-api-access-hgknq") pod "bfe3846a-6b49-41e7-93e5-7dd464eca9c3" (UID: "bfe3846a-6b49-41e7-93e5-7dd464eca9c3"). InnerVolumeSpecName "kube-api-access-hgknq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:37:23.620347 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:37:23.620323 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bfe3846a-6b49-41e7-93e5-7dd464eca9c3-extensions-socket-volume\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:37:23.620347 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:37:23.620342 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hgknq\" (UniqueName: \"kubernetes.io/projected/bfe3846a-6b49-41e7-93e5-7dd464eca9c3-kube-api-access-hgknq\") on node \"ip-10-0-129-254.ec2.internal\" DevicePath \"\"" Apr 16 15:37:24.312219 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:37:24.312183 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk" event={"ID":"bfe3846a-6b49-41e7-93e5-7dd464eca9c3","Type":"ContainerDied","Data":"bbf9764a6cc09bcd5feba10f8eb25f727b93867a4b2bfc40461ae8406905eb04"} Apr 16 15:37:24.312397 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:37:24.312227 2575 scope.go:117] "RemoveContainer" containerID="85e406b39ac135806fe472957e6c120a9f498876a45b1e7100db0f78c0c48e5c" Apr 16 15:37:24.312397 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:37:24.312195 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk" Apr 16 15:37:24.332939 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:37:24.332917 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk"] Apr 16 15:37:24.336647 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:37:24.336621 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-g8vhk"] Apr 16 15:37:25.151264 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:37:25.151236 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfe3846a-6b49-41e7-93e5-7dd464eca9c3" path="/var/lib/kubelet/pods/bfe3846a-6b49-41e7-93e5-7dd464eca9c3/volumes" Apr 16 15:38:28.718995 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:28.718961 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf"] Apr 16 15:38:28.719466 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:28.719315 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bfe3846a-6b49-41e7-93e5-7dd464eca9c3" containerName="manager" Apr 16 15:38:28.719466 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:28.719325 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe3846a-6b49-41e7-93e5-7dd464eca9c3" containerName="manager" Apr 16 15:38:28.719466 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:28.719381 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="bfe3846a-6b49-41e7-93e5-7dd464eca9c3" containerName="manager" Apr 16 15:38:28.722468 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:28.722452 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf" Apr 16 15:38:28.725157 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:28.725131 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-v7gzl\"" Apr 16 15:38:28.732139 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:28.732117 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf"] Apr 16 15:38:28.841989 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:28.841958 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbjf6\" (UniqueName: \"kubernetes.io/projected/e372bda5-48aa-4fe9-83b8-70c44af9d43d-kube-api-access-kbjf6\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf\" (UID: \"e372bda5-48aa-4fe9-83b8-70c44af9d43d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf" Apr 16 15:38:28.842147 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:28.842010 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e372bda5-48aa-4fe9-83b8-70c44af9d43d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf\" (UID: \"e372bda5-48aa-4fe9-83b8-70c44af9d43d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf" Apr 16 15:38:28.942498 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:28.942469 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbjf6\" (UniqueName: \"kubernetes.io/projected/e372bda5-48aa-4fe9-83b8-70c44af9d43d-kube-api-access-kbjf6\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf\" (UID: \"e372bda5-48aa-4fe9-83b8-70c44af9d43d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf" Apr 16 15:38:28.942618 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:28.942513 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e372bda5-48aa-4fe9-83b8-70c44af9d43d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf\" (UID: \"e372bda5-48aa-4fe9-83b8-70c44af9d43d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf" Apr 16 15:38:28.942867 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:28.942842 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e372bda5-48aa-4fe9-83b8-70c44af9d43d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf\" (UID: \"e372bda5-48aa-4fe9-83b8-70c44af9d43d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf" Apr 16 15:38:28.950822 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:28.950799 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbjf6\" (UniqueName: \"kubernetes.io/projected/e372bda5-48aa-4fe9-83b8-70c44af9d43d-kube-api-access-kbjf6\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf\" (UID: \"e372bda5-48aa-4fe9-83b8-70c44af9d43d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf" Apr 16 15:38:29.033516 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:29.033457 2575 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf" Apr 16 15:38:29.154919 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:29.154886 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf"] Apr 16 15:38:29.158103 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:38:29.158076 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode372bda5_48aa_4fe9_83b8_70c44af9d43d.slice/crio-e17590f571094b6cac76f74858a49bed994d1b082337488f61a446c39d668bb6 WatchSource:0}: Error finding container e17590f571094b6cac76f74858a49bed994d1b082337488f61a446c39d668bb6: Status 404 returned error can't find the container with id e17590f571094b6cac76f74858a49bed994d1b082337488f61a446c39d668bb6 Apr 16 15:38:29.160315 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:29.160300 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:38:29.528780 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:29.528748 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf" event={"ID":"e372bda5-48aa-4fe9-83b8-70c44af9d43d","Type":"ContainerStarted","Data":"d7a81b1ba9d72af270281241c5e5d7d56efa8962daab2037a59b21f967961bdd"} Apr 16 15:38:29.528780 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:29.528782 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf" event={"ID":"e372bda5-48aa-4fe9-83b8-70c44af9d43d","Type":"ContainerStarted","Data":"e17590f571094b6cac76f74858a49bed994d1b082337488f61a446c39d668bb6"} Apr 16 15:38:29.528994 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:29.528807 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf" Apr 16 15:38:29.548405 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:29.548350 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf" podStartSLOduration=1.5483368259999999 podStartE2EDuration="1.548336826s" podCreationTimestamp="2026-04-16 15:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:38:29.546026061 +0000 UTC m=+1602.985566632" watchObservedRunningTime="2026-04-16 15:38:29.548336826 +0000 UTC m=+1602.987877385" Apr 16 15:38:40.534319 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:38:40.534248 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf" Apr 16 15:41:47.167485 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:41:47.167458 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:41:47.169173 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:41:47.169152 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:46:47.192814 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:46:47.192784 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:46:47.195376 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:46:47.194562 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:48:13.754051 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:13.753973 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-689b5cb958-t4tkl_9568a9a5-9a0b-4604-bb9d-27103e414127/maas-api/0.log" Apr 16 15:48:13.867776 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:13.867744 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-68fcd65c4f-6r8jr_ece567a0-c607-4b75-9d5f-dd49b1eac224/manager/0.log" Apr 16 15:48:14.226048 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:14.226020 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-68df4b58f7-khzn6_a0decef0-2289-44d3-b69e-9006bab3f5ee/manager/0.log" Apr 16 15:48:15.990845 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:15.990813 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-t5d2r_6f81a8e1-37e0-42a5-9fa8-be64e757d67c/manager/0.log" Apr 16 15:48:16.219082 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:16.219046 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-8hsds_081a4915-16f4-4cb3-9eb4-9ac457748dcc/registry-server/0.log" Apr 16 15:48:16.334484 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:16.334394 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf_e372bda5-48aa-4fe9-83b8-70c44af9d43d/manager/0.log" Apr 16 15:48:16.559176 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:16.559149 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-jmk6z_d3e39f88-26ca-46bd-85e2-ab2cfe69e539/manager/0.log" Apr 16 15:48:16.901387 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:16.901364 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg_045c7f45-5e91-4566-9224-a16f19f43f4d/istio-proxy/0.log" Apr 16 15:48:17.360892 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:17.360852 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-2mhs4_dfff54cd-d68a-49be-9f06-9272513ee4e3/istio-proxy/0.log" Apr 16 15:48:18.046066 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:18.046038 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-wwhj5_0cc2f6d3-b4e8-481e-a698-3a016e652b71/storage-initializer/0.log" Apr 16 15:48:18.052713 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:18.052690 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-wwhj5_0cc2f6d3-b4e8-481e-a698-3a016e652b71/main/0.log" Apr 16 15:48:22.027753 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:22.027721 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t57q5/must-gather-jwx5h"] Apr 16 15:48:22.031293 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:22.031277 2575 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-must-gather-t57q5/must-gather-jwx5h" Apr 16 15:48:22.033821 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:22.033793 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-t57q5\"/\"default-dockercfg-88qxm\"" Apr 16 15:48:22.033947 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:22.033793 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-t57q5\"/\"openshift-service-ca.crt\"" Apr 16 15:48:22.033947 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:22.033891 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-t57q5\"/\"kube-root-ca.crt\"" Apr 16 15:48:22.048327 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:22.048303 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t57q5/must-gather-jwx5h"] Apr 16 15:48:22.056405 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:22.056385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2367abe0-7e47-47fd-a26a-de4d4a82d1a5-must-gather-output\") pod \"must-gather-jwx5h\" (UID: \"2367abe0-7e47-47fd-a26a-de4d4a82d1a5\") " pod="openshift-must-gather-t57q5/must-gather-jwx5h" Apr 16 15:48:22.056531 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:22.056469 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5nsx\" (UniqueName: \"kubernetes.io/projected/2367abe0-7e47-47fd-a26a-de4d4a82d1a5-kube-api-access-g5nsx\") pod \"must-gather-jwx5h\" (UID: \"2367abe0-7e47-47fd-a26a-de4d4a82d1a5\") " pod="openshift-must-gather-t57q5/must-gather-jwx5h" Apr 16 15:48:22.157311 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:22.157281 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5nsx\" (UniqueName: \"kubernetes.io/projected/2367abe0-7e47-47fd-a26a-de4d4a82d1a5-kube-api-access-g5nsx\") pod \"must-gather-jwx5h\" (UID: \"2367abe0-7e47-47fd-a26a-de4d4a82d1a5\") " pod="openshift-must-gather-t57q5/must-gather-jwx5h" Apr 16 15:48:22.157499 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:22.157365 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2367abe0-7e47-47fd-a26a-de4d4a82d1a5-must-gather-output\") pod \"must-gather-jwx5h\" (UID: \"2367abe0-7e47-47fd-a26a-de4d4a82d1a5\") " pod="openshift-must-gather-t57q5/must-gather-jwx5h" Apr 16 15:48:22.157704 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:22.157687 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2367abe0-7e47-47fd-a26a-de4d4a82d1a5-must-gather-output\") pod \"must-gather-jwx5h\" (UID: \"2367abe0-7e47-47fd-a26a-de4d4a82d1a5\") " pod="openshift-must-gather-t57q5/must-gather-jwx5h" Apr 16 15:48:22.164488 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:22.164469 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5nsx\" (UniqueName: \"kubernetes.io/projected/2367abe0-7e47-47fd-a26a-de4d4a82d1a5-kube-api-access-g5nsx\") pod \"must-gather-jwx5h\" (UID: \"2367abe0-7e47-47fd-a26a-de4d4a82d1a5\") " pod="openshift-must-gather-t57q5/must-gather-jwx5h" Apr 16 15:48:22.340432 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:22.340354 2575 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t57q5/must-gather-jwx5h" Apr 16 15:48:22.464235 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:22.464212 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t57q5/must-gather-jwx5h"] Apr 16 15:48:22.466911 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:48:22.466877 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2367abe0_7e47_47fd_a26a_de4d4a82d1a5.slice/crio-24179db6c825ff355c665b1a088e975d5e1e31fecf41fbc1599716468ebae0f7 WatchSource:0}: Error finding container 24179db6c825ff355c665b1a088e975d5e1e31fecf41fbc1599716468ebae0f7: Status 404 returned error can't find the container with id 24179db6c825ff355c665b1a088e975d5e1e31fecf41fbc1599716468ebae0f7 Apr 16 15:48:22.468695 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:22.468675 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:48:22.548360 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:22.548331 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t57q5/must-gather-jwx5h" event={"ID":"2367abe0-7e47-47fd-a26a-de4d4a82d1a5","Type":"ContainerStarted","Data":"24179db6c825ff355c665b1a088e975d5e1e31fecf41fbc1599716468ebae0f7"} Apr 16 15:48:23.553888 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:23.553857 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t57q5/must-gather-jwx5h" event={"ID":"2367abe0-7e47-47fd-a26a-de4d4a82d1a5","Type":"ContainerStarted","Data":"45a4bc175c0dc82267f3a638ccdf97b528e3f5210cf4ffbc2f7d4fd2f0a162ce"} Apr 16 15:48:23.553888 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:23.553894 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t57q5/must-gather-jwx5h" event={"ID":"2367abe0-7e47-47fd-a26a-de4d4a82d1a5","Type":"ContainerStarted","Data":"c8691dfa820fb2596d3f738c2a0a1d3ca5c7356ecded1a734a25c0f7c22d0d7f"} Apr 16 15:48:23.580525 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:23.580461 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t57q5/must-gather-jwx5h" podStartSLOduration=0.782040134 podStartE2EDuration="1.580443706s" podCreationTimestamp="2026-04-16 15:48:22 +0000 UTC" firstStartedPulling="2026-04-16 15:48:22.468839045 +0000 UTC m=+2195.908379587" lastFinishedPulling="2026-04-16 15:48:23.267242608 +0000 UTC m=+2196.706783159" observedRunningTime="2026-04-16 15:48:23.570602767 +0000 UTC m=+2197.010143328" watchObservedRunningTime="2026-04-16 15:48:23.580443706 +0000 UTC m=+2197.019984268" Apr 16 15:48:24.786677 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:24.786626 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-ghhxd_6fac2453-74e6-4f70-8221-26f46efaa1a5/global-pull-secret-syncer/0.log" Apr 16 15:48:24.973667 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:24.973616 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-npqcs_2cca38e9-1e0a-4179-87b0-74cf0d052206/konnectivity-agent/0.log" Apr 16 15:48:24.997120 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:24.997077 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-254.ec2.internal_840c483ab6972a395b52b57432ebf0a1/haproxy/0.log" Apr 16 15:48:29.187369 ip-10-0-129-254 kubenswrapper[2575]: I0416 
15:48:29.187330 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-t5d2r_6f81a8e1-37e0-42a5-9fa8-be64e757d67c/manager/0.log" Apr 16 15:48:29.255295 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:29.255258 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-8hsds_081a4915-16f4-4cb3-9eb4-9ac457748dcc/registry-server/0.log" Apr 16 15:48:29.332775 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:29.332686 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-jhjkf_e372bda5-48aa-4fe9-83b8-70c44af9d43d/manager/0.log" Apr 16 15:48:29.463792 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:29.463733 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-jmk6z_d3e39f88-26ca-46bd-85e2-ab2cfe69e539/manager/0.log" Apr 16 15:48:31.085614 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.085540 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-jdrgf_68dfc5e0-5a22-4b10-9af8-2235bd3e6c1e/cluster-monitoring-operator/0.log" Apr 16 15:48:31.113111 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.113085 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-6d72x_67d34f25-682d-44ce-b253-d768fba43c67/kube-state-metrics/0.log" Apr 16 15:48:31.135259 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.135217 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-6d72x_67d34f25-682d-44ce-b253-d768fba43c67/kube-rbac-proxy-main/0.log" Apr 16 15:48:31.158212 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.158188 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-6d72x_67d34f25-682d-44ce-b253-d768fba43c67/kube-rbac-proxy-self/0.log" Apr 16 15:48:31.212362 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.212292 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-85hsg_baaa7282-7e35-4700-988e-58aef4e5d8f4/monitoring-plugin/0.log" Apr 16 15:48:31.247594 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.247562 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dmgkg_e24a16f5-1dcd-41d0-8b97-b13b1ecdb278/node-exporter/0.log" Apr 16 15:48:31.273910 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.273884 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dmgkg_e24a16f5-1dcd-41d0-8b97-b13b1ecdb278/kube-rbac-proxy/0.log" Apr 16 15:48:31.301171 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.301139 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dmgkg_e24a16f5-1dcd-41d0-8b97-b13b1ecdb278/init-textfile/0.log" Apr 16 15:48:31.467681 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.467648 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-2kklt_808117b5-475a-4911-9e51-8d3a34537662/kube-rbac-proxy-main/0.log" Apr 16 15:48:31.493662 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.493632 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-2kklt_808117b5-475a-4911-9e51-8d3a34537662/kube-rbac-proxy-self/0.log" Apr 16 15:48:31.515069 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.515037 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-2kklt_808117b5-475a-4911-9e51-8d3a34537662/openshift-state-metrics/0.log" Apr 16 15:48:31.559951 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.559922 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f0e55f10-0500-48e1-b262-7e6164b733bd/prometheus/0.log" Apr 16 15:48:31.595013 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.594989 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f0e55f10-0500-48e1-b262-7e6164b733bd/config-reloader/0.log" Apr 16 15:48:31.619433 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.619391 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f0e55f10-0500-48e1-b262-7e6164b733bd/thanos-sidecar/0.log" Apr 16 15:48:31.641161 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.641118 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f0e55f10-0500-48e1-b262-7e6164b733bd/kube-rbac-proxy-web/0.log" Apr 16 15:48:31.667576 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.667544 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f0e55f10-0500-48e1-b262-7e6164b733bd/kube-rbac-proxy/0.log" Apr 16 15:48:31.691643 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.691611 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f0e55f10-0500-48e1-b262-7e6164b733bd/kube-rbac-proxy-thanos/0.log" Apr 16 15:48:31.716143 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:31.716116 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f0e55f10-0500-48e1-b262-7e6164b733bd/init-config-reloader/0.log" Apr 16 15:48:33.411406 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.411368 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn"] Apr 16 15:48:33.417727 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.417704 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:33.428219 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.428190 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn"] Apr 16 15:48:33.468547 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.468492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/52e01f22-dc4f-426a-810c-bb032efb4d2c-podres\") pod \"perf-node-gather-daemonset-2rssn\" (UID: \"52e01f22-dc4f-426a-810c-bb032efb4d2c\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:33.468730 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.468606 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7z2z\" (UniqueName: \"kubernetes.io/projected/52e01f22-dc4f-426a-810c-bb032efb4d2c-kube-api-access-s7z2z\") pod \"perf-node-gather-daemonset-2rssn\" (UID: \"52e01f22-dc4f-426a-810c-bb032efb4d2c\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:33.468730 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.468686 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/52e01f22-dc4f-426a-810c-bb032efb4d2c-proc\") pod \"perf-node-gather-daemonset-2rssn\" (UID: \"52e01f22-dc4f-426a-810c-bb032efb4d2c\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:33.468730 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.468712 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52e01f22-dc4f-426a-810c-bb032efb4d2c-sys\") pod \"perf-node-gather-daemonset-2rssn\" (UID: \"52e01f22-dc4f-426a-810c-bb032efb4d2c\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:33.468909 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.468788 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/52e01f22-dc4f-426a-810c-bb032efb4d2c-lib-modules\") pod \"perf-node-gather-daemonset-2rssn\" (UID: \"52e01f22-dc4f-426a-810c-bb032efb4d2c\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:33.569602 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.569562 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7z2z\" (UniqueName: \"kubernetes.io/projected/52e01f22-dc4f-426a-810c-bb032efb4d2c-kube-api-access-s7z2z\") pod \"perf-node-gather-daemonset-2rssn\" (UID: \"52e01f22-dc4f-426a-810c-bb032efb4d2c\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:33.569791 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.569644 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/52e01f22-dc4f-426a-810c-bb032efb4d2c-proc\") pod \"perf-node-gather-daemonset-2rssn\" (UID: \"52e01f22-dc4f-426a-810c-bb032efb4d2c\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:33.569791 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.569669 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/52e01f22-dc4f-426a-810c-bb032efb4d2c-sys\") pod \"perf-node-gather-daemonset-2rssn\" (UID: \"52e01f22-dc4f-426a-810c-bb032efb4d2c\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:33.569791 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.569727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/52e01f22-dc4f-426a-810c-bb032efb4d2c-lib-modules\") pod \"perf-node-gather-daemonset-2rssn\" (UID: \"52e01f22-dc4f-426a-810c-bb032efb4d2c\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:33.569791 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.569769 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/52e01f22-dc4f-426a-810c-bb032efb4d2c-proc\") pod \"perf-node-gather-daemonset-2rssn\" (UID: \"52e01f22-dc4f-426a-810c-bb032efb4d2c\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:33.569791 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.569785 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/52e01f22-dc4f-426a-810c-bb032efb4d2c-podres\") pod \"perf-node-gather-daemonset-2rssn\" (UID: \"52e01f22-dc4f-426a-810c-bb032efb4d2c\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:33.570022 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.569826 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52e01f22-dc4f-426a-810c-bb032efb4d2c-sys\") pod \"perf-node-gather-daemonset-2rssn\" (UID: \"52e01f22-dc4f-426a-810c-bb032efb4d2c\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:33.570022 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.569911 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/52e01f22-dc4f-426a-810c-bb032efb4d2c-podres\") pod \"perf-node-gather-daemonset-2rssn\" (UID: \"52e01f22-dc4f-426a-810c-bb032efb4d2c\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:33.570022 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.569948 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/52e01f22-dc4f-426a-810c-bb032efb4d2c-lib-modules\") pod \"perf-node-gather-daemonset-2rssn\" (UID: \"52e01f22-dc4f-426a-810c-bb032efb4d2c\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:33.579050 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.579017 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7z2z\" (UniqueName: \"kubernetes.io/projected/52e01f22-dc4f-426a-810c-bb032efb4d2c-kube-api-access-s7z2z\") pod \"perf-node-gather-daemonset-2rssn\" (UID: \"52e01f22-dc4f-426a-810c-bb032efb4d2c\") " pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:33.732148 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.732118 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:33.900295 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:33.900232 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn"] Apr 16 15:48:33.904637 ip-10-0-129-254 kubenswrapper[2575]: W0416 15:48:33.904611 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod52e01f22_dc4f_426a_810c_bb032efb4d2c.slice/crio-7b035b45f1cc190eed93987292324376b24ff91c74815e475036fb1b7e841541 WatchSource:0}: Error finding container 7b035b45f1cc190eed93987292324376b24ff91c74815e475036fb1b7e841541: Status 404 returned error can't find the container with id 7b035b45f1cc190eed93987292324376b24ff91c74815e475036fb1b7e841541 Apr 16 15:48:34.609749 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:34.609622 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" event={"ID":"52e01f22-dc4f-426a-810c-bb032efb4d2c","Type":"ContainerStarted","Data":"b7b94bfdd3667f45724218f8fb00e70c9031c03701ff9bae66675f05c9ff660a"} Apr 16 15:48:34.609749 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:34.609668 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" event={"ID":"52e01f22-dc4f-426a-810c-bb032efb4d2c","Type":"ContainerStarted","Data":"7b035b45f1cc190eed93987292324376b24ff91c74815e475036fb1b7e841541"} Apr 16 15:48:34.610232 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:34.609860 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:34.623886 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:34.623821 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" podStartSLOduration=1.623802585 podStartE2EDuration="1.623802585s" podCreationTimestamp="2026-04-16 15:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:48:34.623019693 +0000 UTC m=+2208.062560253" watchObservedRunningTime="2026-04-16 15:48:34.623802585 +0000 UTC m=+2208.063343147" Apr 16 15:48:35.672314 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:35.672285 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pm67f_29f4a0db-de17-476d-97ad-df37fd2a5065/dns/0.log" Apr 16 15:48:35.692041 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:35.692019 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pm67f_29f4a0db-de17-476d-97ad-df37fd2a5065/kube-rbac-proxy/0.log" Apr 16 15:48:35.738464 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:35.738439 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kjtwg_74d85d78-e8d8-4b5c-a950-f65047122164/dns-node-resolver/0.log" Apr 16 15:48:36.212624 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:36.212588 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7bd44bc866-ll72x_f04e3364-1f42-4f56-83fb-a1a55799b48e/registry/0.log" Apr 16 15:48:36.276992 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:36.276971 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-thxgc_b6b5389a-1cfb-46bd-bdee-65b24755f000/node-ca/0.log" Apr 16 15:48:37.049577 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:37.049553 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfdp2vg_045c7f45-5e91-4566-9224-a16f19f43f4d/istio-proxy/0.log" Apr 16 15:48:37.317536 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:37.317461 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-2mhs4_dfff54cd-d68a-49be-9f06-9272513ee4e3/istio-proxy/0.log" Apr 16 15:48:37.852481 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:37.852453 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jrg65_d01959bc-3d04-456b-9dbe-ea153e10fa05/serve-healthcheck-canary/0.log" Apr 16 15:48:38.386205 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:38.386177 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-558cz_b24b7c4b-6257-4d81-8ce8-c2579532c74c/kube-rbac-proxy/0.log" Apr 16 15:48:38.407279 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:38.407250 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-558cz_b24b7c4b-6257-4d81-8ce8-c2579532c74c/exporter/0.log" Apr 16 15:48:38.427488 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:38.427464 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-558cz_b24b7c4b-6257-4d81-8ce8-c2579532c74c/extractor/0.log" Apr 16 15:48:40.339088 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:40.339061 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-689b5cb958-t4tkl_9568a9a5-9a0b-4604-bb9d-27103e414127/maas-api/0.log" Apr 16 15:48:40.397176 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:40.397124 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-68fcd65c4f-6r8jr_ece567a0-c607-4b75-9d5f-dd49b1eac224/manager/0.log" Apr 16 15:48:40.492814 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:40.492786 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-68df4b58f7-khzn6_a0decef0-2289-44d3-b69e-9006bab3f5ee/manager/0.log" Apr 16 15:48:40.627346 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:40.627275 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-t57q5/perf-node-gather-daemonset-2rssn" Apr 16 15:48:41.896139 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:41.896100 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6fc585dfcd-fdfmc_f9bbda6a-86d0-46aa-bf73-a4e09cb4ea07/manager/0.log" Apr 16 15:48:41.920007 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:41.919980 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-k8tfb_50b12eb1-0349-4174-91ff-cec8fd00e9df/openshift-lws-operator/0.log" Apr 16 15:48:47.889808 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:47.889782 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k26s9_df3ba38d-f8f3-45ad-90ec-e49f33bed1ff/kube-multus-additional-cni-plugins/0.log" Apr 16 15:48:47.911600 ip-10-0-129-254 kubenswrapper[2575]: I0416 
15:48:47.911574 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k26s9_df3ba38d-f8f3-45ad-90ec-e49f33bed1ff/egress-router-binary-copy/0.log" Apr 16 15:48:47.931308 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:47.931282 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k26s9_df3ba38d-f8f3-45ad-90ec-e49f33bed1ff/cni-plugins/0.log" Apr 16 15:48:47.956253 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:47.956232 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k26s9_df3ba38d-f8f3-45ad-90ec-e49f33bed1ff/bond-cni-plugin/0.log" Apr 16 15:48:47.976182 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:47.976164 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k26s9_df3ba38d-f8f3-45ad-90ec-e49f33bed1ff/routeoverride-cni/0.log" Apr 16 15:48:47.996100 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:47.996081 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k26s9_df3ba38d-f8f3-45ad-90ec-e49f33bed1ff/whereabouts-cni-bincopy/0.log" Apr 16 15:48:48.015941 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:48.015924 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k26s9_df3ba38d-f8f3-45ad-90ec-e49f33bed1ff/whereabouts-cni/0.log" Apr 16 15:48:48.071612 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:48.071585 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zwlx7_90e76464-4794-4a6f-bdfc-1010042e6181/kube-multus/0.log" Apr 16 15:48:48.134557 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:48.134531 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-whwdh_9973bf97-babd-47b9-a129-38dbed119c77/network-metrics-daemon/0.log" Apr 16 15:48:48.154353 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:48.154289 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-whwdh_9973bf97-babd-47b9-a129-38dbed119c77/kube-rbac-proxy/0.log" Apr 16 15:48:48.998730 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:48.998700 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-controller/0.log" Apr 16 15:48:49.034805 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:49.034782 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/0.log" Apr 16 15:48:49.047215 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:49.047192 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovn-acl-logging/1.log" Apr 16 15:48:49.081038 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:49.081020 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/kube-rbac-proxy-node/0.log" Apr 16 15:48:49.104977 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:49.104958 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 15:48:49.126120 ip-10-0-129-254 
kubenswrapper[2575]: I0416 15:48:49.126102 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/northd/0.log" Apr 16 15:48:49.149986 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:49.149966 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/nbdb/0.log" Apr 16 15:48:49.181405 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:49.181379 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/sbdb/0.log" Apr 16 15:48:49.299071 ip-10-0-129-254 kubenswrapper[2575]: I0416 15:48:49.298994 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8dtn7_8c646c34-edd0-4bb7-ac77-7a47bafd421b/ovnkube-controller/0.log"