Apr 20 16:20:33.038860 ip-10-0-130-72 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 16:20:33.038872 ip-10-0-130-72 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 16:20:33.038881 ip-10-0-130-72 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 16:20:33.039178 ip-10-0-130-72 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 16:20:43.157455 ip-10-0-130-72 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 16:20:43.157470 ip-10-0-130-72 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 1508bbd359d54aa29ac6b1646add34a9 --
Apr 20 16:23:16.770447 ip-10-0-130-72 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 16:23:17.174791 ip-10-0-130-72 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 16:23:17.174791 ip-10-0-130-72 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 16:23:17.174791 ip-10-0-130-72 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 16:23:17.174791 ip-10-0-130-72 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 16:23:17.176136 ip-10-0-130-72 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
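The deprecation warnings above all point at the same fix: move these settings into the KubeletConfiguration file passed via --config (reported further down in the FLAG dump as /etc/kubernetes/kubelet.conf). As a rough, illustrative sketch only, not the file actually deployed on this node, the flagged settings would map to config fields along these lines, reusing the values from the FLAG dump later in this log; --minimum-container-ttl-duration and --pod-infra-container-image have no one-to-one field, since eviction settings and the CRI-provided sandbox image replace them.

    # Illustrative KubeletConfiguration sketch (not the node's real /etc/kubernetes/kubelet.conf)
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # replaces --container-runtime-endpoint=/var/run/crio/crio.sock
    containerRuntimeEndpoint: /var/run/crio/crio.sock
    # replaces --volume-plugin-dir
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    # replaces --system-reserved=cpu=500m,ephemeral-storage=1Gi,memory=1Gi
    systemReserved:
      cpu: 500m
      memory: 1Gi
      ephemeral-storage: 1Gi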
Apr 20 16:23:17.176956 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.176878 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 20 16:23:17.181675 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181652 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 16:23:17.181675 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181672 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 16:23:17.181675 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181676 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 16:23:17.181675 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181679 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 16:23:17.181675 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181682 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 16:23:17.181675 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181685 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181688 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181691 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181693 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181696 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181699 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181702 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181704 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181707 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181709 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181712 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181714 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181717 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181720 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181722 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181725 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 16:23:17.181904 ip-10-0-130-72 
kubenswrapper[2571]: W0420 16:23:17.181728 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181730 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181733 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181735 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 16:23:17.181904 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181738 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181748 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181750 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181753 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181755 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181758 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181760 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181763 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181765 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181768 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181770 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181773 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181775 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181778 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181783 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181787 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181791 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181793 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181796 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 16:23:17.182396 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181799 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181802 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181805 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181808 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181810 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181813 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181815 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181818 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181822 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181824 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181827 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181829 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181832 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181834 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181837 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181839 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181842 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181844 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181847 2571 feature_gate.go:328] 
unrecognized feature gate: NetworkSegmentation Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181849 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 16:23:17.182844 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181853 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181858 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181861 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181864 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181866 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181869 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181872 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181875 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181878 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181883 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181885 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181888 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181890 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181893 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181896 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181899 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181901 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181904 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181906 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181909 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 16:23:17.183322 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181912 2571 feature_gate.go:328] unrecognized feature gate: 
VSphereHostVMGroupZonal Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.181914 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182335 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182341 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182345 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182347 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182350 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182353 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182356 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182358 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182361 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182363 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182366 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182368 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182371 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182373 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182376 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182378 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182381 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182384 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 16:23:17.183882 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182387 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182390 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182392 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 
16:23:17.182395 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182398 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182400 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182406 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182409 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182412 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182415 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182419 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182422 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182425 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182429 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182432 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182435 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182439 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182442 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 16:23:17.184384 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182445 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182448 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182451 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182453 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182456 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182458 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182461 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS 
Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182463 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182466 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182468 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182470 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182473 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182475 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182478 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182481 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182484 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182486 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182489 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182491 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182494 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 16:23:17.184869 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182496 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182499 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182501 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182504 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182507 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182509 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182511 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182514 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182517 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182519 2571 
feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182522 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182524 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182527 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182529 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182531 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182534 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182536 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182539 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182541 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182545 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 16:23:17.185371 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182547 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182549 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182552 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182554 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182557 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182560 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182562 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182565 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182568 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.182570 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182644 2571 flags.go:64] FLAG: --address="0.0.0.0" Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182652 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182658 2571 flags.go:64] FLAG: --anonymous-auth="true" Apr 20 
16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182663 2571 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182668 2571 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182671 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182675 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182679 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182692 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182696 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182700 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182703 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 20 16:23:17.185871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182707 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182710 2571 flags.go:64] FLAG: --cgroup-root="" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182712 2571 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182716 2571 flags.go:64] FLAG: --client-ca-file="" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182719 2571 flags.go:64] FLAG: --cloud-config="" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182721 2571 flags.go:64] FLAG: --cloud-provider="external" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182724 2571 flags.go:64] FLAG: --cluster-dns="[]" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182729 2571 flags.go:64] FLAG: --cluster-domain="" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182731 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182735 2571 flags.go:64] FLAG: --config-dir="" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182738 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182741 2571 flags.go:64] FLAG: --container-log-max-files="5" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182745 2571 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182748 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182751 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182754 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 20 16:23:17.186417 ip-10-0-130-72 
kubenswrapper[2571]: I0420 16:23:17.182757 2571 flags.go:64] FLAG: --contention-profiling="false" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182760 2571 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182763 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182767 2571 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182769 2571 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182774 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182777 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182780 2571 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182782 2571 flags.go:64] FLAG: --enable-load-reader="false" Apr 20 16:23:17.186417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182786 2571 flags.go:64] FLAG: --enable-server="true" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182789 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182792 2571 flags.go:64] FLAG: --event-burst="100" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182795 2571 flags.go:64] FLAG: --event-qps="50" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182799 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182802 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182805 2571 flags.go:64] FLAG: --eviction-hard="" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182810 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182813 2571 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182815 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182818 2571 flags.go:64] FLAG: --eviction-soft="" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182821 2571 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182824 2571 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182827 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182830 2571 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182833 2571 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182836 2571 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: 
I0420 16:23:17.182839 2571 flags.go:64] FLAG: --feature-gates="" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182843 2571 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182846 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182849 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182852 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182855 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182858 2571 flags.go:64] FLAG: --help="false" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182861 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-130-72.ec2.internal" Apr 20 16:23:17.187011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182864 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182867 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182870 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182873 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182876 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182879 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182882 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182885 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182888 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182891 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182894 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182901 2571 flags.go:64] FLAG: --kube-reserved="" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182904 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182906 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182911 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182914 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182917 2571 flags.go:64] FLAG: --lock-file="" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182920 2571 
flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182922 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182925 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182931 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182934 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182936 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 16:23:17.187632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182939 2571 flags.go:64] FLAG: --logging-format="text" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182942 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182945 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182948 2571 flags.go:64] FLAG: --manifest-url="" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182951 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182955 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182958 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182962 2571 flags.go:64] FLAG: --max-pods="110" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182965 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182968 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182970 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182973 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182976 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182979 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182982 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182989 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182992 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182995 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.182998 2571 flags.go:64] FLAG: --pod-cidr="" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183001 2571 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183006 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183010 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183013 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183017 2571 flags.go:64] FLAG: --port="10250" Apr 20 16:23:17.188202 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183020 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183022 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a53fbe2a61f8c9e3" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183025 2571 flags.go:64] FLAG: --qos-reserved="" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183028 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183031 2571 flags.go:64] FLAG: --register-node="true" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183034 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183037 2571 flags.go:64] FLAG: --register-with-taints="" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183040 2571 flags.go:64] FLAG: --registry-burst="10" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183043 2571 flags.go:64] FLAG: --registry-qps="5" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183046 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183049 2571 flags.go:64] FLAG: --reserved-memory="" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183052 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183055 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183058 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183061 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183064 2571 flags.go:64] FLAG: --runonce="false" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183067 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183070 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183072 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183075 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183078 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183081 2571 
flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183084 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183087 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183090 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183092 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 16:23:17.188770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183095 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183098 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183101 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183105 2571 flags.go:64] FLAG: --system-cgroups="" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183107 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183114 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183116 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183119 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183123 2571 flags.go:64] FLAG: --tls-min-version="" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183126 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183128 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183131 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183134 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183137 2571 flags.go:64] FLAG: --v="2" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183141 2571 flags.go:64] FLAG: --version="false" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183145 2571 flags.go:64] FLAG: --vmodule="" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183149 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183152 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183260 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183264 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183267 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 16:23:17.189418 ip-10-0-130-72 
kubenswrapper[2571]: W0420 16:23:17.183270 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183273 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 16:23:17.189418 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183276 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183278 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183282 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183284 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183287 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183289 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183292 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183294 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183297 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183299 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183302 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183305 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183308 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183311 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183314 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183317 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183319 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183323 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183326 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 16:23:17.189968 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183329 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183332 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183334 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183337 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183339 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183341 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183344 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183346 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183349 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183351 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183353 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183356 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183359 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183361 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183365 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183368 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183370 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183373 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183376 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183378 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 16:23:17.190471 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183381 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183383 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183386 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183388 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183390 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183394 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183397 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183401 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183403 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183406 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183408 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183411 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183413 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183416 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183418 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183421 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183423 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183425 2571 feature_gate.go:328] 
unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183428 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183430 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 16:23:17.190955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183433 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183435 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183438 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183441 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183443 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183446 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183448 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183450 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183453 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183456 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183458 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183460 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183463 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183465 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183468 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183470 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183473 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183477 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183481 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 16:23:17.191466 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183485 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 16:23:17.191466 ip-10-0-130-72 
kubenswrapper[2571]: W0420 16:23:17.183487 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 16:23:17.191945 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.183490 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 16:23:17.191945 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.183495 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 16:23:17.191945 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.191919 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 16:23:17.191945 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.191938 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.191987 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.191992 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.191996 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.191999 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192002 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192005 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192007 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192010 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192013 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192015 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192018 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192021 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192023 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192026 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192028 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192031 2571 
feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192034 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192036 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192039 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 16:23:17.192049 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192042 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192044 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192047 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192049 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192052 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192055 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192058 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192061 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192064 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192066 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192069 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192071 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192074 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192076 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192078 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192081 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192084 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192086 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192105 2571 feature_gate.go:328] unrecognized feature gate: 
OVNObservability Apr 20 16:23:17.192551 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192109 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192112 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192115 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192117 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192120 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192122 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192125 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192127 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192129 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192132 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192134 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192137 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192141 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192145 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192147 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192151 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192153 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192156 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192159 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192179 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 16:23:17.193039 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192183 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192186 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192188 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192191 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192193 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192196 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192198 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192201 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192203 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192205 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192208 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192211 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192213 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192216 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192220 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192223 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192226 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192228 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192231 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 16:23:17.193545 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192234 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 16:23:17.194012 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192237 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 16:23:17.194012 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192240 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 16:23:17.194012 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192242 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 16:23:17.194012 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192245 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 16:23:17.194012 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192247 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 16:23:17.194012 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192250 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 16:23:17.194012 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192252 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 16:23:17.194012 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192255 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 16:23:17.194012 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.192260 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 16:23:17.194012 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192359 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 16:23:17.194012 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192364 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 16:23:17.194012 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192367 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 16:23:17.194012 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192370 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 16:23:17.194012 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192373 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 16:23:17.194012 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192376 2571 feature_gate.go:328] unrecognized 
feature gate: AWSClusterHostedDNSInstall Apr 20 16:23:17.194012 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192378 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192381 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192383 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192385 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192388 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192390 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192392 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192395 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192398 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192401 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192403 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192406 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192408 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192411 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192413 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192416 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192418 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192420 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192423 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192425 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 16:23:17.194509 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192428 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192430 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 16:23:17.194985 ip-10-0-130-72 
kubenswrapper[2571]: W0420 16:23:17.192433 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192436 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192438 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192440 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192443 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192445 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192448 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192450 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192453 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192456 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192458 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192460 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192463 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192465 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192467 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192470 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192472 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 16:23:17.194985 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192475 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192477 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192480 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192482 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192484 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192487 2571 
feature_gate.go:328] unrecognized feature gate: Example Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192489 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192492 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192494 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192497 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192499 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192501 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192504 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192506 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192509 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192511 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192513 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192517 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192520 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192523 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 16:23:17.195459 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192526 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192528 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192530 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192533 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192535 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192541 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192544 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192546 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192548 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192552 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192554 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192557 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192559 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192562 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192565 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192567 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192571 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192574 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192577 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 16:23:17.195939 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192579 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 16:23:17.196403 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:17.192582 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 16:23:17.196403 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.192587 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 16:23:17.196403 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.193370 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 16:23:17.196403 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.196379 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 16:23:17.197335 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.197322 2571 server.go:1019] "Starting client certificate rotation"
Apr 20 16:23:17.197437 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.197420 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 16:23:17.197476 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.197466 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 16:23:17.220213 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.220192 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 16:23:17.222620 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.222602 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 16:23:17.235085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.235059 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 20 16:23:17.241223 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.241206 2571 log.go:25] "Validated CRI v1 image API"
Apr 20 16:23:17.246360 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.246343 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 16:23:17.251505 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.251486 2571 fs.go:135] Filesystem UUIDs: map[662b4eb3-9a45-46a1-adfb-933f2a7c1a34:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 82f7d5de-85b1-4fbe-814a-bb9d87ae88f7:/dev/nvme0n1p3]
Apr 20 16:23:17.251553 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.251507 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 16:23:17.254176 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.254147 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 16:23:17.257357 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.257250 2571 manager.go:217] Machine: {Timestamp:2026-04-20 16:23:17.255301656 +0000 UTC m=+0.375478167 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099771 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f971deb8d7a0506079540ff9d0382 SystemUUID:ec2f971d-eb8d-7a05-0607-9540ff9d0382 BootID:1508bbd3-59d5-4aa2-9ac6-b1646add34a9 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:73:7a:35:0c:87 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:73:7a:35:0c:87 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:d6:db:3d:5f:f8:f2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 16:23:17.257357 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.257353 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 16:23:17.257469 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.257457 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 16:23:17.258367 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.258342 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 16:23:17.258512 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.258370 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-72.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 16:23:17.258555 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.258522 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 16:23:17.258555 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.258530 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 16:23:17.258555 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.258542 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 16:23:17.259671 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.259661 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 16:23:17.260759 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.260750 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 16:23:17.260869 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.260860 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 16:23:17.263881 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.263872 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 16:23:17.263914 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.263890 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 16:23:17.263914 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.263906 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 16:23:17.263963 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.263918 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 20 16:23:17.263963 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.263930 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 16:23:17.264949 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.264938 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 16:23:17.264997 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.264955 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 16:23:17.267925 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.267890 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-r8h9p"
Apr 20 16:23:17.268019 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.267930 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 16:23:17.269488 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.269473 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 16:23:17.271467 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.271451 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 16:23:17.271467 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.271469 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 16:23:17.271669 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.271475 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 16:23:17.271669 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.271481 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 16:23:17.271669 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.271486 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 16:23:17.271669 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.271492 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 16:23:17.271669 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.271498 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 16:23:17.271669 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.271504 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 16:23:17.271669 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.271511 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 16:23:17.271669 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.271516 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 16:23:17.271669 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.271524 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 16:23:17.271669 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.271533 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 16:23:17.272021 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.271771 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 16:23:17.272021 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.271798 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-72.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 16:23:17.272370 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.272355 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 16:23:17.272432 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.272375 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 16:23:17.274418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.274400 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-72.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 16:23:17.276000 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.275986 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 16:23:17.276084 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.276029 2571 server.go:1295] "Started kubelet"
Apr 20 16:23:17.276134 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.276085 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 16:23:17.276237 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.276192 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 16:23:17.276289 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.276257 2571 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 16:23:17.276698 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.276565 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-r8h9p"
Apr 20 16:23:17.276880 ip-10-0-130-72 systemd[1]: Started Kubernetes Kubelet.
Apr 20 16:23:17.277601 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.277572 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 16:23:17.282407 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.282388 2571 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 16:23:17.287332 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.287311 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 16:23:17.287429 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.287347 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 16:23:17.287858 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.287834 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 16:23:17.287942 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.287900 2571 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 16:23:17.287942 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.287919 2571 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 16:23:17.288083 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.288066 2571 factory.go:55] Registering systemd factory
Apr 20 16:23:17.288139 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.288075 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-72.ec2.internal\" not found"
Apr 20 16:23:17.288139 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.288053 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 16:23:17.288139 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.288094 2571 factory.go:223] Registration of the systemd container factory successfully
Apr 20 16:23:17.288139 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.288138 2571 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 16:23:17.288343 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.288147 2571 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 16:23:17.288479 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.288465 2571 factory.go:153] Registering CRI-O factory
Apr 20 16:23:17.288546 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.288483 2571 factory.go:223] Registration of the crio container factory successfully
Apr 20 16:23:17.288596 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.288562 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 16:23:17.288596 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.288588 2571 factory.go:103] Registering Raw factory
Apr 20 16:23:17.288687 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.288602 2571 manager.go:1196] Started watching for new ooms in manager
Apr 20 16:23:17.289019 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.289004 2571 manager.go:319] Starting recovery of all containers
Apr 20 16:23:17.295476 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.295455 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 16:23:17.296689 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.296658 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 16:23:17.298456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.298439 2571 manager.go:324] Recovery completed
Apr 20 16:23:17.298628 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.298578 2571 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-72.ec2.internal\" not found" node="ip-10-0-130-72.ec2.internal"
Apr 20 16:23:17.303803 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.303788 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 16:23:17.306308 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.306294 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 16:23:17.306372 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.306321 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 16:23:17.306372 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.306330 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasSufficientPID"
Apr 20 16:23:17.306823 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.306811 2571 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 16:23:17.306888 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.306824 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 16:23:17.306888 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.306842 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 16:23:17.309552 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.309538 2571 policy_none.go:49] "None policy: Start"
Apr 20 16:23:17.309628 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.309556 2571 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 16:23:17.309628 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.309570 2571 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 16:23:17.346749 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.346733 2571 manager.go:341] "Starting Device Plugin manager"
Apr 20 16:23:17.359470 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.346764 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 16:23:17.359470 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.346776 2571 server.go:85] "Starting device plugin registration server"
Apr 20 16:23:17.359470 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.347021 2571 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 16:23:17.359470 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.347032 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 16:23:17.359470 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.347133 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 16:23:17.359470 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.347216 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 16:23:17.359470 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.347223 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 16:23:17.359470 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.347776 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 16:23:17.359470 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.347809 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-72.ec2.internal\" not found"
Apr 20 16:23:17.417359 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.417334 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 16:23:17.417500 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.417371 2571 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 16:23:17.417500 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.417393 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 16:23:17.417500 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.417401 2571 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 16:23:17.417500 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.417439 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 16:23:17.419671 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.419650 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 16:23:17.447630 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.447569 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 16:23:17.449490 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.449474 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 16:23:17.449561 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.449506 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 16:23:17.449561 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.449522 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasSufficientPID"
Apr 20 16:23:17.449561 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.449550 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-72.ec2.internal"
Apr 20 16:23:17.457203 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.457158 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-72.ec2.internal"
Apr 20 16:23:17.457249 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.457209 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-72.ec2.internal\": node \"ip-10-0-130-72.ec2.internal\" not found"
Apr 20 16:23:17.468290 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.468271 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-72.ec2.internal\" not found"
Apr 20 16:23:17.518289 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.518264 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-72.ec2.internal"]
Apr 20 16:23:17.518368 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.518351 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 16:23:17.519279 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.519261 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 16:23:17.519348 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.519292 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 16:23:17.519348 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.519302 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasSufficientPID"
Apr 20 16:23:17.520479 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.520465 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 16:23:17.520618 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.520604 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal"
Apr 20 16:23:17.520668 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.520634 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 16:23:17.521594 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.521579 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 16:23:17.521674 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.521608 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 16:23:17.521674 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.521633 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasSufficientPID"
Apr 20 16:23:17.521674 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.521578 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 16:23:17.521792 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.521689 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 16:23:17.521792 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.521703 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasSufficientPID"
Apr 20 16:23:17.522852 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.522838 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-72.ec2.internal"
Apr 20 16:23:17.522914 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.522862 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 16:23:17.523508 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.523489 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 16:23:17.523612 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.523514 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 16:23:17.523612 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.523530 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeHasSufficientPID"
Apr 20 16:23:17.546854 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.546831 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-72.ec2.internal\" not found" node="ip-10-0-130-72.ec2.internal"
Apr 20 16:23:17.551455 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.551438 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-72.ec2.internal\" not found" node="ip-10-0-130-72.ec2.internal"
Apr 20 16:23:17.568567 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.568547 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-72.ec2.internal\" not found"
Apr 20 16:23:17.589332 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.589300 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fed4ba1c9c7c2657cac867280ba4f485-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal\" (UID: \"fed4ba1c9c7c2657cac867280ba4f485\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal"
Apr 20 16:23:17.668783 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.668754 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-72.ec2.internal\" not found"
Apr 20 16:23:17.690190 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.690152 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fed4ba1c9c7c2657cac867280ba4f485-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal\" (UID: \"fed4ba1c9c7c2657cac867280ba4f485\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal"
Apr 20 16:23:17.690263 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.690199 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fed4ba1c9c7c2657cac867280ba4f485-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal\" (UID: \"fed4ba1c9c7c2657cac867280ba4f485\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal"
Apr 20 16:23:17.690263 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.690218 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName:
\"kubernetes.io/host-path/34da2f3dcf91c4f795ad835e4ed72c8c-config\") pod \"kube-apiserver-proxy-ip-10-0-130-72.ec2.internal\" (UID: \"34da2f3dcf91c4f795ad835e4ed72c8c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-72.ec2.internal" Apr 20 16:23:17.690333 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.690261 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fed4ba1c9c7c2657cac867280ba4f485-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal\" (UID: \"fed4ba1c9c7c2657cac867280ba4f485\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal" Apr 20 16:23:17.769577 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.769513 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-72.ec2.internal\" not found" Apr 20 16:23:17.790872 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.790850 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fed4ba1c9c7c2657cac867280ba4f485-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal\" (UID: \"fed4ba1c9c7c2657cac867280ba4f485\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal" Apr 20 16:23:17.790943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.790879 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/34da2f3dcf91c4f795ad835e4ed72c8c-config\") pod \"kube-apiserver-proxy-ip-10-0-130-72.ec2.internal\" (UID: \"34da2f3dcf91c4f795ad835e4ed72c8c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-72.ec2.internal" Apr 20 16:23:17.790943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.790916 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/34da2f3dcf91c4f795ad835e4ed72c8c-config\") pod \"kube-apiserver-proxy-ip-10-0-130-72.ec2.internal\" (UID: \"34da2f3dcf91c4f795ad835e4ed72c8c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-72.ec2.internal" Apr 20 16:23:17.791036 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.790971 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fed4ba1c9c7c2657cac867280ba4f485-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal\" (UID: \"fed4ba1c9c7c2657cac867280ba4f485\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal" Apr 20 16:23:17.851041 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.851007 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal" Apr 20 16:23:17.854926 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:17.854908 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-72.ec2.internal" Apr 20 16:23:17.869987 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.869965 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-72.ec2.internal\" not found" Apr 20 16:23:17.970578 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:17.970538 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-72.ec2.internal\" not found" Apr 20 16:23:18.071152 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:18.071054 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-72.ec2.internal\" not found" Apr 20 16:23:18.158032 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.158002 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 16:23:18.188355 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.188323 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-72.ec2.internal" Apr 20 16:23:18.196000 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.195973 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 16:23:18.197003 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.196988 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 16:23:18.197148 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.197119 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 16:23:18.197283 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.197181 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 16:23:18.197283 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.197183 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 16:23:18.198303 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.198287 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal" Apr 20 16:23:18.215635 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.215613 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 16:23:18.264979 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.264940 2571 apiserver.go:52] "Watching apiserver" Apr 20 16:23:18.272248 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.272224 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 16:23:18.274181 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.274134 2571 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-nhdcv","kube-system/kube-apiserver-proxy-ip-10-0-130-72.ec2.internal","openshift-cluster-node-tuning-operator/tuned-jk2dm","openshift-multus/multus-9cgxf","openshift-network-operator/iptables-alerter-k2rfm","openshift-ovn-kubernetes/ovnkube-node-45msc","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr","openshift-dns/node-resolver-hkvft","openshift-image-registry/node-ca-cklt7","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal","openshift-multus/multus-additional-cni-plugins-jl5fs","openshift-multus/network-metrics-daemon-rxwd9","openshift-network-diagnostics/network-check-target-t7sxd"] Apr 20 16:23:18.275638 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.275620 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.276655 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.276633 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.278187 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.278153 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 16:23:18.278346 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.278331 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 16:23:18.278442 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.278428 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 16:23:18.278687 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.278673 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.278746 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.278732 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-k2rfm" Apr 20 16:23:18.279284 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.279265 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 16:23:18.279345 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.279329 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-dbrc9\"" Apr 20 16:23:18.279452 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.279436 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vgjrd\"" Apr 20 16:23:18.279553 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.279539 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 16:23:18.279989 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.279972 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.283060 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.281751 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-q9l2q\"" Apr 20 16:23:18.283060 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.281855 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 16:18:17 +0000 UTC" deadline="2028-01-07 22:02:20.348917566 +0000 UTC" Apr 20 16:23:18.283060 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.281886 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15053h39m2.067035527s" Apr 20 16:23:18.283060 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.282047 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 16:23:18.283060 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.282100 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-r7p46\"" Apr 20 16:23:18.283060 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.282212 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 16:23:18.283060 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.282295 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 16:23:18.283060 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.282107 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 16:23:18.283060 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.282369 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 16:23:18.283060 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.282459 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 16:23:18.283060 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.283030 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 16:23:18.283630 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.283040 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 16:23:18.283903 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.283091 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-9ln2x\"" Apr 20 16:23:18.284010 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.283359 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 16:23:18.284010 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.283993 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 16:23:18.284154 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.283516 2571 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 16:23:18.284154 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.283518 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 16:23:18.284154 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.284069 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 16:23:18.284953 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.284927 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nhdcv" Apr 20 16:23:18.285043 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.285029 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hkvft" Apr 20 16:23:18.286804 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.286787 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cklt7" Apr 20 16:23:18.287394 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.287363 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 16:23:18.287394 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.287374 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 16:23:18.287394 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.287388 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-n4wl7\"" Apr 20 16:23:18.287559 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.287430 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4bn2x\"" Apr 20 16:23:18.287559 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.287465 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 16:23:18.287559 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.287388 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 16:23:18.287660 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.287464 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 16:23:18.288583 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.288563 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.289957 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.289939 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:18.290037 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:18.290004 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:23:18.290300 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.290284 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 16:23:18.290300 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.290295 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 16:23:18.290431 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.290288 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 16:23:18.290529 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.290515 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7czc8\"" Apr 20 16:23:18.291010 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.290983 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 16:23:18.291084 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.291035 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vkvxh\"" Apr 20 16:23:18.291129 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.291094 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 16:23:18.291487 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.291322 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:18.291487 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:18.291376 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t7sxd" podUID="6e7ab3fb-b773-40fb-8dc4-b848f28093cb" Apr 20 16:23:18.292940 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.292920 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-hostroot\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.293042 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.292945 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-slash\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.293042 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.292959 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-run-openvswitch\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.293042 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.292974 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/975fe906-de1d-4b78-9555-abc5fd12991c-cni-binary-copy\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.293042 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.292994 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/975fe906-de1d-4b78-9555-abc5fd12991c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.293261 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293049 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-sysctl-d\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.293261 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293087 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-tuned\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.293261 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293111 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-cnibin\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.293261 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293127 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a0a53203-f6d4-43f0-a422-5ae876b369f1-ovnkube-config\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.293261 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293143 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssndj\" (UniqueName: \"kubernetes.io/projected/2e64cb9b-1c5d-4de7-9ce4-b673e8576d87-kube-api-access-ssndj\") pod \"node-resolver-hkvft\" (UID: \"2e64cb9b-1c5d-4de7-9ce4-b673e8576d87\") " pod="openshift-dns/node-resolver-hkvft" Apr 20 16:23:18.293261 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293182 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/975fe906-de1d-4b78-9555-abc5fd12991c-system-cni-dir\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.293261 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293204 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-etc-selinux\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.293261 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293218 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-var-lib-openvswitch\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.293261 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293257 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-node-log\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.293572 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293283 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0a53203-f6d4-43f0-a422-5ae876b369f1-env-overrides\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.293572 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293300 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0a53203-f6d4-43f0-a422-5ae876b369f1-ovn-node-metrics-cert\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.293572 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293321 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-etc-kubernetes\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.293572 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293345 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9b65\" (UniqueName: \"kubernetes.io/projected/a0a53203-f6d4-43f0-a422-5ae876b369f1-kube-api-access-r9b65\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.293572 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293364 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-sysctl-conf\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.293572 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293377 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-run\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.293572 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293390 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b1f64d16-8a19-4426-9f62-eaf3e9325026-multus-daemon-config\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.293572 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293409 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e64cb9b-1c5d-4de7-9ce4-b673e8576d87-tmp-dir\") pod \"node-resolver-hkvft\" (UID: \"2e64cb9b-1c5d-4de7-9ce4-b673e8576d87\") " pod="openshift-dns/node-resolver-hkvft" Apr 20 16:23:18.293572 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293433 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-modprobe-d\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.293572 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293468 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-etc-openvswitch\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.293572 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293491 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-cni-netd\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.293572 
ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293509 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2e64cb9b-1c5d-4de7-9ce4-b673e8576d87-hosts-file\") pod \"node-resolver-hkvft\" (UID: \"2e64cb9b-1c5d-4de7-9ce4-b673e8576d87\") " pod="openshift-dns/node-resolver-hkvft" Apr 20 16:23:18.293572 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293523 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/975fe906-de1d-4b78-9555-abc5fd12991c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.293572 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293538 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-sys-fs\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.293572 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293577 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-systemd\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.294005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293594 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-multus-socket-dir-parent\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.294005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293613 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/842542e9-94b5-494f-8110-018afb1c0a5f-host\") pod \"node-ca-cklt7\" (UID: \"842542e9-94b5-494f-8110-018afb1c0a5f\") " pod="openshift-image-registry/node-ca-cklt7" Apr 20 16:23:18.294005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293645 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/842542e9-94b5-494f-8110-018afb1c0a5f-serviceca\") pod \"node-ca-cklt7\" (UID: \"842542e9-94b5-494f-8110-018afb1c0a5f\") " pod="openshift-image-registry/node-ca-cklt7" Apr 20 16:23:18.294005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293660 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-kubelet\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.294005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293682 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/b1f64d16-8a19-4426-9f62-eaf3e9325026-cni-binary-copy\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.294005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293696 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77flx\" (UniqueName: \"kubernetes.io/projected/b1f64d16-8a19-4426-9f62-eaf3e9325026-kube-api-access-77flx\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.294005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293711 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/975fe906-de1d-4b78-9555-abc5fd12991c-os-release\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.294005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293727 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-registration-dir\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.294005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293752 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-var-lib-kubelet\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.294005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293768 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bb32dc66-328e-4d49-979d-786a949e2c75-iptables-alerter-script\") pod \"iptables-alerter-k2rfm\" (UID: \"bb32dc66-328e-4d49-979d-786a949e2c75\") " pod="openshift-network-operator/iptables-alerter-k2rfm" Apr 20 16:23:18.294005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293783 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-os-release\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.294005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293807 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-run-k8s-cni-cncf-io\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.294005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293827 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-run-multus-certs\") pod \"multus-9cgxf\" (UID: 
\"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.294005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293894 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-run-netns\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.294005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293914 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a0a53203-f6d4-43f0-a422-5ae876b369f1-ovnkube-script-lib\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.294005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293928 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-kubernetes\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.294518 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293951 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/de954181-20e6-42cb-ac40-d96f0331e7a1-konnectivity-ca\") pod \"konnectivity-agent-nhdcv\" (UID: \"de954181-20e6-42cb-ac40-d96f0331e7a1\") " pod="kube-system/konnectivity-agent-nhdcv" Apr 20 16:23:18.294518 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.293996 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/975fe906-de1d-4b78-9555-abc5fd12991c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.294518 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294067 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-device-dir\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.294518 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294096 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-sys\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.294518 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294121 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-host\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.294518 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294144 
2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74ctf\" (UniqueName: \"kubernetes.io/projected/1fa78497-69b5-4855-bf47-cfc3a545a594-kube-api-access-74ctf\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.294518 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294193 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-run-netns\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.294518 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294217 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-var-lib-cni-bin\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.294518 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294240 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-run-systemd\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.294518 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294255 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42dvx\" (UniqueName: \"kubernetes.io/projected/842542e9-94b5-494f-8110-018afb1c0a5f-kube-api-access-42dvx\") pod \"node-ca-cklt7\" (UID: \"842542e9-94b5-494f-8110-018afb1c0a5f\") " pod="openshift-image-registry/node-ca-cklt7" Apr 20 16:23:18.294518 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294274 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1fa78497-69b5-4855-bf47-cfc3a545a594-tmp\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.294518 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294307 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bb32dc66-328e-4d49-979d-786a949e2c75-host-slash\") pod \"iptables-alerter-k2rfm\" (UID: \"bb32dc66-328e-4d49-979d-786a949e2c75\") " pod="openshift-network-operator/iptables-alerter-k2rfm" Apr 20 16:23:18.294518 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294322 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-multus-conf-dir\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.294518 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294337 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dlcs\" (UniqueName: \"kubernetes.io/projected/e8e73c9b-2851-47c9-a72f-36ab0b948444-kube-api-access-4dlcs\") pod 
\"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.294518 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294351 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-sysconfig\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.294518 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294376 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-multus-cni-dir\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.294982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294397 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-log-socket\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.294982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294414 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.294982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294445 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk7p6\" (UniqueName: \"kubernetes.io/projected/975fe906-de1d-4b78-9555-abc5fd12991c-kube-api-access-pk7p6\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.294982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294471 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/de954181-20e6-42cb-ac40-d96f0331e7a1-agent-certs\") pod \"konnectivity-agent-nhdcv\" (UID: \"de954181-20e6-42cb-ac40-d96f0331e7a1\") " pod="kube-system/konnectivity-agent-nhdcv" Apr 20 16:23:18.294982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294484 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.294982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294498 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmws8\" (UniqueName: \"kubernetes.io/projected/bb32dc66-328e-4d49-979d-786a949e2c75-kube-api-access-wmws8\") pod \"iptables-alerter-k2rfm\" (UID: 
\"bb32dc66-328e-4d49-979d-786a949e2c75\") " pod="openshift-network-operator/iptables-alerter-k2rfm" Apr 20 16:23:18.294982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294515 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-systemd-units\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.294982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294535 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-cni-bin\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.294982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294550 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-run-ovn\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.294982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294565 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-socket-dir\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.294982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294579 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-lib-modules\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.294982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294592 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-system-cni-dir\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.294982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294617 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-var-lib-cni-multus\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.294982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294632 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-run-ovn-kubernetes\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.294982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294647 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/975fe906-de1d-4b78-9555-abc5fd12991c-cnibin\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.294982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.294660 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-var-lib-kubelet\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.302021 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.302000 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 16:23:18.321892 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.321846 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-79b4p" Apr 20 16:23:18.329391 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.329370 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-79b4p" Apr 20 16:23:18.354308 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:18.354278 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfed4ba1c9c7c2657cac867280ba4f485.slice/crio-1dcb9c2b1c702e82f7973cd5eb9d6397ca3a82b0a128219db2762566a191d4c3 WatchSource:0}: Error finding container 1dcb9c2b1c702e82f7973cd5eb9d6397ca3a82b0a128219db2762566a191d4c3: Status 404 returned error can't find the container with id 1dcb9c2b1c702e82f7973cd5eb9d6397ca3a82b0a128219db2762566a191d4c3 Apr 20 16:23:18.354664 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:18.354648 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34da2f3dcf91c4f795ad835e4ed72c8c.slice/crio-416b2fd42d2f5602b600dfde4fc4a656657a2fd1ba6aa01b7f7b737a2baed574 WatchSource:0}: Error finding container 416b2fd42d2f5602b600dfde4fc4a656657a2fd1ba6aa01b7f7b737a2baed574: Status 404 returned error can't find the container with id 416b2fd42d2f5602b600dfde4fc4a656657a2fd1ba6aa01b7f7b737a2baed574 Apr 20 16:23:18.361799 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.361775 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 16:23:18.389410 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.389395 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 16:23:18.395676 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.395657 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-run-netns\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.395737 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.395684 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/a0a53203-f6d4-43f0-a422-5ae876b369f1-ovnkube-script-lib\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.395737 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.395700 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-kubernetes\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.395737 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.395714 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/de954181-20e6-42cb-ac40-d96f0331e7a1-konnectivity-ca\") pod \"konnectivity-agent-nhdcv\" (UID: \"de954181-20e6-42cb-ac40-d96f0331e7a1\") " pod="kube-system/konnectivity-agent-nhdcv" Apr 20 16:23:18.395869 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.395757 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/975fe906-de1d-4b78-9555-abc5fd12991c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.395869 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.395783 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-run-netns\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.395869 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.395815 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-kubernetes\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.395869 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.395834 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs\") pod \"network-metrics-daemon-rxwd9\" (UID: \"ff512ace-f73c-4265-890e-b43c9ecc782d\") " pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:18.395869 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.395864 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-device-dir\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.396085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.395887 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-sys\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.396085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.395909 
2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-host\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.396085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.395935 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74ctf\" (UniqueName: \"kubernetes.io/projected/1fa78497-69b5-4855-bf47-cfc3a545a594-kube-api-access-74ctf\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.396085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.395958 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-run-netns\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.396085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.395960 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-sys\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.396085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.395981 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-device-dir\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.396085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.395983 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-host\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.396085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.395985 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-var-lib-cni-bin\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.396085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396023 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-var-lib-cni-bin\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.396085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396029 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-run-systemd\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.396085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396048 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-run-netns\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.396085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396056 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42dvx\" (UniqueName: \"kubernetes.io/projected/842542e9-94b5-494f-8110-018afb1c0a5f-kube-api-access-42dvx\") pod \"node-ca-cklt7\" (UID: \"842542e9-94b5-494f-8110-018afb1c0a5f\") " pod="openshift-image-registry/node-ca-cklt7" Apr 20 16:23:18.396085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396081 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1fa78497-69b5-4855-bf47-cfc3a545a594-tmp\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.396085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396089 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-run-systemd\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.396642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396105 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bb32dc66-328e-4d49-979d-786a949e2c75-host-slash\") pod \"iptables-alerter-k2rfm\" (UID: \"bb32dc66-328e-4d49-979d-786a949e2c75\") " pod="openshift-network-operator/iptables-alerter-k2rfm" Apr 20 16:23:18.396642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396129 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-multus-conf-dir\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.396642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396158 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv65f\" (UniqueName: \"kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f\") pod \"network-check-target-t7sxd\" (UID: \"6e7ab3fb-b773-40fb-8dc4-b848f28093cb\") " pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:18.396642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396207 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dlcs\" (UniqueName: \"kubernetes.io/projected/e8e73c9b-2851-47c9-a72f-36ab0b948444-kube-api-access-4dlcs\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.396642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396232 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-sysconfig\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.396642 
ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396277 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-multus-conf-dir\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.396642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396273 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-multus-cni-dir\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.396642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396308 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/de954181-20e6-42cb-ac40-d96f0331e7a1-konnectivity-ca\") pod \"konnectivity-agent-nhdcv\" (UID: \"de954181-20e6-42cb-ac40-d96f0331e7a1\") " pod="kube-system/konnectivity-agent-nhdcv" Apr 20 16:23:18.396642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396327 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-log-socket\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.396642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396327 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bb32dc66-328e-4d49-979d-786a949e2c75-host-slash\") pod \"iptables-alerter-k2rfm\" (UID: \"bb32dc66-328e-4d49-979d-786a949e2c75\") " pod="openshift-network-operator/iptables-alerter-k2rfm" Apr 20 16:23:18.396642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396326 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a0a53203-f6d4-43f0-a422-5ae876b369f1-ovnkube-script-lib\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.396642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396354 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.396642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396365 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 16:23:18.396642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396405 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/975fe906-de1d-4b78-9555-abc5fd12991c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.396642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396447 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-log-socket\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.396642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396453 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk7p6\" (UniqueName: \"kubernetes.io/projected/975fe906-de1d-4b78-9555-abc5fd12991c-kube-api-access-pk7p6\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.396642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396468 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-multus-cni-dir\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.397335 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396471 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.397335 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396489 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-sysconfig\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.397335 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396526 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/de954181-20e6-42cb-ac40-d96f0331e7a1-agent-certs\") pod \"konnectivity-agent-nhdcv\" (UID: \"de954181-20e6-42cb-ac40-d96f0331e7a1\") " pod="kube-system/konnectivity-agent-nhdcv" Apr 20 16:23:18.397335 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396565 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.397335 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396607 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wmws8\" (UniqueName: \"kubernetes.io/projected/bb32dc66-328e-4d49-979d-786a949e2c75-kube-api-access-wmws8\") pod \"iptables-alerter-k2rfm\" (UID: \"bb32dc66-328e-4d49-979d-786a949e2c75\") " pod="openshift-network-operator/iptables-alerter-k2rfm" Apr 20 16:23:18.397335 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396620 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.397335 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396651 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-systemd-units\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.397335 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396677 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-cni-bin\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.397335 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396705 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-run-ovn\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.397335 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396717 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-systemd-units\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.397335 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396729 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-socket-dir\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.397335 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396753 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-lib-modules\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.397335 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396764 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-run-ovn\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.397335 ip-10-0-130-72 
kubenswrapper[2571]: I0420 16:23:18.396767 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-cni-bin\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.397335 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396769 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-system-cni-dir\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.397335 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396811 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-system-cni-dir\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.397335 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396821 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-var-lib-cni-multus\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.398066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396849 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-socket-dir\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.398066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396869 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-run-ovn-kubernetes\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.398066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396892 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-var-lib-cni-multus\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.398066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396898 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/975fe906-de1d-4b78-9555-abc5fd12991c-cnibin\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.398066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396925 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/975fe906-de1d-4b78-9555-abc5fd12991c-cnibin\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 
20 16:23:18.398066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396928 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-run-ovn-kubernetes\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.398066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396898 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-lib-modules\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.398066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396929 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-var-lib-kubelet\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.398066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.396957 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-var-lib-kubelet\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.398066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397073 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-hostroot\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.398066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397096 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-slash\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.398066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397112 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-run-openvswitch\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.398066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397128 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/975fe906-de1d-4b78-9555-abc5fd12991c-cni-binary-copy\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.398066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397149 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-hostroot\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 
16:23:18.398066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397152 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/975fe906-de1d-4b78-9555-abc5fd12991c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.398066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397190 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-slash\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.398066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397208 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-sysctl-d\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.398873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397241 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-tuned\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.398873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397215 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-run-openvswitch\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.398873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397287 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/975fe906-de1d-4b78-9555-abc5fd12991c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.398873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397307 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-cnibin\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.398873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397310 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-sysctl-d\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.398873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-cnibin\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.398873 
ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397410 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a0a53203-f6d4-43f0-a422-5ae876b369f1-ovnkube-config\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.398873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397438 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssndj\" (UniqueName: \"kubernetes.io/projected/2e64cb9b-1c5d-4de7-9ce4-b673e8576d87-kube-api-access-ssndj\") pod \"node-resolver-hkvft\" (UID: \"2e64cb9b-1c5d-4de7-9ce4-b673e8576d87\") " pod="openshift-dns/node-resolver-hkvft" Apr 20 16:23:18.398873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397464 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/975fe906-de1d-4b78-9555-abc5fd12991c-system-cni-dir\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.398873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397491 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-etc-selinux\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.398873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397537 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-var-lib-openvswitch\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.398873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397562 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-node-log\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.398873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397579 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/975fe906-de1d-4b78-9555-abc5fd12991c-cni-binary-copy\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.398873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397589 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0a53203-f6d4-43f0-a422-5ae876b369f1-env-overrides\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.398873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397614 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0a53203-f6d4-43f0-a422-5ae876b369f1-ovn-node-metrics-cert\") pod 
\"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.398873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397652 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-etc-selinux\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.398873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397651 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vljch\" (UniqueName: \"kubernetes.io/projected/ff512ace-f73c-4265-890e-b43c9ecc782d-kube-api-access-vljch\") pod \"network-metrics-daemon-rxwd9\" (UID: \"ff512ace-f73c-4265-890e-b43c9ecc782d\") " pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397691 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-etc-kubernetes\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397700 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/975fe906-de1d-4b78-9555-abc5fd12991c-system-cni-dir\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397722 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-node-log\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397727 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-var-lib-openvswitch\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9b65\" (UniqueName: \"kubernetes.io/projected/a0a53203-f6d4-43f0-a422-5ae876b369f1-kube-api-access-r9b65\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397763 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-sysctl-conf\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397787 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-run\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397811 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b1f64d16-8a19-4426-9f62-eaf3e9325026-multus-daemon-config\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397837 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e64cb9b-1c5d-4de7-9ce4-b673e8576d87-tmp-dir\") pod \"node-resolver-hkvft\" (UID: \"2e64cb9b-1c5d-4de7-9ce4-b673e8576d87\") " pod="openshift-dns/node-resolver-hkvft" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397845 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-etc-kubernetes\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397860 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-modprobe-d\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397884 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-etc-openvswitch\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397902 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-run\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397908 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-cni-netd\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397934 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2e64cb9b-1c5d-4de7-9ce4-b673e8576d87-hosts-file\") pod \"node-resolver-hkvft\" (UID: \"2e64cb9b-1c5d-4de7-9ce4-b673e8576d87\") " pod="openshift-dns/node-resolver-hkvft" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397959 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/975fe906-de1d-4b78-9555-abc5fd12991c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.397985 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-sys-fs\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.399456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398027 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-sysctl-conf\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398014 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-systemd\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398069 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-multus-socket-dir-parent\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398085 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a0a53203-f6d4-43f0-a422-5ae876b369f1-ovnkube-config\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398095 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/842542e9-94b5-494f-8110-018afb1c0a5f-host\") pod \"node-ca-cklt7\" (UID: \"842542e9-94b5-494f-8110-018afb1c0a5f\") " pod="openshift-image-registry/node-ca-cklt7" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398086 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0a53203-f6d4-43f0-a422-5ae876b369f1-env-overrides\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398121 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/842542e9-94b5-494f-8110-018afb1c0a5f-serviceca\") pod \"node-ca-cklt7\" (UID: \"842542e9-94b5-494f-8110-018afb1c0a5f\") " pod="openshift-image-registry/node-ca-cklt7" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398147 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-kubelet\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398175 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-modprobe-d\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398187 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-etc-openvswitch\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398204 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1f64d16-8a19-4426-9f62-eaf3e9325026-cni-binary-copy\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398238 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77flx\" (UniqueName: \"kubernetes.io/projected/b1f64d16-8a19-4426-9f62-eaf3e9325026-kube-api-access-77flx\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398247 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2e64cb9b-1c5d-4de7-9ce4-b673e8576d87-hosts-file\") pod \"node-resolver-hkvft\" (UID: \"2e64cb9b-1c5d-4de7-9ce4-b673e8576d87\") " pod="openshift-dns/node-resolver-hkvft" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398264 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/975fe906-de1d-4b78-9555-abc5fd12991c-os-release\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398289 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-registration-dir\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398316 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-var-lib-kubelet\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398342 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bb32dc66-328e-4d49-979d-786a949e2c75-iptables-alerter-script\") pod \"iptables-alerter-k2rfm\" (UID: \"bb32dc66-328e-4d49-979d-786a949e2c75\") " pod="openshift-network-operator/iptables-alerter-k2rfm" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398367 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-os-release\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.399943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398393 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-run-k8s-cni-cncf-io\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.400444 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398418 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-run-multus-certs\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.400444 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398445 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b1f64d16-8a19-4426-9f62-eaf3e9325026-multus-daemon-config\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.400444 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398492 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-run-multus-certs\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.400444 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398506 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-multus-socket-dir-parent\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.400444 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398536 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-kubelet\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.400444 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398559 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-registration-dir\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.400444 ip-10-0-130-72 
kubenswrapper[2571]: I0420 16:23:18.398580 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-var-lib-kubelet\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.400444 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398656 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/842542e9-94b5-494f-8110-018afb1c0a5f-serviceca\") pod \"node-ca-cklt7\" (UID: \"842542e9-94b5-494f-8110-018afb1c0a5f\") " pod="openshift-image-registry/node-ca-cklt7" Apr 20 16:23:18.400444 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398682 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/975fe906-de1d-4b78-9555-abc5fd12991c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.400444 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398716 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/842542e9-94b5-494f-8110-018afb1c0a5f-host\") pod \"node-ca-cklt7\" (UID: \"842542e9-94b5-494f-8110-018afb1c0a5f\") " pod="openshift-image-registry/node-ca-cklt7" Apr 20 16:23:18.400444 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398751 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e8e73c9b-2851-47c9-a72f-36ab0b948444-sys-fs\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.400444 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398802 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-systemd\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.400444 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398865 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-os-release\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.400444 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398916 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b1f64d16-8a19-4426-9f62-eaf3e9325026-host-run-k8s-cni-cncf-io\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.400444 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398931 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/975fe906-de1d-4b78-9555-abc5fd12991c-os-release\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.400444 ip-10-0-130-72 
kubenswrapper[2571]: I0420 16:23:18.398998 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a0a53203-f6d4-43f0-a422-5ae876b369f1-host-cni-netd\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.400444 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.398998 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1f64d16-8a19-4426-9f62-eaf3e9325026-cni-binary-copy\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.400444 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.399041 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bb32dc66-328e-4d49-979d-786a949e2c75-iptables-alerter-script\") pod \"iptables-alerter-k2rfm\" (UID: \"bb32dc66-328e-4d49-979d-786a949e2c75\") " pod="openshift-network-operator/iptables-alerter-k2rfm" Apr 20 16:23:18.400912 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.399267 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e64cb9b-1c5d-4de7-9ce4-b673e8576d87-tmp-dir\") pod \"node-resolver-hkvft\" (UID: \"2e64cb9b-1c5d-4de7-9ce4-b673e8576d87\") " pod="openshift-dns/node-resolver-hkvft" Apr 20 16:23:18.400912 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.399463 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1fa78497-69b5-4855-bf47-cfc3a545a594-tmp\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.400912 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.399946 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/de954181-20e6-42cb-ac40-d96f0331e7a1-agent-certs\") pod \"konnectivity-agent-nhdcv\" (UID: \"de954181-20e6-42cb-ac40-d96f0331e7a1\") " pod="kube-system/konnectivity-agent-nhdcv" Apr 20 16:23:18.400912 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.400199 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0a53203-f6d4-43f0-a422-5ae876b369f1-ovn-node-metrics-cert\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.400912 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.400584 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1fa78497-69b5-4855-bf47-cfc3a545a594-etc-tuned\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.404528 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.404506 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk7p6\" (UniqueName: \"kubernetes.io/projected/975fe906-de1d-4b78-9555-abc5fd12991c-kube-api-access-pk7p6\") pod \"multus-additional-cni-plugins-jl5fs\" (UID: \"975fe906-de1d-4b78-9555-abc5fd12991c\") " pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.404621 ip-10-0-130-72 
kubenswrapper[2571]: I0420 16:23:18.404506 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42dvx\" (UniqueName: \"kubernetes.io/projected/842542e9-94b5-494f-8110-018afb1c0a5f-kube-api-access-42dvx\") pod \"node-ca-cklt7\" (UID: \"842542e9-94b5-494f-8110-018afb1c0a5f\") " pod="openshift-image-registry/node-ca-cklt7" Apr 20 16:23:18.404797 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.404781 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74ctf\" (UniqueName: \"kubernetes.io/projected/1fa78497-69b5-4855-bf47-cfc3a545a594-kube-api-access-74ctf\") pod \"tuned-jk2dm\" (UID: \"1fa78497-69b5-4855-bf47-cfc3a545a594\") " pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.404895 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.404786 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmws8\" (UniqueName: \"kubernetes.io/projected/bb32dc66-328e-4d49-979d-786a949e2c75-kube-api-access-wmws8\") pod \"iptables-alerter-k2rfm\" (UID: \"bb32dc66-328e-4d49-979d-786a949e2c75\") " pod="openshift-network-operator/iptables-alerter-k2rfm" Apr 20 16:23:18.404895 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.404877 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dlcs\" (UniqueName: \"kubernetes.io/projected/e8e73c9b-2851-47c9-a72f-36ab0b948444-kube-api-access-4dlcs\") pod \"aws-ebs-csi-driver-node-xlmfr\" (UID: \"e8e73c9b-2851-47c9-a72f-36ab0b948444\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.405207 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.405190 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssndj\" (UniqueName: \"kubernetes.io/projected/2e64cb9b-1c5d-4de7-9ce4-b673e8576d87-kube-api-access-ssndj\") pod \"node-resolver-hkvft\" (UID: \"2e64cb9b-1c5d-4de7-9ce4-b673e8576d87\") " pod="openshift-dns/node-resolver-hkvft" Apr 20 16:23:18.408375 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.408353 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77flx\" (UniqueName: \"kubernetes.io/projected/b1f64d16-8a19-4426-9f62-eaf3e9325026-kube-api-access-77flx\") pod \"multus-9cgxf\" (UID: \"b1f64d16-8a19-4426-9f62-eaf3e9325026\") " pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.409134 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.409110 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9b65\" (UniqueName: \"kubernetes.io/projected/a0a53203-f6d4-43f0-a422-5ae876b369f1-kube-api-access-r9b65\") pod \"ovnkube-node-45msc\" (UID: \"a0a53203-f6d4-43f0-a422-5ae876b369f1\") " pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.420587 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.420544 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal" event={"ID":"fed4ba1c9c7c2657cac867280ba4f485","Type":"ContainerStarted","Data":"1dcb9c2b1c702e82f7973cd5eb9d6397ca3a82b0a128219db2762566a191d4c3"} Apr 20 16:23:18.421387 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.421368 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-72.ec2.internal" event={"ID":"34da2f3dcf91c4f795ad835e4ed72c8c","Type":"ContainerStarted","Data":"416b2fd42d2f5602b600dfde4fc4a656657a2fd1ba6aa01b7f7b737a2baed574"} Apr 20 
16:23:18.499494 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.499453 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vljch\" (UniqueName: \"kubernetes.io/projected/ff512ace-f73c-4265-890e-b43c9ecc782d-kube-api-access-vljch\") pod \"network-metrics-daemon-rxwd9\" (UID: \"ff512ace-f73c-4265-890e-b43c9ecc782d\") " pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:18.499494 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.499502 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs\") pod \"network-metrics-daemon-rxwd9\" (UID: \"ff512ace-f73c-4265-890e-b43c9ecc782d\") " pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:18.499697 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.499523 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mv65f\" (UniqueName: \"kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f\") pod \"network-check-target-t7sxd\" (UID: \"6e7ab3fb-b773-40fb-8dc4-b848f28093cb\") " pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:18.499697 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:18.499630 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:18.499766 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:18.499737 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs podName:ff512ace-f73c-4265-890e-b43c9ecc782d nodeName:}" failed. No retries permitted until 2026-04-20 16:23:18.999701854 +0000 UTC m=+2.119878369 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs") pod "network-metrics-daemon-rxwd9" (UID: "ff512ace-f73c-4265-890e-b43c9ecc782d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:18.507972 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:18.507944 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 16:23:18.507972 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:18.507970 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 16:23:18.508122 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:18.507982 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mv65f for pod openshift-network-diagnostics/network-check-target-t7sxd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:18.508122 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:18.508043 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f podName:6e7ab3fb-b773-40fb-8dc4-b848f28093cb nodeName:}" failed. No retries permitted until 2026-04-20 16:23:19.008024995 +0000 UTC m=+2.128201495 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mv65f" (UniqueName: "kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f") pod "network-check-target-t7sxd" (UID: "6e7ab3fb-b773-40fb-8dc4-b848f28093cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:18.509902 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.509881 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vljch\" (UniqueName: \"kubernetes.io/projected/ff512ace-f73c-4265-890e-b43c9ecc782d-kube-api-access-vljch\") pod \"network-metrics-daemon-rxwd9\" (UID: \"ff512ace-f73c-4265-890e-b43c9ecc782d\") " pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:18.604626 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.604541 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" Apr 20 16:23:18.605729 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.605711 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 16:23:18.607615 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.607597 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" Apr 20 16:23:18.611474 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:18.611450 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8e73c9b_2851_47c9_a72f_36ab0b948444.slice/crio-8eeea7e72001d0c7950395945f867267dca6efdb90dd9ba700f7e924f2cbd7bf WatchSource:0}: Error finding container 8eeea7e72001d0c7950395945f867267dca6efdb90dd9ba700f7e924f2cbd7bf: Status 404 returned error can't find the container with id 8eeea7e72001d0c7950395945f867267dca6efdb90dd9ba700f7e924f2cbd7bf Apr 20 16:23:18.614323 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:18.614299 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fa78497_69b5_4855_bf47_cfc3a545a594.slice/crio-56cc4e9eedb1dbf3dd43f653ee3fe864adba44f312f0fd496663674fecb90b77 WatchSource:0}: Error finding container 56cc4e9eedb1dbf3dd43f653ee3fe864adba44f312f0fd496663674fecb90b77: Status 404 returned error can't find the container with id 56cc4e9eedb1dbf3dd43f653ee3fe864adba44f312f0fd496663674fecb90b77 Apr 20 16:23:18.637803 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.637771 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9cgxf" Apr 20 16:23:18.643377 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:18.643353 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1f64d16_8a19_4426_9f62_eaf3e9325026.slice/crio-8047f2bb8cf04049af1e6159048d24ca482ae42c58e9b0d77d223901f126aa5c WatchSource:0}: Error finding container 8047f2bb8cf04049af1e6159048d24ca482ae42c58e9b0d77d223901f126aa5c: Status 404 returned error can't find the container with id 8047f2bb8cf04049af1e6159048d24ca482ae42c58e9b0d77d223901f126aa5c Apr 20 16:23:18.656275 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.656254 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-k2rfm" Apr 20 16:23:18.661827 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.661807 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:18.662900 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:18.662879 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb32dc66_328e_4d49_979d_786a949e2c75.slice/crio-db1d2f10559d6ed7fc70b02daad864585ae44f9d07f283ebfa6b62cc447d7f8e WatchSource:0}: Error finding container db1d2f10559d6ed7fc70b02daad864585ae44f9d07f283ebfa6b62cc447d7f8e: Status 404 returned error can't find the container with id db1d2f10559d6ed7fc70b02daad864585ae44f9d07f283ebfa6b62cc447d7f8e Apr 20 16:23:18.668313 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.668127 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nhdcv" Apr 20 16:23:18.668405 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:18.668387 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0a53203_f6d4_43f0_a422_5ae876b369f1.slice/crio-d77bb9e18f4e1e68857d4c46389e05143fe0813549efc5d90392c16b57d11016 WatchSource:0}: Error finding container d77bb9e18f4e1e68857d4c46389e05143fe0813549efc5d90392c16b57d11016: Status 404 returned error can't find the container with id d77bb9e18f4e1e68857d4c46389e05143fe0813549efc5d90392c16b57d11016 Apr 20 16:23:18.675007 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.674990 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hkvft" Apr 20 16:23:18.681257 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.681235 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cklt7" Apr 20 16:23:18.682979 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:18.682953 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e64cb9b_1c5d_4de7_9ce4_b673e8576d87.slice/crio-6b73160ddcdf53d489d7057afb14c6b3136f19d802a34379a20e8a156c1abe97 WatchSource:0}: Error finding container 6b73160ddcdf53d489d7057afb14c6b3136f19d802a34379a20e8a156c1abe97: Status 404 returned error can't find the container with id 6b73160ddcdf53d489d7057afb14c6b3136f19d802a34379a20e8a156c1abe97 Apr 20 16:23:18.684738 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:18.684710 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jl5fs" Apr 20 16:23:18.688989 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:18.688972 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod842542e9_94b5_494f_8110_018afb1c0a5f.slice/crio-a3c8326d8ed86f9332610e4bbf914256a8ca390a37c2cc8a69a9bf4f622d3e0e WatchSource:0}: Error finding container a3c8326d8ed86f9332610e4bbf914256a8ca390a37c2cc8a69a9bf4f622d3e0e: Status 404 returned error can't find the container with id a3c8326d8ed86f9332610e4bbf914256a8ca390a37c2cc8a69a9bf4f622d3e0e Apr 20 16:23:18.691678 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:23:18.691658 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod975fe906_de1d_4b78_9555_abc5fd12991c.slice/crio-41095f7f31db973acab352f527643e0a9eb8bb7b303921c0912ccc9b959bf19f WatchSource:0}: Error finding container 41095f7f31db973acab352f527643e0a9eb8bb7b303921c0912ccc9b959bf19f: Status 404 returned error can't find the container with id 41095f7f31db973acab352f527643e0a9eb8bb7b303921c0912ccc9b959bf19f Apr 20 16:23:19.004738 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:19.004549 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs\") pod \"network-metrics-daemon-rxwd9\" (UID: \"ff512ace-f73c-4265-890e-b43c9ecc782d\") " pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:19.004738 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:19.004689 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:19.004954 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:19.004770 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs podName:ff512ace-f73c-4265-890e-b43c9ecc782d nodeName:}" failed. No retries permitted until 2026-04-20 16:23:20.004738215 +0000 UTC m=+3.124914730 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs") pod "network-metrics-daemon-rxwd9" (UID: "ff512ace-f73c-4265-890e-b43c9ecc782d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:19.105917 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:19.105882 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mv65f\" (UniqueName: \"kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f\") pod \"network-check-target-t7sxd\" (UID: \"6e7ab3fb-b773-40fb-8dc4-b848f28093cb\") " pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:19.106084 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:19.106058 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 16:23:19.106084 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:19.106078 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 16:23:19.106207 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:19.106093 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mv65f for pod openshift-network-diagnostics/network-check-target-t7sxd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:19.106207 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:19.106150 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f podName:6e7ab3fb-b773-40fb-8dc4-b848f28093cb nodeName:}" failed. No retries permitted until 2026-04-20 16:23:20.106131413 +0000 UTC m=+3.226307929 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mv65f" (UniqueName: "kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f") pod "network-check-target-t7sxd" (UID: "6e7ab3fb-b773-40fb-8dc4-b848f28093cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:19.324281 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:19.323999 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 16:23:19.330489 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:19.330399 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 16:18:18 +0000 UTC" deadline="2028-01-11 23:55:53.039354714 +0000 UTC" Apr 20 16:23:19.330489 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:19.330440 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15151h32m33.708919662s" Apr 20 16:23:19.437123 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:19.437089 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 16:23:19.461765 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:19.458962 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hkvft" event={"ID":"2e64cb9b-1c5d-4de7-9ce4-b673e8576d87","Type":"ContainerStarted","Data":"6b73160ddcdf53d489d7057afb14c6b3136f19d802a34379a20e8a156c1abe97"} Apr 20 16:23:19.473871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:19.473825 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-k2rfm" event={"ID":"bb32dc66-328e-4d49-979d-786a949e2c75","Type":"ContainerStarted","Data":"db1d2f10559d6ed7fc70b02daad864585ae44f9d07f283ebfa6b62cc447d7f8e"} Apr 20 16:23:19.484534 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:19.484447 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" event={"ID":"e8e73c9b-2851-47c9-a72f-36ab0b948444","Type":"ContainerStarted","Data":"8eeea7e72001d0c7950395945f867267dca6efdb90dd9ba700f7e924f2cbd7bf"} Apr 20 16:23:19.490790 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:19.490727 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jl5fs" event={"ID":"975fe906-de1d-4b78-9555-abc5fd12991c","Type":"ContainerStarted","Data":"41095f7f31db973acab352f527643e0a9eb8bb7b303921c0912ccc9b959bf19f"} Apr 20 16:23:19.500371 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:19.500308 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nhdcv" event={"ID":"de954181-20e6-42cb-ac40-d96f0331e7a1","Type":"ContainerStarted","Data":"5990c9e617477505468193846ac6f3f125fa30f7053ddacfcc7538fa71130a17"} Apr 20 16:23:19.510509 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:19.510449 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" event={"ID":"a0a53203-f6d4-43f0-a422-5ae876b369f1","Type":"ContainerStarted","Data":"d77bb9e18f4e1e68857d4c46389e05143fe0813549efc5d90392c16b57d11016"} Apr 20 16:23:19.520521 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:19.520452 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9cgxf" 
event={"ID":"b1f64d16-8a19-4426-9f62-eaf3e9325026","Type":"ContainerStarted","Data":"8047f2bb8cf04049af1e6159048d24ca482ae42c58e9b0d77d223901f126aa5c"} Apr 20 16:23:19.531813 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:19.531734 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" event={"ID":"1fa78497-69b5-4855-bf47-cfc3a545a594","Type":"ContainerStarted","Data":"56cc4e9eedb1dbf3dd43f653ee3fe864adba44f312f0fd496663674fecb90b77"} Apr 20 16:23:19.561915 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:19.561872 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cklt7" event={"ID":"842542e9-94b5-494f-8110-018afb1c0a5f","Type":"ContainerStarted","Data":"a3c8326d8ed86f9332610e4bbf914256a8ca390a37c2cc8a69a9bf4f622d3e0e"} Apr 20 16:23:20.013891 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:20.013857 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs\") pod \"network-metrics-daemon-rxwd9\" (UID: \"ff512ace-f73c-4265-890e-b43c9ecc782d\") " pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:20.014134 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:20.013973 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:20.014134 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:20.014032 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs podName:ff512ace-f73c-4265-890e-b43c9ecc782d nodeName:}" failed. No retries permitted until 2026-04-20 16:23:22.014013702 +0000 UTC m=+5.134190215 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs") pod "network-metrics-daemon-rxwd9" (UID: "ff512ace-f73c-4265-890e-b43c9ecc782d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:20.115284 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:20.115246 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mv65f\" (UniqueName: \"kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f\") pod \"network-check-target-t7sxd\" (UID: \"6e7ab3fb-b773-40fb-8dc4-b848f28093cb\") " pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:20.115494 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:20.115430 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 16:23:20.115494 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:20.115447 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 16:23:20.115494 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:20.115459 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mv65f for pod openshift-network-diagnostics/network-check-target-t7sxd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:20.115669 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:20.115517 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f podName:6e7ab3fb-b773-40fb-8dc4-b848f28093cb nodeName:}" failed. No retries permitted until 2026-04-20 16:23:22.115497389 +0000 UTC m=+5.235673890 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mv65f" (UniqueName: "kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f") pod "network-check-target-t7sxd" (UID: "6e7ab3fb-b773-40fb-8dc4-b848f28093cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:20.331191 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:20.331074 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 16:18:18 +0000 UTC" deadline="2028-01-22 20:36:32.080006624 +0000 UTC" Apr 20 16:23:20.331191 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:20.331115 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15412h13m11.748895282s" Apr 20 16:23:20.418201 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:20.417588 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:20.418201 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:20.417739 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:23:20.418619 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:20.418152 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:20.418619 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:20.418584 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7sxd" podUID="6e7ab3fb-b773-40fb-8dc4-b848f28093cb" Apr 20 16:23:20.987250 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:20.987221 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 16:23:22.031985 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:22.031951 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs\") pod \"network-metrics-daemon-rxwd9\" (UID: \"ff512ace-f73c-4265-890e-b43c9ecc782d\") " pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:22.032437 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:22.032131 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:22.032437 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:22.032209 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs podName:ff512ace-f73c-4265-890e-b43c9ecc782d nodeName:}" failed. No retries permitted until 2026-04-20 16:23:26.032189071 +0000 UTC m=+9.152365574 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs") pod "network-metrics-daemon-rxwd9" (UID: "ff512ace-f73c-4265-890e-b43c9ecc782d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:22.132294 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:22.132256 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mv65f\" (UniqueName: \"kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f\") pod \"network-check-target-t7sxd\" (UID: \"6e7ab3fb-b773-40fb-8dc4-b848f28093cb\") " pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:22.132458 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:22.132411 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 16:23:22.132458 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:22.132428 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 16:23:22.132458 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:22.132441 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mv65f for pod openshift-network-diagnostics/network-check-target-t7sxd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:22.132626 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:22.132493 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f podName:6e7ab3fb-b773-40fb-8dc4-b848f28093cb nodeName:}" failed. No retries permitted until 2026-04-20 16:23:26.132476473 +0000 UTC m=+9.252652985 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mv65f" (UniqueName: "kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f") pod "network-check-target-t7sxd" (UID: "6e7ab3fb-b773-40fb-8dc4-b848f28093cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:22.418347 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:22.418242 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:22.418508 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:22.418371 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7sxd" podUID="6e7ab3fb-b773-40fb-8dc4-b848f28093cb" Apr 20 16:23:22.418785 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:22.418765 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:22.418887 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:22.418865 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:23:24.418245 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:24.418205 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:24.418695 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:24.418343 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7sxd" podUID="6e7ab3fb-b773-40fb-8dc4-b848f28093cb" Apr 20 16:23:24.418695 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:24.418222 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:24.418695 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:24.418562 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:23:26.064654 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:26.064610 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs\") pod \"network-metrics-daemon-rxwd9\" (UID: \"ff512ace-f73c-4265-890e-b43c9ecc782d\") " pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:26.065157 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:26.064755 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:26.065157 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:26.064820 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs podName:ff512ace-f73c-4265-890e-b43c9ecc782d nodeName:}" failed. No retries permitted until 2026-04-20 16:23:34.064800446 +0000 UTC m=+17.184976952 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs") pod "network-metrics-daemon-rxwd9" (UID: "ff512ace-f73c-4265-890e-b43c9ecc782d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:26.165487 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:26.165444 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mv65f\" (UniqueName: \"kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f\") pod \"network-check-target-t7sxd\" (UID: \"6e7ab3fb-b773-40fb-8dc4-b848f28093cb\") " pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:26.165674 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:26.165627 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 16:23:26.165674 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:26.165646 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 16:23:26.165674 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:26.165659 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mv65f for pod openshift-network-diagnostics/network-check-target-t7sxd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:26.165788 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:26.165717 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f podName:6e7ab3fb-b773-40fb-8dc4-b848f28093cb nodeName:}" failed. No retries permitted until 2026-04-20 16:23:34.165698537 +0000 UTC m=+17.285875049 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mv65f" (UniqueName: "kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f") pod "network-check-target-t7sxd" (UID: "6e7ab3fb-b773-40fb-8dc4-b848f28093cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:26.418534 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:26.418500 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:26.418773 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:26.418635 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:23:26.419074 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:26.419023 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:26.419242 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:26.419113 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7sxd" podUID="6e7ab3fb-b773-40fb-8dc4-b848f28093cb" Apr 20 16:23:28.417788 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:28.417749 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:28.418258 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:28.417763 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:28.418258 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:28.417880 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7sxd" podUID="6e7ab3fb-b773-40fb-8dc4-b848f28093cb" Apr 20 16:23:28.418258 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:28.417998 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:23:30.418606 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:30.418567 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:30.419032 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:30.418567 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:30.419032 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:30.418704 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7sxd" podUID="6e7ab3fb-b773-40fb-8dc4-b848f28093cb" Apr 20 16:23:30.419032 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:30.418810 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:23:32.417991 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:32.417946 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:32.418467 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:32.417947 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:32.418467 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:32.418063 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7sxd" podUID="6e7ab3fb-b773-40fb-8dc4-b848f28093cb" Apr 20 16:23:32.418467 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:32.418148 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:23:34.125947 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:34.125912 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs\") pod \"network-metrics-daemon-rxwd9\" (UID: \"ff512ace-f73c-4265-890e-b43c9ecc782d\") " pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:34.126388 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:34.126071 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:34.126388 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:34.126137 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs podName:ff512ace-f73c-4265-890e-b43c9ecc782d nodeName:}" failed. No retries permitted until 2026-04-20 16:23:50.126117344 +0000 UTC m=+33.246293848 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs") pod "network-metrics-daemon-rxwd9" (UID: "ff512ace-f73c-4265-890e-b43c9ecc782d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:34.227040 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:34.227003 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mv65f\" (UniqueName: \"kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f\") pod \"network-check-target-t7sxd\" (UID: \"6e7ab3fb-b773-40fb-8dc4-b848f28093cb\") " pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:34.227382 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:34.227195 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 16:23:34.227382 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:34.227217 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 16:23:34.227382 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:34.227227 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mv65f for pod openshift-network-diagnostics/network-check-target-t7sxd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:34.227382 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:34.227286 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f podName:6e7ab3fb-b773-40fb-8dc4-b848f28093cb nodeName:}" failed. No retries permitted until 2026-04-20 16:23:50.227268209 +0000 UTC m=+33.347444711 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-mv65f" (UniqueName: "kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f") pod "network-check-target-t7sxd" (UID: "6e7ab3fb-b773-40fb-8dc4-b848f28093cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:34.417881 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:34.417795 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:34.418045 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:34.417923 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7sxd" podUID="6e7ab3fb-b773-40fb-8dc4-b848f28093cb" Apr 20 16:23:34.418045 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:34.417983 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:34.418151 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:34.418123 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:23:36.418446 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:36.418418 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:36.418779 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:36.418449 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:36.418779 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:36.418522 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:23:36.418779 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:36.418578 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t7sxd" podUID="6e7ab3fb-b773-40fb-8dc4-b848f28093cb" Apr 20 16:23:37.599616 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:37.599380 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-72.ec2.internal" event={"ID":"34da2f3dcf91c4f795ad835e4ed72c8c","Type":"ContainerStarted","Data":"207d697c6f6a3158c8ad634aa623d5a70898501193387a811d5201c49d42c851"} Apr 20 16:23:37.607740 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:37.607715 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-45msc_a0a53203-f6d4-43f0-a422-5ae876b369f1/ovn-acl-logging/0.log" Apr 20 16:23:37.608144 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:37.608114 2571 generic.go:358] "Generic (PLEG): container finished" podID="a0a53203-f6d4-43f0-a422-5ae876b369f1" containerID="4744d6380c751573d67366127051ff6631f109a5bc1b671bcf6f58995a7dba36" exitCode=1 Apr 20 16:23:37.608254 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:37.608233 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" event={"ID":"a0a53203-f6d4-43f0-a422-5ae876b369f1","Type":"ContainerStarted","Data":"8f90ea3147cc32a1df4d5e5bdce58366ef211732456df71b1c1a5bdb11b2161b"} Apr 20 16:23:37.608309 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:37.608265 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" event={"ID":"a0a53203-f6d4-43f0-a422-5ae876b369f1","Type":"ContainerStarted","Data":"799a06efd0d1848277162825e8934d4652859910e3204a55d588520fd7c01696"} Apr 20 16:23:37.608309 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:37.608279 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" event={"ID":"a0a53203-f6d4-43f0-a422-5ae876b369f1","Type":"ContainerStarted","Data":"bc5b1c80dc12e9ff7b3f4ebdfeaa4bc5807f155448cd2bb570bb8d59eab20635"} Apr 20 16:23:37.608309 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:37.608294 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" event={"ID":"a0a53203-f6d4-43f0-a422-5ae876b369f1","Type":"ContainerStarted","Data":"5dd7f4490e1eb09e94574e9c455008c7c59e197e70fd40e1c11f958f009a6a37"} Apr 20 16:23:37.608309 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:37.608307 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" event={"ID":"a0a53203-f6d4-43f0-a422-5ae876b369f1","Type":"ContainerDied","Data":"4744d6380c751573d67366127051ff6631f109a5bc1b671bcf6f58995a7dba36"} Apr 20 16:23:37.608476 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:37.608318 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" event={"ID":"a0a53203-f6d4-43f0-a422-5ae876b369f1","Type":"ContainerStarted","Data":"23ef20fa73f7d98447b722091be9e140e66d41b26108955c8d90d89a44eb8e89"} Apr 20 16:23:37.610369 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:37.610341 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9cgxf" event={"ID":"b1f64d16-8a19-4426-9f62-eaf3e9325026","Type":"ContainerStarted","Data":"ea4acf34627d1066c36282ad3e1a393f01f2747d4f9b0149eedee882d15919f7"} Apr 20 16:23:37.611602 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:37.611582 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" 
event={"ID":"1fa78497-69b5-4855-bf47-cfc3a545a594","Type":"ContainerStarted","Data":"7d4fbbf5ab0b4573512061615d34aec283a9ca595420d6c679136e302792d82a"} Apr 20 16:23:37.614245 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:37.614201 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-72.ec2.internal" podStartSLOduration=19.614185552 podStartE2EDuration="19.614185552s" podCreationTimestamp="2026-04-20 16:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:23:37.613664657 +0000 UTC m=+20.733841177" watchObservedRunningTime="2026-04-20 16:23:37.614185552 +0000 UTC m=+20.734362086" Apr 20 16:23:37.631103 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:37.631010 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jk2dm" podStartSLOduration=2.612231875 podStartE2EDuration="20.630997409s" podCreationTimestamp="2026-04-20 16:23:17 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.615902835 +0000 UTC m=+1.736079332" lastFinishedPulling="2026-04-20 16:23:36.63466835 +0000 UTC m=+19.754844866" observedRunningTime="2026-04-20 16:23:37.630677062 +0000 UTC m=+20.750853584" watchObservedRunningTime="2026-04-20 16:23:37.630997409 +0000 UTC m=+20.751173930" Apr 20 16:23:38.418547 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.418501 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:38.418726 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.418501 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:38.418726 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:38.418628 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:23:38.418726 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:38.418700 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t7sxd" podUID="6e7ab3fb-b773-40fb-8dc4-b848f28093cb" Apr 20 16:23:38.596429 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.596405 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 16:23:38.614393 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.614363 2571 generic.go:358] "Generic (PLEG): container finished" podID="975fe906-de1d-4b78-9555-abc5fd12991c" containerID="691f844a880e11d9bf048f97df2c4e8ba9bea6f2ac9939098168cd0451773c17" exitCode=0 Apr 20 16:23:38.614796 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.614405 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jl5fs" event={"ID":"975fe906-de1d-4b78-9555-abc5fd12991c","Type":"ContainerDied","Data":"691f844a880e11d9bf048f97df2c4e8ba9bea6f2ac9939098168cd0451773c17"} Apr 20 16:23:38.615851 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.615782 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nhdcv" event={"ID":"de954181-20e6-42cb-ac40-d96f0331e7a1","Type":"ContainerStarted","Data":"7a90e069afb26df43f1b5aeadd53ec51311d86be1adfa146ed1b84d188bab644"} Apr 20 16:23:38.617191 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.617152 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cklt7" event={"ID":"842542e9-94b5-494f-8110-018afb1c0a5f","Type":"ContainerStarted","Data":"3eb9985007a4b56c00f7014e0c76752fabbce352e0a9300215b7dcd1e7487214"} Apr 20 16:23:38.618581 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.618560 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hkvft" event={"ID":"2e64cb9b-1c5d-4de7-9ce4-b673e8576d87","Type":"ContainerStarted","Data":"249232a4735fdfe3087239359fd034938bd00ad4943a9023fd9fb35c1f657d4a"} Apr 20 16:23:38.619889 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.619858 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-k2rfm" event={"ID":"bb32dc66-328e-4d49-979d-786a949e2c75","Type":"ContainerStarted","Data":"878b5b5bb3cfb32ec4abee45d097b1e369ab4fb95d3625b22e5bd741be185ff1"} Apr 20 16:23:38.621459 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.621438 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" event={"ID":"e8e73c9b-2851-47c9-a72f-36ab0b948444","Type":"ContainerStarted","Data":"66ce67efb2cb96fb5279693a4deea439d3c0eb747f2c0ac7f2470cfe5cf9c13f"} Apr 20 16:23:38.621547 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.621463 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" event={"ID":"e8e73c9b-2851-47c9-a72f-36ab0b948444","Type":"ContainerStarted","Data":"ffeff167611f80b154459e6e31e85e04a89942fe2ddd2e1304dc8540481d3fe4"} Apr 20 16:23:38.622680 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.622660 2571 generic.go:358] "Generic (PLEG): container finished" podID="fed4ba1c9c7c2657cac867280ba4f485" containerID="501adaed52b10d3aed17bbbc26d39bdab465e7d4860bf4abacb676a0e3e0a296" exitCode=0 Apr 20 16:23:38.622761 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.622735 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal" 
event={"ID":"fed4ba1c9c7c2657cac867280ba4f485","Type":"ContainerDied","Data":"501adaed52b10d3aed17bbbc26d39bdab465e7d4860bf4abacb676a0e3e0a296"} Apr 20 16:23:38.635998 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.635963 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9cgxf" podStartSLOduration=3.261219527 podStartE2EDuration="21.635945877s" podCreationTimestamp="2026-04-20 16:23:17 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.644992085 +0000 UTC m=+1.765168582" lastFinishedPulling="2026-04-20 16:23:37.019718433 +0000 UTC m=+20.139894932" observedRunningTime="2026-04-20 16:23:37.650704898 +0000 UTC m=+20.770881420" watchObservedRunningTime="2026-04-20 16:23:38.635945877 +0000 UTC m=+21.756122391" Apr 20 16:23:38.650306 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.650261 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hkvft" podStartSLOduration=3.7417385960000002 podStartE2EDuration="21.650247652s" podCreationTimestamp="2026-04-20 16:23:17 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.68484101 +0000 UTC m=+1.805017508" lastFinishedPulling="2026-04-20 16:23:36.593350065 +0000 UTC m=+19.713526564" observedRunningTime="2026-04-20 16:23:38.649720657 +0000 UTC m=+21.769897178" watchObservedRunningTime="2026-04-20 16:23:38.650247652 +0000 UTC m=+21.770424171" Apr 20 16:23:38.663097 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.663054 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-k2rfm" podStartSLOduration=3.7346024030000002 podStartE2EDuration="21.663041072s" podCreationTimestamp="2026-04-20 16:23:17 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.664918921 +0000 UTC m=+1.785095433" lastFinishedPulling="2026-04-20 16:23:36.593357591 +0000 UTC m=+19.713534102" observedRunningTime="2026-04-20 16:23:38.662756826 +0000 UTC m=+21.782933345" watchObservedRunningTime="2026-04-20 16:23:38.663041072 +0000 UTC m=+21.783217594" Apr 20 16:23:38.675828 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.675739 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nhdcv" podStartSLOduration=3.759357929 podStartE2EDuration="21.675725809s" podCreationTimestamp="2026-04-20 16:23:17 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.677974277 +0000 UTC m=+1.798150774" lastFinishedPulling="2026-04-20 16:23:36.594342156 +0000 UTC m=+19.714518654" observedRunningTime="2026-04-20 16:23:38.675689316 +0000 UTC m=+21.795865836" watchObservedRunningTime="2026-04-20 16:23:38.675725809 +0000 UTC m=+21.795902329" Apr 20 16:23:38.689610 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:38.689562 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cklt7" podStartSLOduration=3.748957958 podStartE2EDuration="21.689551237s" podCreationTimestamp="2026-04-20 16:23:17 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.690949803 +0000 UTC m=+1.811126303" lastFinishedPulling="2026-04-20 16:23:36.63154307 +0000 UTC m=+19.751719582" observedRunningTime="2026-04-20 16:23:38.68950967 +0000 UTC m=+21.809686192" watchObservedRunningTime="2026-04-20 16:23:38.689551237 +0000 UTC m=+21.809727757" Apr 20 16:23:39.358449 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:39.358175 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T16:23:38.596423291Z","UUID":"30786942-9afe-402a-aa51-e15627ef285c","Handler":null,"Name":"","Endpoint":""} Apr 20 16:23:39.359955 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:39.359929 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 16:23:39.360085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:39.359963 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 16:23:39.626777 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:39.626701 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" event={"ID":"e8e73c9b-2851-47c9-a72f-36ab0b948444","Type":"ContainerStarted","Data":"94ad8b1462f0ee26a966957f6c0805e6cf615a997036082d6b7f8c2886929708"} Apr 20 16:23:39.629085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:39.629057 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal" event={"ID":"fed4ba1c9c7c2657cac867280ba4f485","Type":"ContainerStarted","Data":"262003b43069745207a5f478409fa646564af05840bddc01bb65cc174dc9af6d"} Apr 20 16:23:39.632101 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:39.632075 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-45msc_a0a53203-f6d4-43f0-a422-5ae876b369f1/ovn-acl-logging/0.log" Apr 20 16:23:39.632594 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:39.632564 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" event={"ID":"a0a53203-f6d4-43f0-a422-5ae876b369f1","Type":"ContainerStarted","Data":"1e8c85e3f5ab9807e961532a400a535eec7b719d04fa1ac86a00c3dad37e048d"} Apr 20 16:23:39.644020 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:39.643976 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlmfr" podStartSLOduration=1.8828554450000001 podStartE2EDuration="22.643964748s" podCreationTimestamp="2026-04-20 16:23:17 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.613429567 +0000 UTC m=+1.733606064" lastFinishedPulling="2026-04-20 16:23:39.374538861 +0000 UTC m=+22.494715367" observedRunningTime="2026-04-20 16:23:39.643737424 +0000 UTC m=+22.763913949" watchObservedRunningTime="2026-04-20 16:23:39.643964748 +0000 UTC m=+22.764141279" Apr 20 16:23:39.658107 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:39.658059 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-72.ec2.internal" podStartSLOduration=21.658044543 podStartE2EDuration="21.658044543s" podCreationTimestamp="2026-04-20 16:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:23:39.657731258 +0000 UTC m=+22.777907778" watchObservedRunningTime="2026-04-20 16:23:39.658044543 +0000 UTC m=+22.778221064" Apr 20 16:23:40.418442 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:40.418409 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:40.418610 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:40.418409 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:40.418610 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:40.418530 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7sxd" podUID="6e7ab3fb-b773-40fb-8dc4-b848f28093cb" Apr 20 16:23:40.418716 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:40.418649 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:23:42.417745 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:42.417707 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:42.418456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:42.417711 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:42.418456 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:42.417838 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7sxd" podUID="6e7ab3fb-b773-40fb-8dc4-b848f28093cb" Apr 20 16:23:42.418456 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:42.417963 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:23:42.642501 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:42.642137 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-45msc_a0a53203-f6d4-43f0-a422-5ae876b369f1/ovn-acl-logging/0.log" Apr 20 16:23:42.642912 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:42.642617 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" event={"ID":"a0a53203-f6d4-43f0-a422-5ae876b369f1","Type":"ContainerStarted","Data":"7eae93fdcee321256b44d3c8022d339e827baf471a4b4c1e4e1f143635ff1ab0"} Apr 20 16:23:42.643218 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:42.642929 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:42.643218 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:42.642956 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:42.643218 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:42.643082 2571 scope.go:117] "RemoveContainer" containerID="4744d6380c751573d67366127051ff6631f109a5bc1b671bcf6f58995a7dba36" Apr 20 16:23:42.661565 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:42.658822 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:43.244888 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:43.244805 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nhdcv" Apr 20 16:23:43.245422 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:43.245404 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nhdcv" Apr 20 16:23:43.646420 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:43.646391 2571 generic.go:358] "Generic (PLEG): container finished" podID="975fe906-de1d-4b78-9555-abc5fd12991c" containerID="3dd0b135044fd0ff6993de5fc46e8bd8a4451f923b5abf574de7d348f76dbf40" exitCode=0 Apr 20 16:23:43.646910 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:43.646462 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jl5fs" event={"ID":"975fe906-de1d-4b78-9555-abc5fd12991c","Type":"ContainerDied","Data":"3dd0b135044fd0ff6993de5fc46e8bd8a4451f923b5abf574de7d348f76dbf40"} Apr 20 16:23:43.649824 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:43.649795 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-45msc_a0a53203-f6d4-43f0-a422-5ae876b369f1/ovn-acl-logging/0.log" Apr 20 16:23:43.650224 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:43.650199 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" event={"ID":"a0a53203-f6d4-43f0-a422-5ae876b369f1","Type":"ContainerStarted","Data":"3ef35012e597b79d8fc7fbdeef2bf93c708d55ee1c6842a26324abff5ddc3cf5"} Apr 20 16:23:43.650556 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:43.650488 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:43.665705 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:43.665684 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:23:43.691286 
ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:43.691247 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" podStartSLOduration=8.687737683 podStartE2EDuration="26.691233811s" podCreationTimestamp="2026-04-20 16:23:17 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.670128699 +0000 UTC m=+1.790305197" lastFinishedPulling="2026-04-20 16:23:36.673624827 +0000 UTC m=+19.793801325" observedRunningTime="2026-04-20 16:23:43.689772616 +0000 UTC m=+26.809949147" watchObservedRunningTime="2026-04-20 16:23:43.691233811 +0000 UTC m=+26.811410331" Apr 20 16:23:44.417876 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:44.417717 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:44.418032 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:44.417732 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:44.418032 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:44.417988 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:23:44.418032 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:44.418024 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7sxd" podUID="6e7ab3fb-b773-40fb-8dc4-b848f28093cb" Apr 20 16:23:44.561033 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:44.560935 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t7sxd"] Apr 20 16:23:44.561553 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:44.561527 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rxwd9"] Apr 20 16:23:44.654428 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:44.654393 2571 generic.go:358] "Generic (PLEG): container finished" podID="975fe906-de1d-4b78-9555-abc5fd12991c" containerID="3fc2d77cab6363450ae59452f640c3eb33c0e43e804732e0fb7adffc5e3495eb" exitCode=0 Apr 20 16:23:44.654845 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:44.654507 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jl5fs" event={"ID":"975fe906-de1d-4b78-9555-abc5fd12991c","Type":"ContainerDied","Data":"3fc2d77cab6363450ae59452f640c3eb33c0e43e804732e0fb7adffc5e3495eb"} Apr 20 16:23:44.654845 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:44.654544 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:44.654845 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:44.654571 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:44.654845 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:44.654643 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:23:44.654845 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:44.654721 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7sxd" podUID="6e7ab3fb-b773-40fb-8dc4-b848f28093cb" Apr 20 16:23:45.658927 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:45.658893 2571 generic.go:358] "Generic (PLEG): container finished" podID="975fe906-de1d-4b78-9555-abc5fd12991c" containerID="493601fd6573f7695691732c3edbbbcdb10f388c7b367937c6a4f4c794340971" exitCode=0 Apr 20 16:23:45.659332 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:45.658969 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jl5fs" event={"ID":"975fe906-de1d-4b78-9555-abc5fd12991c","Type":"ContainerDied","Data":"493601fd6573f7695691732c3edbbbcdb10f388c7b367937c6a4f4c794340971"} Apr 20 16:23:46.375548 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:46.375515 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nhdcv" Apr 20 16:23:46.375717 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:46.375676 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 16:23:46.376463 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:46.376430 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nhdcv" Apr 20 16:23:46.418338 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:46.418305 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:46.418513 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:46.418318 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:46.418513 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:46.418418 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7sxd" podUID="6e7ab3fb-b773-40fb-8dc4-b848f28093cb" Apr 20 16:23:46.418616 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:46.418564 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:23:48.418141 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.418106 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:48.418840 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.418117 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:48.418840 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:48.418245 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7sxd" podUID="6e7ab3fb-b773-40fb-8dc4-b848f28093cb" Apr 20 16:23:48.418840 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:48.418320 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:23:48.767522 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.767438 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-72.ec2.internal" event="NodeReady" Apr 20 16:23:48.767757 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.767585 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 16:23:48.810593 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.810561 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mmqjp"] Apr 20 16:23:48.840287 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.840258 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-np9bb"] Apr 20 16:23:48.840469 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.840447 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mmqjp" Apr 20 16:23:48.843269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.843220 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 16:23:48.843269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.843233 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rqfxd\"" Apr 20 16:23:48.843269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.843256 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 16:23:48.861144 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.861118 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mmqjp"] Apr 20 16:23:48.861144 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.861144 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-np9bb"] Apr 20 16:23:48.861296 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.861278 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-np9bb" Apr 20 16:23:48.864249 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.864223 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 16:23:48.864249 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.864244 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 16:23:48.864422 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.864383 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 16:23:48.864422 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.864387 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cmjpg\"" Apr 20 16:23:48.938805 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.938774 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:23:48.938992 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.938829 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d3f1231-a687-4b5c-b5b5-d078c34b831b-config-volume\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:23:48.938992 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.938885 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert\") pod \"ingress-canary-np9bb\" (UID: \"6da97721-f2ed-4061-b7a4-2577d2b33d11\") " pod="openshift-ingress-canary/ingress-canary-np9bb" Apr 20 16:23:48.938992 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.938917 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d3f1231-a687-4b5c-b5b5-d078c34b831b-tmp-dir\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:23:48.938992 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.938944 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88478\" (UniqueName: \"kubernetes.io/projected/6da97721-f2ed-4061-b7a4-2577d2b33d11-kube-api-access-88478\") pod \"ingress-canary-np9bb\" (UID: \"6da97721-f2ed-4061-b7a4-2577d2b33d11\") " pod="openshift-ingress-canary/ingress-canary-np9bb" Apr 20 16:23:48.939157 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:48.938995 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrwll\" (UniqueName: \"kubernetes.io/projected/1d3f1231-a687-4b5c-b5b5-d078c34b831b-kube-api-access-vrwll\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:23:49.040114 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:49.040038 2571 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-88478\" (UniqueName: \"kubernetes.io/projected/6da97721-f2ed-4061-b7a4-2577d2b33d11-kube-api-access-88478\") pod \"ingress-canary-np9bb\" (UID: \"6da97721-f2ed-4061-b7a4-2577d2b33d11\") " pod="openshift-ingress-canary/ingress-canary-np9bb" Apr 20 16:23:49.040114 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:49.040073 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrwll\" (UniqueName: \"kubernetes.io/projected/1d3f1231-a687-4b5c-b5b5-d078c34b831b-kube-api-access-vrwll\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:23:49.040114 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:49.040111 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:23:49.040378 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:49.040154 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d3f1231-a687-4b5c-b5b5-d078c34b831b-config-volume\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:23:49.040378 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:49.040205 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert\") pod \"ingress-canary-np9bb\" (UID: \"6da97721-f2ed-4061-b7a4-2577d2b33d11\") " pod="openshift-ingress-canary/ingress-canary-np9bb" Apr 20 16:23:49.040378 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:49.040237 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d3f1231-a687-4b5c-b5b5-d078c34b831b-tmp-dir\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:23:49.040378 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:49.040256 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 16:23:49.040378 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:49.040329 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls podName:1d3f1231-a687-4b5c-b5b5-d078c34b831b nodeName:}" failed. No retries permitted until 2026-04-20 16:23:49.540307736 +0000 UTC m=+32.660484251 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls") pod "dns-default-mmqjp" (UID: "1d3f1231-a687-4b5c-b5b5-d078c34b831b") : secret "dns-default-metrics-tls" not found Apr 20 16:23:49.040378 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:49.040339 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 16:23:49.040378 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:49.040378 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert podName:6da97721-f2ed-4061-b7a4-2577d2b33d11 nodeName:}" failed. 
No retries permitted until 2026-04-20 16:23:49.540366262 +0000 UTC m=+32.660542761 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert") pod "ingress-canary-np9bb" (UID: "6da97721-f2ed-4061-b7a4-2577d2b33d11") : secret "canary-serving-cert" not found Apr 20 16:23:49.040604 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:49.040558 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d3f1231-a687-4b5c-b5b5-d078c34b831b-tmp-dir\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:23:49.040795 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:49.040779 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d3f1231-a687-4b5c-b5b5-d078c34b831b-config-volume\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:23:49.052336 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:49.052223 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrwll\" (UniqueName: \"kubernetes.io/projected/1d3f1231-a687-4b5c-b5b5-d078c34b831b-kube-api-access-vrwll\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:23:49.052447 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:49.052281 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88478\" (UniqueName: \"kubernetes.io/projected/6da97721-f2ed-4061-b7a4-2577d2b33d11-kube-api-access-88478\") pod \"ingress-canary-np9bb\" (UID: \"6da97721-f2ed-4061-b7a4-2577d2b33d11\") " pod="openshift-ingress-canary/ingress-canary-np9bb" Apr 20 16:23:49.544885 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:49.544849 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert\") pod \"ingress-canary-np9bb\" (UID: \"6da97721-f2ed-4061-b7a4-2577d2b33d11\") " pod="openshift-ingress-canary/ingress-canary-np9bb" Apr 20 16:23:49.545352 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:49.544905 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:23:49.545352 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:49.544999 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 16:23:49.545352 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:49.545005 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 16:23:49.545352 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:49.545057 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls podName:1d3f1231-a687-4b5c-b5b5-d078c34b831b nodeName:}" failed. No retries permitted until 2026-04-20 16:23:50.545040517 +0000 UTC m=+33.665217018 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls") pod "dns-default-mmqjp" (UID: "1d3f1231-a687-4b5c-b5b5-d078c34b831b") : secret "dns-default-metrics-tls" not found Apr 20 16:23:49.545352 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:49.545072 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert podName:6da97721-f2ed-4061-b7a4-2577d2b33d11 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:50.545066207 +0000 UTC m=+33.665242705 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert") pod "ingress-canary-np9bb" (UID: "6da97721-f2ed-4061-b7a4-2577d2b33d11") : secret "canary-serving-cert" not found Apr 20 16:23:50.148850 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:50.148808 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs\") pod \"network-metrics-daemon-rxwd9\" (UID: \"ff512ace-f73c-4265-890e-b43c9ecc782d\") " pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:50.149027 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:50.148965 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:50.149081 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:50.149037 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs podName:ff512ace-f73c-4265-890e-b43c9ecc782d nodeName:}" failed. No retries permitted until 2026-04-20 16:24:22.149019365 +0000 UTC m=+65.269195863 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs") pod "network-metrics-daemon-rxwd9" (UID: "ff512ace-f73c-4265-890e-b43c9ecc782d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:50.249586 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:50.249551 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mv65f\" (UniqueName: \"kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f\") pod \"network-check-target-t7sxd\" (UID: \"6e7ab3fb-b773-40fb-8dc4-b848f28093cb\") " pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:50.249781 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:50.249696 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 16:23:50.249781 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:50.249714 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 16:23:50.249781 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:50.249724 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mv65f for pod openshift-network-diagnostics/network-check-target-t7sxd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:50.249781 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:50.249780 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f podName:6e7ab3fb-b773-40fb-8dc4-b848f28093cb nodeName:}" failed. No retries permitted until 2026-04-20 16:24:22.249766141 +0000 UTC m=+65.369942639 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-mv65f" (UniqueName: "kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f") pod "network-check-target-t7sxd" (UID: "6e7ab3fb-b773-40fb-8dc4-b848f28093cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:50.417888 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:50.417805 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:23:50.418041 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:50.417814 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:23:50.422282 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:50.422230 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 16:23:50.423500 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:50.423473 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 16:23:50.423661 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:50.422238 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 16:23:50.423764 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:50.422240 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xjxmn\"" Apr 20 16:23:50.426802 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:50.422238 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hv7wf\"" Apr 20 16:23:50.551868 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:50.551827 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert\") pod \"ingress-canary-np9bb\" (UID: \"6da97721-f2ed-4061-b7a4-2577d2b33d11\") " pod="openshift-ingress-canary/ingress-canary-np9bb" Apr 20 16:23:50.552287 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:50.551932 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:23:50.552287 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:50.552007 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 16:23:50.552287 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:50.552050 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 16:23:50.552287 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:50.552072 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert podName:6da97721-f2ed-4061-b7a4-2577d2b33d11 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:52.552057108 +0000 UTC m=+35.672233606 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert") pod "ingress-canary-np9bb" (UID: "6da97721-f2ed-4061-b7a4-2577d2b33d11") : secret "canary-serving-cert" not found Apr 20 16:23:50.552287 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:50.552097 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls podName:1d3f1231-a687-4b5c-b5b5-d078c34b831b nodeName:}" failed. No retries permitted until 2026-04-20 16:23:52.552080197 +0000 UTC m=+35.672256710 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls") pod "dns-default-mmqjp" (UID: "1d3f1231-a687-4b5c-b5b5-d078c34b831b") : secret "dns-default-metrics-tls" not found Apr 20 16:23:52.568661 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:52.568625 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert\") pod \"ingress-canary-np9bb\" (UID: \"6da97721-f2ed-4061-b7a4-2577d2b33d11\") " pod="openshift-ingress-canary/ingress-canary-np9bb" Apr 20 16:23:52.569059 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:52.568684 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:23:52.569059 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:52.568774 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 16:23:52.569059 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:52.568777 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 16:23:52.569059 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:52.568823 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls podName:1d3f1231-a687-4b5c-b5b5-d078c34b831b nodeName:}" failed. No retries permitted until 2026-04-20 16:23:56.568809839 +0000 UTC m=+39.688986337 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls") pod "dns-default-mmqjp" (UID: "1d3f1231-a687-4b5c-b5b5-d078c34b831b") : secret "dns-default-metrics-tls" not found Apr 20 16:23:52.569059 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:52.568836 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert podName:6da97721-f2ed-4061-b7a4-2577d2b33d11 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:56.56883094 +0000 UTC m=+39.689007438 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert") pod "ingress-canary-np9bb" (UID: "6da97721-f2ed-4061-b7a4-2577d2b33d11") : secret "canary-serving-cert" not found Apr 20 16:23:52.675004 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:52.674971 2571 generic.go:358] "Generic (PLEG): container finished" podID="975fe906-de1d-4b78-9555-abc5fd12991c" containerID="f3acc18e1f4acfdd5c7b55acfc395c3bb44ca105574b6eb2d380eae8e1f621ec" exitCode=0 Apr 20 16:23:52.675151 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:52.675021 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jl5fs" event={"ID":"975fe906-de1d-4b78-9555-abc5fd12991c","Type":"ContainerDied","Data":"f3acc18e1f4acfdd5c7b55acfc395c3bb44ca105574b6eb2d380eae8e1f621ec"} Apr 20 16:23:53.681255 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:53.681221 2571 generic.go:358] "Generic (PLEG): container finished" podID="975fe906-de1d-4b78-9555-abc5fd12991c" containerID="3c5b7c1dbef0bd593e511f8c0096fd5de0b9001295b2861636bc2528eb2eb4d3" exitCode=0 Apr 20 16:23:53.681674 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:53.681295 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jl5fs" event={"ID":"975fe906-de1d-4b78-9555-abc5fd12991c","Type":"ContainerDied","Data":"3c5b7c1dbef0bd593e511f8c0096fd5de0b9001295b2861636bc2528eb2eb4d3"} Apr 20 16:23:54.687110 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:54.687077 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jl5fs" event={"ID":"975fe906-de1d-4b78-9555-abc5fd12991c","Type":"ContainerStarted","Data":"b250e6726ff5d69f1348377ccd584471777f0b75708b40aea66adb0f4a849b58"} Apr 20 16:23:54.717064 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:54.717016 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jl5fs" podStartSLOduration=4.721352397 podStartE2EDuration="37.717004325s" podCreationTimestamp="2026-04-20 16:23:17 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.69336723 +0000 UTC m=+1.813543732" lastFinishedPulling="2026-04-20 16:23:51.689019162 +0000 UTC m=+34.809195660" observedRunningTime="2026-04-20 16:23:54.715431146 +0000 UTC m=+37.835607668" watchObservedRunningTime="2026-04-20 16:23:54.717004325 +0000 UTC m=+37.837180845" Apr 20 16:23:56.598839 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:56.598799 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:23:56.599252 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:23:56.598863 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert\") pod \"ingress-canary-np9bb\" (UID: \"6da97721-f2ed-4061-b7a4-2577d2b33d11\") " pod="openshift-ingress-canary/ingress-canary-np9bb" Apr 20 16:23:56.599252 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:56.598951 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 16:23:56.599252 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:56.598963 2571 secret.go:189] Couldn't get 
secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 16:23:56.599252 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:56.599019 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls podName:1d3f1231-a687-4b5c-b5b5-d078c34b831b nodeName:}" failed. No retries permitted until 2026-04-20 16:24:04.59900208 +0000 UTC m=+47.719178583 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls") pod "dns-default-mmqjp" (UID: "1d3f1231-a687-4b5c-b5b5-d078c34b831b") : secret "dns-default-metrics-tls" not found Apr 20 16:23:56.599252 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:23:56.599033 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert podName:6da97721-f2ed-4061-b7a4-2577d2b33d11 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:04.599027324 +0000 UTC m=+47.719203822 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert") pod "ingress-canary-np9bb" (UID: "6da97721-f2ed-4061-b7a4-2577d2b33d11") : secret "canary-serving-cert" not found Apr 20 16:24:04.652244 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:04.652203 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:24:04.652803 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:04.652261 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert\") pod \"ingress-canary-np9bb\" (UID: \"6da97721-f2ed-4061-b7a4-2577d2b33d11\") " pod="openshift-ingress-canary/ingress-canary-np9bb" Apr 20 16:24:04.652803 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:24:04.652368 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 16:24:04.652803 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:24:04.652417 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 16:24:04.652803 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:24:04.652430 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert podName:6da97721-f2ed-4061-b7a4-2577d2b33d11 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:20.652415235 +0000 UTC m=+63.772591732 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert") pod "ingress-canary-np9bb" (UID: "6da97721-f2ed-4061-b7a4-2577d2b33d11") : secret "canary-serving-cert" not found Apr 20 16:24:04.652803 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:24:04.652522 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls podName:1d3f1231-a687-4b5c-b5b5-d078c34b831b nodeName:}" failed. No retries permitted until 2026-04-20 16:24:20.652498961 +0000 UTC m=+63.772675463 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls") pod "dns-default-mmqjp" (UID: "1d3f1231-a687-4b5c-b5b5-d078c34b831b") : secret "dns-default-metrics-tls" not found Apr 20 16:24:15.676843 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:15.676806 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-45msc" Apr 20 16:24:20.661451 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:20.661402 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:24:20.661907 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:20.661474 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert\") pod \"ingress-canary-np9bb\" (UID: \"6da97721-f2ed-4061-b7a4-2577d2b33d11\") " pod="openshift-ingress-canary/ingress-canary-np9bb" Apr 20 16:24:20.661907 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:24:20.661538 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 16:24:20.661907 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:24:20.661549 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 16:24:20.661907 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:24:20.661600 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls podName:1d3f1231-a687-4b5c-b5b5-d078c34b831b nodeName:}" failed. No retries permitted until 2026-04-20 16:24:52.661584645 +0000 UTC m=+95.781761142 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls") pod "dns-default-mmqjp" (UID: "1d3f1231-a687-4b5c-b5b5-d078c34b831b") : secret "dns-default-metrics-tls" not found Apr 20 16:24:20.661907 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:24:20.661615 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert podName:6da97721-f2ed-4061-b7a4-2577d2b33d11 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:52.661608655 +0000 UTC m=+95.781785152 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert") pod "ingress-canary-np9bb" (UID: "6da97721-f2ed-4061-b7a4-2577d2b33d11") : secret "canary-serving-cert" not found Apr 20 16:24:22.169269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:22.169236 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs\") pod \"network-metrics-daemon-rxwd9\" (UID: \"ff512ace-f73c-4265-890e-b43c9ecc782d\") " pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:24:22.172597 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:22.172574 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 16:24:22.180143 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:24:22.180121 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 16:24:22.180240 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:24:22.180213 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs podName:ff512ace-f73c-4265-890e-b43c9ecc782d nodeName:}" failed. No retries permitted until 2026-04-20 16:25:26.18018893 +0000 UTC m=+129.300365432 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs") pod "network-metrics-daemon-rxwd9" (UID: "ff512ace-f73c-4265-890e-b43c9ecc782d") : secret "metrics-daemon-secret" not found Apr 20 16:24:22.269946 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:22.269906 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mv65f\" (UniqueName: \"kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f\") pod \"network-check-target-t7sxd\" (UID: \"6e7ab3fb-b773-40fb-8dc4-b848f28093cb\") " pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:24:22.272826 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:22.272800 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 16:24:22.283520 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:22.283498 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 16:24:22.295407 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:22.295384 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv65f\" (UniqueName: \"kubernetes.io/projected/6e7ab3fb-b773-40fb-8dc4-b848f28093cb-kube-api-access-mv65f\") pod \"network-check-target-t7sxd\" (UID: \"6e7ab3fb-b773-40fb-8dc4-b848f28093cb\") " pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:24:22.534255 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:22.534177 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hv7wf\"" Apr 20 16:24:22.541453 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:22.541428 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:24:22.707243 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:22.707212 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t7sxd"] Apr 20 16:24:22.711607 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:24:22.711574 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e7ab3fb_b773_40fb_8dc4_b848f28093cb.slice/crio-cb2ac5ead386a645b600d03173f061023f3133ec449ff706f23ea5365c8d5fbe WatchSource:0}: Error finding container cb2ac5ead386a645b600d03173f061023f3133ec449ff706f23ea5365c8d5fbe: Status 404 returned error can't find the container with id cb2ac5ead386a645b600d03173f061023f3133ec449ff706f23ea5365c8d5fbe Apr 20 16:24:22.742045 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:22.742014 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t7sxd" event={"ID":"6e7ab3fb-b773-40fb-8dc4-b848f28093cb","Type":"ContainerStarted","Data":"cb2ac5ead386a645b600d03173f061023f3133ec449ff706f23ea5365c8d5fbe"} Apr 20 16:24:25.750084 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:25.750038 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t7sxd" event={"ID":"6e7ab3fb-b773-40fb-8dc4-b848f28093cb","Type":"ContainerStarted","Data":"78361a3b8db0982bcbf0dcc46c03aad63cacb2f5666f62bbef997ce7b9c44d6e"} Apr 20 16:24:25.750567 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:25.750179 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:24:25.767266 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:25.767222 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-t7sxd" podStartSLOduration=65.924508008 podStartE2EDuration="1m8.76721049s" podCreationTimestamp="2026-04-20 16:23:17 +0000 UTC" firstStartedPulling="2026-04-20 16:24:22.713484496 +0000 UTC m=+65.833660994" lastFinishedPulling="2026-04-20 16:24:25.556186971 +0000 UTC m=+68.676363476" observedRunningTime="2026-04-20 16:24:25.766336256 +0000 UTC m=+68.886512777" watchObservedRunningTime="2026-04-20 16:24:25.76721049 +0000 UTC m=+68.887387030" Apr 20 16:24:52.674486 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:52.674379 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:24:52.674486 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:52.674428 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert\") pod \"ingress-canary-np9bb\" (UID: \"6da97721-f2ed-4061-b7a4-2577d2b33d11\") " pod="openshift-ingress-canary/ingress-canary-np9bb" Apr 20 16:24:52.674953 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:24:52.674527 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 16:24:52.674953 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:24:52.674527 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: 
secret "dns-default-metrics-tls" not found Apr 20 16:24:52.674953 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:24:52.674590 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert podName:6da97721-f2ed-4061-b7a4-2577d2b33d11 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:56.674571742 +0000 UTC m=+159.794748242 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert") pod "ingress-canary-np9bb" (UID: "6da97721-f2ed-4061-b7a4-2577d2b33d11") : secret "canary-serving-cert" not found Apr 20 16:24:52.674953 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:24:52.674606 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls podName:1d3f1231-a687-4b5c-b5b5-d078c34b831b nodeName:}" failed. No retries permitted until 2026-04-20 16:25:56.674598285 +0000 UTC m=+159.794774782 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls") pod "dns-default-mmqjp" (UID: "1d3f1231-a687-4b5c-b5b5-d078c34b831b") : secret "dns-default-metrics-tls" not found Apr 20 16:24:56.754487 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:24:56.754461 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-t7sxd" Apr 20 16:25:26.201001 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:26.200947 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs\") pod \"network-metrics-daemon-rxwd9\" (UID: \"ff512ace-f73c-4265-890e-b43c9ecc782d\") " pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:25:26.201502 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:26.201113 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 16:25:26.201502 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:26.201212 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs podName:ff512ace-f73c-4265-890e-b43c9ecc782d nodeName:}" failed. No retries permitted until 2026-04-20 16:27:28.201194519 +0000 UTC m=+251.321371016 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs") pod "network-metrics-daemon-rxwd9" (UID: "ff512ace-f73c-4265-890e-b43c9ecc782d") : secret "metrics-daemon-secret" not found Apr 20 16:25:32.736865 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.736831 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q5q8v"] Apr 20 16:25:32.739733 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.739711 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q5q8v" Apr 20 16:25:32.739916 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.739896 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc"] Apr 20 16:25:32.742255 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.742239 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-bmpdr\"" Apr 20 16:25:32.742666 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.742653 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc" Apr 20 16:25:32.743351 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.743333 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 20 16:25:32.743453 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.743334 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 20 16:25:32.744802 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.744786 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 20 16:25:32.745239 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.745219 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 20 16:25:32.745349 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.745228 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 20 16:25:32.745424 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.745375 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-4sp4j\"" Apr 20 16:25:32.748855 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.748832 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q5q8v"] Apr 20 16:25:32.752302 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.752280 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc"] Apr 20 16:25:32.844827 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.844794 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh"] Apr 20 16:25:32.847050 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.847028 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55vbd\" (UniqueName: \"kubernetes.io/projected/a9278d9f-91d1-433a-b1fa-73f5e63273b2-kube-api-access-55vbd\") pod \"cluster-samples-operator-6dc5bdb6b4-gbnfc\" (UID: \"a9278d9f-91d1-433a-b1fa-73f5e63273b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc" Apr 20 16:25:32.847147 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.847090 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7nxr9\" (UniqueName: \"kubernetes.io/projected/79391e4d-48bf-4541-a5e5-ad615c67502e-kube-api-access-7nxr9\") pod \"volume-data-source-validator-7c6cbb6c87-q5q8v\" (UID: \"79391e4d-48bf-4541-a5e5-ad615c67502e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q5q8v" Apr 20 16:25:32.847147 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.847124 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gbnfc\" (UID: \"a9278d9f-91d1-433a-b1fa-73f5e63273b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc" Apr 20 16:25:32.847668 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.847642 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb"] Apr 20 16:25:32.847793 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.847779 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh" Apr 20 16:25:32.850228 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.850208 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 20 16:25:32.850428 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.850414 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 20 16:25:32.850520 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.850481 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-8dbb8fdd6-27rh2"] Apr 20 16:25:32.850520 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.850501 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-p5cj4\"" Apr 20 16:25:32.850626 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.850552 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 20 16:25:32.850626 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.850592 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" Apr 20 16:25:32.850782 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.850761 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 20 16:25:32.852798 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.852778 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 20 16:25:32.853065 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.853045 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 16:25:32.853142 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.853103 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 16:25:32.853248 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.853224 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:32.854454 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.854435 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-7d49d\"" Apr 20 16:25:32.854454 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.854448 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 20 16:25:32.855471 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.855448 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 20 16:25:32.855622 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.855527 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 16:25:32.855622 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.855550 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 20 16:25:32.855957 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.855940 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 20 16:25:32.856131 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.856118 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 16:25:32.856227 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.856205 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-9q4g6\"" Apr 20 16:25:32.856304 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.856287 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 16:25:32.861420 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.861400 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh"] Apr 20 16:25:32.862134 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.862110 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb"] Apr 20 16:25:32.862879 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.862823 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-8dbb8fdd6-27rh2"] Apr 20 16:25:32.948091 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.948059 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-rqblb\" (UID: \"b2a0a168-de3a-4df1-89e1-c8d831ef5ada\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" Apr 20 16:25:32.948091 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.948094 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-stats-auth\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:32.948342 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.948112 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:32.948342 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.948187 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/164b3066-3171-4ff5-b023-f49f644b1d28-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-26lqh\" (UID: \"164b3066-3171-4ff5-b023-f49f644b1d28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh" Apr 20 16:25:32.948342 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.948230 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqblb\" (UID: \"b2a0a168-de3a-4df1-89e1-c8d831ef5ada\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" Apr 20 16:25:32.948342 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.948253 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q48w\" (UniqueName: \"kubernetes.io/projected/542e12c0-876b-401f-b987-efaf65039572-kube-api-access-5q48w\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:32.948342 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.948270 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx2pb\" (UniqueName: \"kubernetes.io/projected/164b3066-3171-4ff5-b023-f49f644b1d28-kube-api-access-gx2pb\") pod \"kube-storage-version-migrator-operator-6769c5d45-26lqh\" (UID: \"164b3066-3171-4ff5-b023-f49f644b1d28\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh" Apr 20 16:25:32.948342 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.948305 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg5l9\" (UniqueName: \"kubernetes.io/projected/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-kube-api-access-jg5l9\") pod \"cluster-monitoring-operator-75587bd455-rqblb\" (UID: \"b2a0a168-de3a-4df1-89e1-c8d831ef5ada\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" Apr 20 16:25:32.948603 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.948357 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nxr9\" (UniqueName: \"kubernetes.io/projected/79391e4d-48bf-4541-a5e5-ad615c67502e-kube-api-access-7nxr9\") pod \"volume-data-source-validator-7c6cbb6c87-q5q8v\" (UID: \"79391e4d-48bf-4541-a5e5-ad615c67502e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q5q8v" Apr 20 16:25:32.948603 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.948376 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gbnfc\" (UID: \"a9278d9f-91d1-433a-b1fa-73f5e63273b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc" Apr 20 16:25:32.948603 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.948394 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-default-certificate\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:32.948603 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.948447 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:32.948603 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:32.948481 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 16:25:32.948603 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.948519 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/164b3066-3171-4ff5-b023-f49f644b1d28-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-26lqh\" (UID: \"164b3066-3171-4ff5-b023-f49f644b1d28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh" Apr 20 16:25:32.948603 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:32.948547 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls podName:a9278d9f-91d1-433a-b1fa-73f5e63273b2 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:33.44852573 +0000 UTC m=+136.568702228 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gbnfc" (UID: "a9278d9f-91d1-433a-b1fa-73f5e63273b2") : secret "samples-operator-tls" not found Apr 20 16:25:32.948847 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.948652 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55vbd\" (UniqueName: \"kubernetes.io/projected/a9278d9f-91d1-433a-b1fa-73f5e63273b2-kube-api-access-55vbd\") pod \"cluster-samples-operator-6dc5bdb6b4-gbnfc\" (UID: \"a9278d9f-91d1-433a-b1fa-73f5e63273b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc" Apr 20 16:25:32.963180 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.963137 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55vbd\" (UniqueName: \"kubernetes.io/projected/a9278d9f-91d1-433a-b1fa-73f5e63273b2-kube-api-access-55vbd\") pod \"cluster-samples-operator-6dc5bdb6b4-gbnfc\" (UID: \"a9278d9f-91d1-433a-b1fa-73f5e63273b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc" Apr 20 16:25:32.963320 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:32.963191 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nxr9\" (UniqueName: \"kubernetes.io/projected/79391e4d-48bf-4541-a5e5-ad615c67502e-kube-api-access-7nxr9\") pod \"volume-data-source-validator-7c6cbb6c87-q5q8v\" (UID: \"79391e4d-48bf-4541-a5e5-ad615c67502e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q5q8v" Apr 20 16:25:33.049213 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.049113 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/164b3066-3171-4ff5-b023-f49f644b1d28-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-26lqh\" (UID: \"164b3066-3171-4ff5-b023-f49f644b1d28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh" Apr 20 16:25:33.049213 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.049152 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-rqblb\" (UID: \"b2a0a168-de3a-4df1-89e1-c8d831ef5ada\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" Apr 20 16:25:33.049213 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.049185 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-stats-auth\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:33.049213 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.049201 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:33.049527 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.049223 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/164b3066-3171-4ff5-b023-f49f644b1d28-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-26lqh\" (UID: \"164b3066-3171-4ff5-b023-f49f644b1d28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh" Apr 20 16:25:33.049527 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.049249 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqblb\" (UID: \"b2a0a168-de3a-4df1-89e1-c8d831ef5ada\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" Apr 20 16:25:33.049527 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.049275 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5q48w\" (UniqueName: \"kubernetes.io/projected/542e12c0-876b-401f-b987-efaf65039572-kube-api-access-5q48w\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:33.049527 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.049290 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q5q8v" Apr 20 16:25:33.049527 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.049302 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gx2pb\" (UniqueName: \"kubernetes.io/projected/164b3066-3171-4ff5-b023-f49f644b1d28-kube-api-access-gx2pb\") pod \"kube-storage-version-migrator-operator-6769c5d45-26lqh\" (UID: \"164b3066-3171-4ff5-b023-f49f644b1d28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh" Apr 20 16:25:33.049527 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:33.049313 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 16:25:33.049527 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.049346 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg5l9\" (UniqueName: \"kubernetes.io/projected/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-kube-api-access-jg5l9\") pod \"cluster-monitoring-operator-75587bd455-rqblb\" (UID: \"b2a0a168-de3a-4df1-89e1-c8d831ef5ada\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" Apr 20 16:25:33.049527 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:33.049381 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs podName:542e12c0-876b-401f-b987-efaf65039572 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:33.549360185 +0000 UTC m=+136.669536705 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs") pod "router-default-8dbb8fdd6-27rh2" (UID: "542e12c0-876b-401f-b987-efaf65039572") : secret "router-metrics-certs-default" not found Apr 20 16:25:33.049527 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.049424 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-default-certificate\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:33.049934 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.049463 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:33.054375 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.050241 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/164b3066-3171-4ff5-b023-f49f644b1d28-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-26lqh\" (UID: \"164b3066-3171-4ff5-b023-f49f644b1d28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh" Apr 20 16:25:33.054375 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:33.050486 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 16:25:33.054375 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:33.051129 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls podName:b2a0a168-de3a-4df1-89e1-c8d831ef5ada nodeName:}" failed. No retries permitted until 2026-04-20 16:25:33.550955088 +0000 UTC m=+136.671131678 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rqblb" (UID: "b2a0a168-de3a-4df1-89e1-c8d831ef5ada") : secret "cluster-monitoring-operator-tls" not found Apr 20 16:25:33.054375 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.051416 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-rqblb\" (UID: \"b2a0a168-de3a-4df1-89e1-c8d831ef5ada\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" Apr 20 16:25:33.054375 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:33.051460 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle podName:542e12c0-876b-401f-b987-efaf65039572 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:33.551440299 +0000 UTC m=+136.671616822 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle") pod "router-default-8dbb8fdd6-27rh2" (UID: "542e12c0-876b-401f-b987-efaf65039572") : configmap references non-existent config key: service-ca.crt Apr 20 16:25:33.054375 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.053433 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/164b3066-3171-4ff5-b023-f49f644b1d28-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-26lqh\" (UID: \"164b3066-3171-4ff5-b023-f49f644b1d28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh" Apr 20 16:25:33.055308 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.054909 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-default-certificate\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:33.056453 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.056431 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-stats-auth\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:33.067369 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.067347 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q48w\" (UniqueName: \"kubernetes.io/projected/542e12c0-876b-401f-b987-efaf65039572-kube-api-access-5q48w\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:33.067489 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.067436 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg5l9\" (UniqueName: \"kubernetes.io/projected/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-kube-api-access-jg5l9\") pod \"cluster-monitoring-operator-75587bd455-rqblb\" (UID: \"b2a0a168-de3a-4df1-89e1-c8d831ef5ada\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" Apr 20 16:25:33.068088 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.068067 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx2pb\" (UniqueName: \"kubernetes.io/projected/164b3066-3171-4ff5-b023-f49f644b1d28-kube-api-access-gx2pb\") pod \"kube-storage-version-migrator-operator-6769c5d45-26lqh\" (UID: \"164b3066-3171-4ff5-b023-f49f644b1d28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh" Apr 20 16:25:33.158235 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.158205 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh" Apr 20 16:25:33.159961 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.159933 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q5q8v"] Apr 20 16:25:33.164330 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:25:33.164305 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79391e4d_48bf_4541_a5e5_ad615c67502e.slice/crio-f6d69689e80381321ff9ad49421b5982f70f67f82fb4a612b29b43b6fdee072f WatchSource:0}: Error finding container f6d69689e80381321ff9ad49421b5982f70f67f82fb4a612b29b43b6fdee072f: Status 404 returned error can't find the container with id f6d69689e80381321ff9ad49421b5982f70f67f82fb4a612b29b43b6fdee072f Apr 20 16:25:33.275906 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.275873 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh"] Apr 20 16:25:33.278825 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:25:33.278797 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod164b3066_3171_4ff5_b023_f49f644b1d28.slice/crio-ec46028d45870e0778c0ef50f5108a565603e7496edcda73b1823b310928a060 WatchSource:0}: Error finding container ec46028d45870e0778c0ef50f5108a565603e7496edcda73b1823b310928a060: Status 404 returned error can't find the container with id ec46028d45870e0778c0ef50f5108a565603e7496edcda73b1823b310928a060 Apr 20 16:25:33.453002 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.452971 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gbnfc\" (UID: \"a9278d9f-91d1-433a-b1fa-73f5e63273b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc" Apr 20 16:25:33.453229 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:33.453097 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 16:25:33.453229 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:33.453190 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls podName:a9278d9f-91d1-433a-b1fa-73f5e63273b2 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:34.453144826 +0000 UTC m=+137.573321324 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gbnfc" (UID: "a9278d9f-91d1-433a-b1fa-73f5e63273b2") : secret "samples-operator-tls" not found Apr 20 16:25:33.553532 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.553497 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:33.553741 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.553571 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:33.553741 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.553601 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqblb\" (UID: \"b2a0a168-de3a-4df1-89e1-c8d831ef5ada\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" Apr 20 16:25:33.553741 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:33.553686 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle podName:542e12c0-876b-401f-b987-efaf65039572 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:34.553663524 +0000 UTC m=+137.673840027 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle") pod "router-default-8dbb8fdd6-27rh2" (UID: "542e12c0-876b-401f-b987-efaf65039572") : configmap references non-existent config key: service-ca.crt Apr 20 16:25:33.553741 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:33.553714 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 16:25:33.553741 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:33.553718 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 16:25:33.553990 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:33.553772 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs podName:542e12c0-876b-401f-b987-efaf65039572 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:34.553758042 +0000 UTC m=+137.673934562 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs") pod "router-default-8dbb8fdd6-27rh2" (UID: "542e12c0-876b-401f-b987-efaf65039572") : secret "router-metrics-certs-default" not found Apr 20 16:25:33.553990 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:33.553796 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls podName:b2a0a168-de3a-4df1-89e1-c8d831ef5ada nodeName:}" failed. No retries permitted until 2026-04-20 16:25:34.553779948 +0000 UTC m=+137.673956462 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rqblb" (UID: "b2a0a168-de3a-4df1-89e1-c8d831ef5ada") : secret "cluster-monitoring-operator-tls" not found Apr 20 16:25:33.878562 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.878516 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q5q8v" event={"ID":"79391e4d-48bf-4541-a5e5-ad615c67502e","Type":"ContainerStarted","Data":"f6d69689e80381321ff9ad49421b5982f70f67f82fb4a612b29b43b6fdee072f"} Apr 20 16:25:33.879862 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:33.879822 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh" event={"ID":"164b3066-3171-4ff5-b023-f49f644b1d28","Type":"ContainerStarted","Data":"ec46028d45870e0778c0ef50f5108a565603e7496edcda73b1823b310928a060"} Apr 20 16:25:34.461729 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:34.461691 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gbnfc\" (UID: \"a9278d9f-91d1-433a-b1fa-73f5e63273b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc" Apr 20 16:25:34.461922 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:34.461881 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 16:25:34.461990 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:34.461962 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls podName:a9278d9f-91d1-433a-b1fa-73f5e63273b2 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:36.461939377 +0000 UTC m=+139.582115878 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gbnfc" (UID: "a9278d9f-91d1-433a-b1fa-73f5e63273b2") : secret "samples-operator-tls" not found Apr 20 16:25:34.563082 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:34.563030 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:34.563279 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:34.563132 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:34.563279 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:34.563188 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqblb\" (UID: \"b2a0a168-de3a-4df1-89e1-c8d831ef5ada\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" Apr 20 16:25:34.563279 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:34.563241 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle podName:542e12c0-876b-401f-b987-efaf65039572 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:36.563217163 +0000 UTC m=+139.683393670 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle") pod "router-default-8dbb8fdd6-27rh2" (UID: "542e12c0-876b-401f-b987-efaf65039572") : configmap references non-existent config key: service-ca.crt Apr 20 16:25:34.563452 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:34.563287 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 16:25:34.563452 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:34.563287 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 16:25:34.563452 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:34.563338 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs podName:542e12c0-876b-401f-b987-efaf65039572 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:36.563326469 +0000 UTC m=+139.683502972 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs") pod "router-default-8dbb8fdd6-27rh2" (UID: "542e12c0-876b-401f-b987-efaf65039572") : secret "router-metrics-certs-default" not found Apr 20 16:25:34.563452 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:34.563352 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls podName:b2a0a168-de3a-4df1-89e1-c8d831ef5ada nodeName:}" failed. No retries permitted until 2026-04-20 16:25:36.563346405 +0000 UTC m=+139.683522902 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rqblb" (UID: "b2a0a168-de3a-4df1-89e1-c8d831ef5ada") : secret "cluster-monitoring-operator-tls" not found Apr 20 16:25:34.882952 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:34.882908 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q5q8v" event={"ID":"79391e4d-48bf-4541-a5e5-ad615c67502e","Type":"ContainerStarted","Data":"148e1133e165e818de0d4ed93275f30be0f29a3c48b38094e72d72a6fdf8a31f"} Apr 20 16:25:34.897842 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:34.897794 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q5q8v" podStartSLOduration=1.442877932 podStartE2EDuration="2.897779202s" podCreationTimestamp="2026-04-20 16:25:32 +0000 UTC" firstStartedPulling="2026-04-20 16:25:33.165874296 +0000 UTC m=+136.286050794" lastFinishedPulling="2026-04-20 16:25:34.620775566 +0000 UTC m=+137.740952064" observedRunningTime="2026-04-20 16:25:34.897554448 +0000 UTC m=+138.017730979" watchObservedRunningTime="2026-04-20 16:25:34.897779202 +0000 UTC m=+138.017955722" Apr 20 16:25:35.886357 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:35.886319 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh" event={"ID":"164b3066-3171-4ff5-b023-f49f644b1d28","Type":"ContainerStarted","Data":"eb3d075aacf7f6400164f0122f8792c74e49a6149c18dee608eef6b4b0f1bfde"} Apr 20 16:25:35.902474 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:35.902417 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh" podStartSLOduration=1.449844317 podStartE2EDuration="3.902398039s" podCreationTimestamp="2026-04-20 16:25:32 +0000 UTC" firstStartedPulling="2026-04-20 16:25:33.280663101 +0000 UTC m=+136.400839600" lastFinishedPulling="2026-04-20 16:25:35.733216821 +0000 UTC m=+138.853393322" observedRunningTime="2026-04-20 16:25:35.901415276 +0000 UTC m=+139.021591796" watchObservedRunningTime="2026-04-20 16:25:35.902398039 +0000 UTC m=+139.022574558" Apr 20 16:25:36.480371 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:36.480316 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gbnfc\" (UID: 
\"a9278d9f-91d1-433a-b1fa-73f5e63273b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc" Apr 20 16:25:36.480546 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:36.480470 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 16:25:36.480546 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:36.480541 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls podName:a9278d9f-91d1-433a-b1fa-73f5e63273b2 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:40.48052263 +0000 UTC m=+143.600699147 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gbnfc" (UID: "a9278d9f-91d1-433a-b1fa-73f5e63273b2") : secret "samples-operator-tls" not found Apr 20 16:25:36.580763 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:36.580711 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:36.580957 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:36.580808 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:36.580957 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:36.580840 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqblb\" (UID: \"b2a0a168-de3a-4df1-89e1-c8d831ef5ada\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" Apr 20 16:25:36.580957 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:36.580881 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle podName:542e12c0-876b-401f-b987-efaf65039572 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:40.580861566 +0000 UTC m=+143.701038084 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle") pod "router-default-8dbb8fdd6-27rh2" (UID: "542e12c0-876b-401f-b987-efaf65039572") : configmap references non-existent config key: service-ca.crt Apr 20 16:25:36.580957 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:36.580931 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 16:25:36.581122 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:36.580964 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 16:25:36.581122 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:36.580969 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls podName:b2a0a168-de3a-4df1-89e1-c8d831ef5ada nodeName:}" failed. No retries permitted until 2026-04-20 16:25:40.580961769 +0000 UTC m=+143.701138266 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rqblb" (UID: "b2a0a168-de3a-4df1-89e1-c8d831ef5ada") : secret "cluster-monitoring-operator-tls" not found Apr 20 16:25:36.581122 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:36.581039 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs podName:542e12c0-876b-401f-b987-efaf65039572 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:40.581022749 +0000 UTC m=+143.701199260 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs") pod "router-default-8dbb8fdd6-27rh2" (UID: "542e12c0-876b-401f-b987-efaf65039572") : secret "router-metrics-certs-default" not found Apr 20 16:25:40.047421 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.047390 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-pmgvs"] Apr 20 16:25:40.050524 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.050508 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-pmgvs" Apr 20 16:25:40.053333 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.053309 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 20 16:25:40.053447 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.053346 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 20 16:25:40.053447 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.053362 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 20 16:25:40.053447 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.053348 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 20 16:25:40.054448 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.054434 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-b2tss\"" Apr 20 16:25:40.056963 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.056872 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-pmgvs"] Apr 20 16:25:40.209450 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.209416 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/08bf7a79-1342-4516-8bd4-ece84345feb1-signing-key\") pod \"service-ca-865cb79987-pmgvs\" (UID: \"08bf7a79-1342-4516-8bd4-ece84345feb1\") " pod="openshift-service-ca/service-ca-865cb79987-pmgvs" Apr 20 16:25:40.209450 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.209449 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r88x8\" (UniqueName: \"kubernetes.io/projected/08bf7a79-1342-4516-8bd4-ece84345feb1-kube-api-access-r88x8\") pod \"service-ca-865cb79987-pmgvs\" (UID: \"08bf7a79-1342-4516-8bd4-ece84345feb1\") " pod="openshift-service-ca/service-ca-865cb79987-pmgvs" Apr 20 16:25:40.209666 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.209479 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/08bf7a79-1342-4516-8bd4-ece84345feb1-signing-cabundle\") pod \"service-ca-865cb79987-pmgvs\" (UID: \"08bf7a79-1342-4516-8bd4-ece84345feb1\") " pod="openshift-service-ca/service-ca-865cb79987-pmgvs" Apr 20 16:25:40.310299 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.310210 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/08bf7a79-1342-4516-8bd4-ece84345feb1-signing-key\") pod \"service-ca-865cb79987-pmgvs\" (UID: \"08bf7a79-1342-4516-8bd4-ece84345feb1\") " pod="openshift-service-ca/service-ca-865cb79987-pmgvs" Apr 20 16:25:40.310299 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.310252 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r88x8\" (UniqueName: \"kubernetes.io/projected/08bf7a79-1342-4516-8bd4-ece84345feb1-kube-api-access-r88x8\") pod \"service-ca-865cb79987-pmgvs\" (UID: \"08bf7a79-1342-4516-8bd4-ece84345feb1\") " pod="openshift-service-ca/service-ca-865cb79987-pmgvs" Apr 20 16:25:40.310299 ip-10-0-130-72 kubenswrapper[2571]: I0420 
16:25:40.310284 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/08bf7a79-1342-4516-8bd4-ece84345feb1-signing-cabundle\") pod \"service-ca-865cb79987-pmgvs\" (UID: \"08bf7a79-1342-4516-8bd4-ece84345feb1\") " pod="openshift-service-ca/service-ca-865cb79987-pmgvs" Apr 20 16:25:40.310866 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.310841 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/08bf7a79-1342-4516-8bd4-ece84345feb1-signing-cabundle\") pod \"service-ca-865cb79987-pmgvs\" (UID: \"08bf7a79-1342-4516-8bd4-ece84345feb1\") " pod="openshift-service-ca/service-ca-865cb79987-pmgvs" Apr 20 16:25:40.312751 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.312730 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/08bf7a79-1342-4516-8bd4-ece84345feb1-signing-key\") pod \"service-ca-865cb79987-pmgvs\" (UID: \"08bf7a79-1342-4516-8bd4-ece84345feb1\") " pod="openshift-service-ca/service-ca-865cb79987-pmgvs" Apr 20 16:25:40.318724 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.318704 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r88x8\" (UniqueName: \"kubernetes.io/projected/08bf7a79-1342-4516-8bd4-ece84345feb1-kube-api-access-r88x8\") pod \"service-ca-865cb79987-pmgvs\" (UID: \"08bf7a79-1342-4516-8bd4-ece84345feb1\") " pod="openshift-service-ca/service-ca-865cb79987-pmgvs" Apr 20 16:25:40.359690 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.359666 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-pmgvs" Apr 20 16:25:40.425303 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.425275 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hkvft_2e64cb9b-1c5d-4de7-9ce4-b673e8576d87/dns-node-resolver/0.log" Apr 20 16:25:40.476463 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.476431 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-pmgvs"] Apr 20 16:25:40.480220 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:25:40.480189 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08bf7a79_1342_4516_8bd4_ece84345feb1.slice/crio-59aad072818d9be8c45743b18fe847ca4c1ab8bc84f24c7bcc0e6da1f07b7f32 WatchSource:0}: Error finding container 59aad072818d9be8c45743b18fe847ca4c1ab8bc84f24c7bcc0e6da1f07b7f32: Status 404 returned error can't find the container with id 59aad072818d9be8c45743b18fe847ca4c1ab8bc84f24c7bcc0e6da1f07b7f32 Apr 20 16:25:40.512254 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.512226 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gbnfc\" (UID: \"a9278d9f-91d1-433a-b1fa-73f5e63273b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc" Apr 20 16:25:40.512409 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:40.512390 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 16:25:40.512491 ip-10-0-130-72 kubenswrapper[2571]: E0420 
16:25:40.512480 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls podName:a9278d9f-91d1-433a-b1fa-73f5e63273b2 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:48.512458682 +0000 UTC m=+151.632635195 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gbnfc" (UID: "a9278d9f-91d1-433a-b1fa-73f5e63273b2") : secret "samples-operator-tls" not found Apr 20 16:25:40.612865 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.612835 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:40.613027 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.612879 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqblb\" (UID: \"b2a0a168-de3a-4df1-89e1-c8d831ef5ada\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" Apr 20 16:25:40.613027 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.612958 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:40.613027 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:40.612992 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 16:25:40.613211 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:40.613060 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs podName:542e12c0-876b-401f-b987-efaf65039572 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:48.613041707 +0000 UTC m=+151.733218219 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs") pod "router-default-8dbb8fdd6-27rh2" (UID: "542e12c0-876b-401f-b987-efaf65039572") : secret "router-metrics-certs-default" not found Apr 20 16:25:40.613211 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:40.613075 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle podName:542e12c0-876b-401f-b987-efaf65039572 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:48.613069213 +0000 UTC m=+151.733245711 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle") pod "router-default-8dbb8fdd6-27rh2" (UID: "542e12c0-876b-401f-b987-efaf65039572") : configmap references non-existent config key: service-ca.crt Apr 20 16:25:40.613211 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:40.613085 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 16:25:40.613211 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:40.613151 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls podName:b2a0a168-de3a-4df1-89e1-c8d831ef5ada nodeName:}" failed. No retries permitted until 2026-04-20 16:25:48.613133484 +0000 UTC m=+151.733309996 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rqblb" (UID: "b2a0a168-de3a-4df1-89e1-c8d831ef5ada") : secret "cluster-monitoring-operator-tls" not found Apr 20 16:25:40.897864 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:40.897774 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-pmgvs" event={"ID":"08bf7a79-1342-4516-8bd4-ece84345feb1","Type":"ContainerStarted","Data":"59aad072818d9be8c45743b18fe847ca4c1ab8bc84f24c7bcc0e6da1f07b7f32"} Apr 20 16:25:41.426254 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:41.426224 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cklt7_842542e9-94b5-494f-8110-018afb1c0a5f/node-ca/0.log" Apr 20 16:25:42.902954 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:42.902916 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-pmgvs" event={"ID":"08bf7a79-1342-4516-8bd4-ece84345feb1","Type":"ContainerStarted","Data":"5fb4ba77c8176d3b979d7525854d25aec928cb6d802c0a55db6ecbd03456e545"} Apr 20 16:25:42.920667 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:42.920619 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-pmgvs" podStartSLOduration=1.118502814 podStartE2EDuration="2.920604356s" podCreationTimestamp="2026-04-20 16:25:40 +0000 UTC" firstStartedPulling="2026-04-20 16:25:40.482095841 +0000 UTC m=+143.602272341" lastFinishedPulling="2026-04-20 16:25:42.284197385 +0000 UTC m=+145.404373883" observedRunningTime="2026-04-20 16:25:42.920374705 +0000 UTC m=+146.040551226" watchObservedRunningTime="2026-04-20 16:25:42.920604356 +0000 UTC m=+146.040780905" Apr 20 16:25:43.227702 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:43.227620 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-26lqh_164b3066-3171-4ff5-b023-f49f644b1d28/kube-storage-version-migrator-operator/0.log" Apr 20 16:25:48.578715 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:48.578684 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gbnfc\" (UID: 
\"a9278d9f-91d1-433a-b1fa-73f5e63273b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc" Apr 20 16:25:48.581533 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:48.581489 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9278d9f-91d1-433a-b1fa-73f5e63273b2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gbnfc\" (UID: \"a9278d9f-91d1-433a-b1fa-73f5e63273b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc" Apr 20 16:25:48.655103 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:48.655060 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc" Apr 20 16:25:48.680098 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:48.680066 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:48.680291 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:48.680122 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:25:48.680291 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:48.680143 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqblb\" (UID: \"b2a0a168-de3a-4df1-89e1-c8d831ef5ada\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" Apr 20 16:25:48.680291 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:48.680262 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 16:25:48.680291 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:48.680264 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 16:25:48.680498 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:48.680300 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle podName:542e12c0-876b-401f-b987-efaf65039572 nodeName:}" failed. No retries permitted until 2026-04-20 16:26:04.680279429 +0000 UTC m=+167.800455944 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle") pod "router-default-8dbb8fdd6-27rh2" (UID: "542e12c0-876b-401f-b987-efaf65039572") : configmap references non-existent config key: service-ca.crt Apr 20 16:25:48.680498 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:48.680326 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls podName:b2a0a168-de3a-4df1-89e1-c8d831ef5ada nodeName:}" failed. No retries permitted until 2026-04-20 16:26:04.680314751 +0000 UTC m=+167.800491248 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rqblb" (UID: "b2a0a168-de3a-4df1-89e1-c8d831ef5ada") : secret "cluster-monitoring-operator-tls" not found Apr 20 16:25:48.680498 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:48.680349 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs podName:542e12c0-876b-401f-b987-efaf65039572 nodeName:}" failed. No retries permitted until 2026-04-20 16:26:04.680340101 +0000 UTC m=+167.800516601 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs") pod "router-default-8dbb8fdd6-27rh2" (UID: "542e12c0-876b-401f-b987-efaf65039572") : secret "router-metrics-certs-default" not found Apr 20 16:25:48.775103 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:48.775074 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc"] Apr 20 16:25:48.915882 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:48.915842 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc" event={"ID":"a9278d9f-91d1-433a-b1fa-73f5e63273b2","Type":"ContainerStarted","Data":"cdf266305b699badc028ae4a7beafdf7afd58ec0bb06a9cbaae259d7aec60c2e"} Apr 20 16:25:50.922488 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:50.922461 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc" event={"ID":"a9278d9f-91d1-433a-b1fa-73f5e63273b2","Type":"ContainerStarted","Data":"bb3a8d7495d294bcca56ffffafef08a19e7165f83e1b9cce12528841116426e9"} Apr 20 16:25:51.852295 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:51.852250 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-mmqjp" podUID="1d3f1231-a687-4b5c-b5b5-d078c34b831b" Apr 20 16:25:51.871343 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:51.871304 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-np9bb" podUID="6da97721-f2ed-4061-b7a4-2577d2b33d11" Apr 20 16:25:51.926337 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:51.926300 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc" event={"ID":"a9278d9f-91d1-433a-b1fa-73f5e63273b2","Type":"ContainerStarted","Data":"e4359b237b9fec7db57f3a78ed7eeda705c2a54663ac41eccd40bb364de854bb"} Apr 20 16:25:51.926337 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:51.926337 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mmqjp" Apr 20 16:25:51.942910 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:51.942854 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gbnfc" podStartSLOduration=17.910926725 podStartE2EDuration="19.942842369s" podCreationTimestamp="2026-04-20 16:25:32 +0000 UTC" firstStartedPulling="2026-04-20 16:25:48.817895676 +0000 UTC m=+151.938072178" lastFinishedPulling="2026-04-20 16:25:50.849811307 +0000 UTC m=+153.969987822" observedRunningTime="2026-04-20 16:25:51.941196103 +0000 UTC m=+155.061372619" watchObservedRunningTime="2026-04-20 16:25:51.942842369 +0000 UTC m=+155.063018889" Apr 20 16:25:53.438331 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:25:53.438290 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-rxwd9" podUID="ff512ace-f73c-4265-890e-b43c9ecc782d" Apr 20 16:25:56.747724 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:56.747681 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert\") pod \"ingress-canary-np9bb\" (UID: \"6da97721-f2ed-4061-b7a4-2577d2b33d11\") " pod="openshift-ingress-canary/ingress-canary-np9bb" Apr 20 16:25:56.748117 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:56.747747 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:25:56.750179 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:56.750144 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1d3f1231-a687-4b5c-b5b5-d078c34b831b-metrics-tls\") pod \"dns-default-mmqjp\" (UID: \"1d3f1231-a687-4b5c-b5b5-d078c34b831b\") " pod="openshift-dns/dns-default-mmqjp" Apr 20 16:25:56.750297 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:56.750249 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6da97721-f2ed-4061-b7a4-2577d2b33d11-cert\") pod \"ingress-canary-np9bb\" (UID: \"6da97721-f2ed-4061-b7a4-2577d2b33d11\") " pod="openshift-ingress-canary/ingress-canary-np9bb" Apr 20 16:25:57.029118 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:57.029037 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rqfxd\"" Apr 20 16:25:57.037222 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:57.037204 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mmqjp" Apr 20 16:25:57.152370 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:57.152333 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mmqjp"] Apr 20 16:25:57.155955 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:25:57.155928 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d3f1231_a687_4b5c_b5b5_d078c34b831b.slice/crio-6804b1ff529ff6cfe0bef8fbb36cc88ed84e27f66b0f5eb61897552975ad4c08 WatchSource:0}: Error finding container 6804b1ff529ff6cfe0bef8fbb36cc88ed84e27f66b0f5eb61897552975ad4c08: Status 404 returned error can't find the container with id 6804b1ff529ff6cfe0bef8fbb36cc88ed84e27f66b0f5eb61897552975ad4c08 Apr 20 16:25:57.943735 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:57.943695 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mmqjp" event={"ID":"1d3f1231-a687-4b5c-b5b5-d078c34b831b","Type":"ContainerStarted","Data":"6804b1ff529ff6cfe0bef8fbb36cc88ed84e27f66b0f5eb61897552975ad4c08"} Apr 20 16:25:58.948212 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:58.948150 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mmqjp" event={"ID":"1d3f1231-a687-4b5c-b5b5-d078c34b831b","Type":"ContainerStarted","Data":"d899a18510aa5e3a4199c9bae775dd6685290d39b15c724b067c1b82a9993772"} Apr 20 16:25:58.948212 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:58.948216 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mmqjp" event={"ID":"1d3f1231-a687-4b5c-b5b5-d078c34b831b","Type":"ContainerStarted","Data":"63969e040c98ec39f60c79c9ae2e698b29c14badffefb6b1aca77c78f611e275"} Apr 20 16:25:58.948744 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:58.948414 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-mmqjp" Apr 20 16:25:58.965718 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:25:58.965672 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mmqjp" podStartSLOduration=129.747702885 podStartE2EDuration="2m10.96565906s" podCreationTimestamp="2026-04-20 16:23:48 +0000 UTC" firstStartedPulling="2026-04-20 16:25:57.158331797 +0000 UTC m=+160.278508295" lastFinishedPulling="2026-04-20 16:25:58.376287972 +0000 UTC m=+161.496464470" observedRunningTime="2026-04-20 16:25:58.964370584 +0000 UTC m=+162.084547105" watchObservedRunningTime="2026-04-20 16:25:58.96565906 +0000 UTC m=+162.085835579" Apr 20 16:26:01.232459 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.232424 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-d4rs6"] Apr 20 16:26:01.236734 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.236710 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d4rs6" Apr 20 16:26:01.240435 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.240406 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 16:26:01.240435 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.240431 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 16:26:01.240632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.240461 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-lgcvb\"" Apr 20 16:26:01.240632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.240461 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 16:26:01.240632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.240407 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 16:26:01.246982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.246960 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d4rs6"] Apr 20 16:26:01.278586 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.278558 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-c886g"] Apr 20 16:26:01.281443 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.281422 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-c886g" Apr 20 16:26:01.282472 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.282449 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cf89efb4-eb7e-40aa-b0ad-4e4a47685ece-crio-socket\") pod \"insights-runtime-extractor-d4rs6\" (UID: \"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece\") " pod="openshift-insights/insights-runtime-extractor-d4rs6" Apr 20 16:26:01.282544 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.282479 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82tgd\" (UniqueName: \"kubernetes.io/projected/cf89efb4-eb7e-40aa-b0ad-4e4a47685ece-kube-api-access-82tgd\") pod \"insights-runtime-extractor-d4rs6\" (UID: \"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece\") " pod="openshift-insights/insights-runtime-extractor-d4rs6" Apr 20 16:26:01.282581 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.282549 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cf89efb4-eb7e-40aa-b0ad-4e4a47685ece-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d4rs6\" (UID: \"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece\") " pod="openshift-insights/insights-runtime-extractor-d4rs6" Apr 20 16:26:01.282581 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.282572 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cf89efb4-eb7e-40aa-b0ad-4e4a47685ece-data-volume\") pod \"insights-runtime-extractor-d4rs6\" (UID: \"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece\") " pod="openshift-insights/insights-runtime-extractor-d4rs6" Apr 
20 16:26:01.282675 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.282619 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cf89efb4-eb7e-40aa-b0ad-4e4a47685ece-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d4rs6\" (UID: \"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece\") " pod="openshift-insights/insights-runtime-extractor-d4rs6" Apr 20 16:26:01.283839 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.283822 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 16:26:01.284275 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.284254 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-cqx6p\"" Apr 20 16:26:01.284370 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.284345 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 16:26:01.292580 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.292557 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-c886g"] Apr 20 16:26:01.383046 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.383008 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cf89efb4-eb7e-40aa-b0ad-4e4a47685ece-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d4rs6\" (UID: \"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece\") " pod="openshift-insights/insights-runtime-extractor-d4rs6" Apr 20 16:26:01.383250 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.383092 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cf89efb4-eb7e-40aa-b0ad-4e4a47685ece-crio-socket\") pod \"insights-runtime-extractor-d4rs6\" (UID: \"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece\") " pod="openshift-insights/insights-runtime-extractor-d4rs6" Apr 20 16:26:01.383250 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.383112 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82tgd\" (UniqueName: \"kubernetes.io/projected/cf89efb4-eb7e-40aa-b0ad-4e4a47685ece-kube-api-access-82tgd\") pod \"insights-runtime-extractor-d4rs6\" (UID: \"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece\") " pod="openshift-insights/insights-runtime-extractor-d4rs6" Apr 20 16:26:01.383250 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.383158 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk46n\" (UniqueName: \"kubernetes.io/projected/671503df-d3e2-439c-8805-f38f49057176-kube-api-access-dk46n\") pod \"downloads-6bcc868b7-c886g\" (UID: \"671503df-d3e2-439c-8805-f38f49057176\") " pod="openshift-console/downloads-6bcc868b7-c886g" Apr 20 16:26:01.383250 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.383203 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cf89efb4-eb7e-40aa-b0ad-4e4a47685ece-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d4rs6\" (UID: \"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece\") " pod="openshift-insights/insights-runtime-extractor-d4rs6" Apr 20 16:26:01.383250 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.383233 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cf89efb4-eb7e-40aa-b0ad-4e4a47685ece-data-volume\") pod \"insights-runtime-extractor-d4rs6\" (UID: \"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece\") " pod="openshift-insights/insights-runtime-extractor-d4rs6" Apr 20 16:26:01.383447 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.383256 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cf89efb4-eb7e-40aa-b0ad-4e4a47685ece-crio-socket\") pod \"insights-runtime-extractor-d4rs6\" (UID: \"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece\") " pod="openshift-insights/insights-runtime-extractor-d4rs6" Apr 20 16:26:01.383506 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.383490 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cf89efb4-eb7e-40aa-b0ad-4e4a47685ece-data-volume\") pod \"insights-runtime-extractor-d4rs6\" (UID: \"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece\") " pod="openshift-insights/insights-runtime-extractor-d4rs6" Apr 20 16:26:01.383694 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.383678 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cf89efb4-eb7e-40aa-b0ad-4e4a47685ece-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d4rs6\" (UID: \"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece\") " pod="openshift-insights/insights-runtime-extractor-d4rs6" Apr 20 16:26:01.385760 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.385740 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cf89efb4-eb7e-40aa-b0ad-4e4a47685ece-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d4rs6\" (UID: \"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece\") " pod="openshift-insights/insights-runtime-extractor-d4rs6" Apr 20 16:26:01.392491 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.392461 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82tgd\" (UniqueName: \"kubernetes.io/projected/cf89efb4-eb7e-40aa-b0ad-4e4a47685ece-kube-api-access-82tgd\") pod \"insights-runtime-extractor-d4rs6\" (UID: \"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece\") " pod="openshift-insights/insights-runtime-extractor-d4rs6" Apr 20 16:26:01.484333 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.484251 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dk46n\" (UniqueName: \"kubernetes.io/projected/671503df-d3e2-439c-8805-f38f49057176-kube-api-access-dk46n\") pod \"downloads-6bcc868b7-c886g\" (UID: \"671503df-d3e2-439c-8805-f38f49057176\") " pod="openshift-console/downloads-6bcc868b7-c886g" Apr 20 16:26:01.494132 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.494106 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk46n\" (UniqueName: \"kubernetes.io/projected/671503df-d3e2-439c-8805-f38f49057176-kube-api-access-dk46n\") pod \"downloads-6bcc868b7-c886g\" (UID: \"671503df-d3e2-439c-8805-f38f49057176\") " pod="openshift-console/downloads-6bcc868b7-c886g" Apr 20 16:26:01.546770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.546738 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d4rs6" Apr 20 16:26:01.589918 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.589824 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-c886g" Apr 20 16:26:01.669890 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.669859 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d4rs6"] Apr 20 16:26:01.673740 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:26:01.673715 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf89efb4_eb7e_40aa_b0ad_4e4a47685ece.slice/crio-dda746a775a14a688b3ffcc1afcafbb9bd8c02cd2604a0e24b36188e3788b640 WatchSource:0}: Error finding container dda746a775a14a688b3ffcc1afcafbb9bd8c02cd2604a0e24b36188e3788b640: Status 404 returned error can't find the container with id dda746a775a14a688b3ffcc1afcafbb9bd8c02cd2604a0e24b36188e3788b640 Apr 20 16:26:01.716445 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.716416 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-c886g"] Apr 20 16:26:01.727147 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:26:01.727122 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671503df_d3e2_439c_8805_f38f49057176.slice/crio-90904d5a26d9c7084f802a5bc311e80a49b322612b8f0f3ed2346c8c6159921b WatchSource:0}: Error finding container 90904d5a26d9c7084f802a5bc311e80a49b322612b8f0f3ed2346c8c6159921b: Status 404 returned error can't find the container with id 90904d5a26d9c7084f802a5bc311e80a49b322612b8f0f3ed2346c8c6159921b Apr 20 16:26:01.961086 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.961049 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d4rs6" event={"ID":"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece","Type":"ContainerStarted","Data":"86d1cc848791d200f99041938b1b1c65b761b4e9c9b0214862e9ce30e6c7c3fd"} Apr 20 16:26:01.961086 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.961088 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d4rs6" event={"ID":"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece","Type":"ContainerStarted","Data":"dda746a775a14a688b3ffcc1afcafbb9bd8c02cd2604a0e24b36188e3788b640"} Apr 20 16:26:01.961921 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:01.961898 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-c886g" event={"ID":"671503df-d3e2-439c-8805-f38f49057176","Type":"ContainerStarted","Data":"90904d5a26d9c7084f802a5bc311e80a49b322612b8f0f3ed2346c8c6159921b"} Apr 20 16:26:02.966210 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:02.966096 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d4rs6" event={"ID":"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece","Type":"ContainerStarted","Data":"def4a48fe866dddb78ddc0a7fd152516f25c11808ca877751bf42c528ce95df4"} Apr 20 16:26:03.972403 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:03.972356 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d4rs6" event={"ID":"cf89efb4-eb7e-40aa-b0ad-4e4a47685ece","Type":"ContainerStarted","Data":"b71b2a78ac31faceb63715ec1574df3bd6141d31f1b05daab4cb6907b9e7e05e"} Apr 20 16:26:03.990344 ip-10-0-130-72 
kubenswrapper[2571]: I0420 16:26:03.990294 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-d4rs6" podStartSLOduration=0.888431762 podStartE2EDuration="2.9902804s" podCreationTimestamp="2026-04-20 16:26:01 +0000 UTC" firstStartedPulling="2026-04-20 16:26:01.745546141 +0000 UTC m=+164.865722638" lastFinishedPulling="2026-04-20 16:26:03.847394779 +0000 UTC m=+166.967571276" observedRunningTime="2026-04-20 16:26:03.989420356 +0000 UTC m=+167.109596893" watchObservedRunningTime="2026-04-20 16:26:03.9902804 +0000 UTC m=+167.110456920" Apr 20 16:26:04.708115 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:04.708074 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:26:04.708332 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:04.708158 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:26:04.708332 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:04.708218 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqblb\" (UID: \"b2a0a168-de3a-4df1-89e1-c8d831ef5ada\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" Apr 20 16:26:04.708784 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:04.708761 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/542e12c0-876b-401f-b987-efaf65039572-service-ca-bundle\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:26:04.711209 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:04.711188 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/542e12c0-876b-401f-b987-efaf65039572-metrics-certs\") pod \"router-default-8dbb8fdd6-27rh2\" (UID: \"542e12c0-876b-401f-b987-efaf65039572\") " pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:26:04.711313 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:04.711244 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2a0a168-de3a-4df1-89e1-c8d831ef5ada-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqblb\" (UID: \"b2a0a168-de3a-4df1-89e1-c8d831ef5ada\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" Apr 20 16:26:04.965872 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:04.965789 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" Apr 20 16:26:04.971192 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:04.971148 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:26:05.118320 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:05.118286 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb"] Apr 20 16:26:05.121592 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:26:05.121566 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2a0a168_de3a_4df1_89e1_c8d831ef5ada.slice/crio-f248d588498f811a841d2f00be69213576f6464d5c4fdc7e6b7811b80933a4a4 WatchSource:0}: Error finding container f248d588498f811a841d2f00be69213576f6464d5c4fdc7e6b7811b80933a4a4: Status 404 returned error can't find the container with id f248d588498f811a841d2f00be69213576f6464d5c4fdc7e6b7811b80933a4a4 Apr 20 16:26:05.138272 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:05.138241 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-8dbb8fdd6-27rh2"] Apr 20 16:26:05.141463 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:26:05.141431 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod542e12c0_876b_401f_b987_efaf65039572.slice/crio-8e96c703a912aa3623d1510e825c2136195b9bb9428a6c829b363341f2ed4b16 WatchSource:0}: Error finding container 8e96c703a912aa3623d1510e825c2136195b9bb9428a6c829b363341f2ed4b16: Status 404 returned error can't find the container with id 8e96c703a912aa3623d1510e825c2136195b9bb9428a6c829b363341f2ed4b16 Apr 20 16:26:05.417958 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:05.417918 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-np9bb" Apr 20 16:26:05.420754 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:05.420732 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cmjpg\"" Apr 20 16:26:05.428856 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:05.428833 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-np9bb" Apr 20 16:26:05.563443 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:05.563266 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-np9bb"] Apr 20 16:26:05.566050 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:26:05.566010 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6da97721_f2ed_4061_b7a4_2577d2b33d11.slice/crio-5fa4a30c5679d8a483944362583e0f27101b7b32306604cc14587aeb6e3f49c9 WatchSource:0}: Error finding container 5fa4a30c5679d8a483944362583e0f27101b7b32306604cc14587aeb6e3f49c9: Status 404 returned error can't find the container with id 5fa4a30c5679d8a483944362583e0f27101b7b32306604cc14587aeb6e3f49c9 Apr 20 16:26:05.979327 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:05.979289 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-np9bb" event={"ID":"6da97721-f2ed-4061-b7a4-2577d2b33d11","Type":"ContainerStarted","Data":"5fa4a30c5679d8a483944362583e0f27101b7b32306604cc14587aeb6e3f49c9"} Apr 20 16:26:05.981408 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:05.981360 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" event={"ID":"542e12c0-876b-401f-b987-efaf65039572","Type":"ContainerStarted","Data":"ce992991a8096294708a08821ed6fc2b37ac7ea1ab0b4a737cc9eae9e398b627"} Apr 20 16:26:05.981644 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:05.981601 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" event={"ID":"542e12c0-876b-401f-b987-efaf65039572","Type":"ContainerStarted","Data":"8e96c703a912aa3623d1510e825c2136195b9bb9428a6c829b363341f2ed4b16"} Apr 20 16:26:05.983258 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:05.983203 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" event={"ID":"b2a0a168-de3a-4df1-89e1-c8d831ef5ada","Type":"ContainerStarted","Data":"f248d588498f811a841d2f00be69213576f6464d5c4fdc7e6b7811b80933a4a4"} Apr 20 16:26:06.001812 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.001762 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" podStartSLOduration=34.001743892 podStartE2EDuration="34.001743892s" podCreationTimestamp="2026-04-20 16:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:26:06.000031329 +0000 UTC m=+169.120207899" watchObservedRunningTime="2026-04-20 16:26:06.001743892 +0000 UTC m=+169.121920407" Apr 20 16:26:06.458860 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.457946 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8464968566-g6xz9"] Apr 20 16:26:06.461362 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.461337 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.464097 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.463728 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 16:26:06.464097 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.463779 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-mqch4\"" Apr 20 16:26:06.465097 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.465076 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 16:26:06.465229 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.465077 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 16:26:06.465229 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.465130 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 16:26:06.465381 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.465086 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 16:26:06.473289 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.473236 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8464968566-g6xz9"] Apr 20 16:26:06.524857 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.524822 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a20be14d-583a-4f25-950e-23a88f6e512a-console-oauth-config\") pod \"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.525044 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.524875 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn629\" (UniqueName: \"kubernetes.io/projected/a20be14d-583a-4f25-950e-23a88f6e512a-kube-api-access-pn629\") pod \"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.525044 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.524902 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-console-config\") pod \"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.525044 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.524963 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a20be14d-583a-4f25-950e-23a88f6e512a-console-serving-cert\") pod \"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.525044 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.524998 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-oauth-serving-cert\") pod 
\"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.525044 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.525032 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-service-ca\") pod \"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.626207 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.625743 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a20be14d-583a-4f25-950e-23a88f6e512a-console-oauth-config\") pod \"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.626207 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.625792 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pn629\" (UniqueName: \"kubernetes.io/projected/a20be14d-583a-4f25-950e-23a88f6e512a-kube-api-access-pn629\") pod \"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.626207 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.625829 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-console-config\") pod \"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.626207 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.625892 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a20be14d-583a-4f25-950e-23a88f6e512a-console-serving-cert\") pod \"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.626207 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.625927 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-oauth-serving-cert\") pod \"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.626207 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.625964 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-service-ca\") pod \"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.626833 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.626773 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-service-ca\") pod \"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.627029 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.627007 2571 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-oauth-serving-cert\") pod \"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.627472 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.627403 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-console-config\") pod \"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.629481 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.629433 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a20be14d-583a-4f25-950e-23a88f6e512a-console-oauth-config\") pod \"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.631733 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.631664 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a20be14d-583a-4f25-950e-23a88f6e512a-console-serving-cert\") pod \"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.636080 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.636056 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn629\" (UniqueName: \"kubernetes.io/projected/a20be14d-583a-4f25-950e-23a88f6e512a-kube-api-access-pn629\") pod \"console-8464968566-g6xz9\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.775867 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.775783 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:06.971854 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.971794 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:26:06.974890 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.974859 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:26:06.986143 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.986116 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:26:06.987586 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:06.987567 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-8dbb8fdd6-27rh2" Apr 20 16:26:07.920648 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:07.920081 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8464968566-g6xz9"] Apr 20 16:26:07.991511 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:07.991473 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-np9bb" event={"ID":"6da97721-f2ed-4061-b7a4-2577d2b33d11","Type":"ContainerStarted","Data":"ea3d9d6eaaf047864656cdbdd13e2b3003ce79319b6a3534783ca28558fdb2ac"} Apr 20 16:26:07.994717 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:07.994643 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" event={"ID":"b2a0a168-de3a-4df1-89e1-c8d831ef5ada","Type":"ContainerStarted","Data":"35fea5fda83d37d1ca5607a536f037764e24e8f6c7f4832aaf145c4e0908116d"} Apr 20 16:26:07.998726 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:07.998682 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8464968566-g6xz9" event={"ID":"a20be14d-583a-4f25-950e-23a88f6e512a","Type":"ContainerStarted","Data":"c391c95661066edffc34d42414746830e8121b580094685343741a54c9b3bfe9"} Apr 20 16:26:08.006878 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:08.006818 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-np9bb" podStartSLOduration=137.786426869 podStartE2EDuration="2m20.00680109s" podCreationTimestamp="2026-04-20 16:23:48 +0000 UTC" firstStartedPulling="2026-04-20 16:26:05.568297926 +0000 UTC m=+168.688474427" lastFinishedPulling="2026-04-20 16:26:07.788672134 +0000 UTC m=+170.908848648" observedRunningTime="2026-04-20 16:26:08.006772269 +0000 UTC m=+171.126948788" watchObservedRunningTime="2026-04-20 16:26:08.00680109 +0000 UTC m=+171.126977611" Apr 20 16:26:08.023758 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:08.023702 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqblb" podStartSLOduration=33.364378284 podStartE2EDuration="36.023682419s" podCreationTimestamp="2026-04-20 16:25:32 +0000 UTC" firstStartedPulling="2026-04-20 16:26:05.123933901 +0000 UTC m=+168.244110403" lastFinishedPulling="2026-04-20 16:26:07.783238033 +0000 UTC m=+170.903414538" observedRunningTime="2026-04-20 16:26:08.022915559 +0000 UTC m=+171.143092080" watchObservedRunningTime="2026-04-20 16:26:08.023682419 +0000 UTC m=+171.143858940" Apr 20 16:26:08.329944 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:08.329912 2571 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6k8cm"] Apr 20 16:26:08.333452 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:08.333428 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6k8cm" Apr 20 16:26:08.335688 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:08.335665 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 20 16:26:08.335799 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:08.335774 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-x4wz9\"" Apr 20 16:26:08.340292 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:08.340270 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6k8cm"] Apr 20 16:26:08.418158 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:08.418124 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:26:08.444059 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:08.444025 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9ef807a0-8b32-44f3-97d4-07bfc892741a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6k8cm\" (UID: \"9ef807a0-8b32-44f3-97d4-07bfc892741a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6k8cm" Apr 20 16:26:08.545091 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:08.545042 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9ef807a0-8b32-44f3-97d4-07bfc892741a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6k8cm\" (UID: \"9ef807a0-8b32-44f3-97d4-07bfc892741a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6k8cm" Apr 20 16:26:08.548427 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:08.548373 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9ef807a0-8b32-44f3-97d4-07bfc892741a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6k8cm\" (UID: \"9ef807a0-8b32-44f3-97d4-07bfc892741a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6k8cm" Apr 20 16:26:08.647449 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:08.647405 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6k8cm" Apr 20 16:26:08.796576 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:08.796518 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6k8cm"] Apr 20 16:26:08.801542 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:26:08.801447 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ef807a0_8b32_44f3_97d4_07bfc892741a.slice/crio-bfb34a5a24aeb28b92d4f743778fedb841a7766cc1155c596f975fdd3578bfdd WatchSource:0}: Error finding container bfb34a5a24aeb28b92d4f743778fedb841a7766cc1155c596f975fdd3578bfdd: Status 404 returned error can't find the container with id bfb34a5a24aeb28b92d4f743778fedb841a7766cc1155c596f975fdd3578bfdd Apr 20 16:26:08.953418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:08.953329 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mmqjp" Apr 20 16:26:09.003009 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:09.002972 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6k8cm" event={"ID":"9ef807a0-8b32-44f3-97d4-07bfc892741a","Type":"ContainerStarted","Data":"bfb34a5a24aeb28b92d4f743778fedb841a7766cc1155c596f975fdd3578bfdd"} Apr 20 16:26:10.008652 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.008608 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6k8cm" event={"ID":"9ef807a0-8b32-44f3-97d4-07bfc892741a","Type":"ContainerStarted","Data":"0d79e76c5d67f4b5281e5ce702c69eabfb19e3c634ca6ced8338f0b4ccaec582"} Apr 20 16:26:10.009088 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.008793 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6k8cm" Apr 20 16:26:10.014445 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.014420 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6k8cm" Apr 20 16:26:10.023813 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.023760 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6k8cm" podStartSLOduration=0.994904157 podStartE2EDuration="2.023737797s" podCreationTimestamp="2026-04-20 16:26:08 +0000 UTC" firstStartedPulling="2026-04-20 16:26:08.803037419 +0000 UTC m=+171.923213922" lastFinishedPulling="2026-04-20 16:26:09.83187106 +0000 UTC m=+172.952047562" observedRunningTime="2026-04-20 16:26:10.023085502 +0000 UTC m=+173.143262023" watchObservedRunningTime="2026-04-20 16:26:10.023737797 +0000 UTC m=+173.143914323" Apr 20 16:26:10.393701 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.393664 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-czmjr"] Apr 20 16:26:10.396321 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.396295 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" Apr 20 16:26:10.399230 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.398941 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 16:26:10.399230 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.399001 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 20 16:26:10.399230 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.398945 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 20 16:26:10.400376 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.400294 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-lcwpz\"" Apr 20 16:26:10.403416 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.403371 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-czmjr"] Apr 20 16:26:10.465296 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.465264 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/598a6714-5291-4598-8125-ab116843849d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-czmjr\" (UID: \"598a6714-5291-4598-8125-ab116843849d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" Apr 20 16:26:10.465296 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.465305 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-466bg\" (UniqueName: \"kubernetes.io/projected/598a6714-5291-4598-8125-ab116843849d-kube-api-access-466bg\") pod \"prometheus-operator-5676c8c784-czmjr\" (UID: \"598a6714-5291-4598-8125-ab116843849d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" Apr 20 16:26:10.465534 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.465334 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/598a6714-5291-4598-8125-ab116843849d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-czmjr\" (UID: \"598a6714-5291-4598-8125-ab116843849d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" Apr 20 16:26:10.465534 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.465413 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/598a6714-5291-4598-8125-ab116843849d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-czmjr\" (UID: \"598a6714-5291-4598-8125-ab116843849d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" Apr 20 16:26:10.565886 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.565844 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/598a6714-5291-4598-8125-ab116843849d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-czmjr\" (UID: \"598a6714-5291-4598-8125-ab116843849d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" Apr 20 16:26:10.566040 ip-10-0-130-72 
kubenswrapper[2571]: I0420 16:26:10.565930 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/598a6714-5291-4598-8125-ab116843849d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-czmjr\" (UID: \"598a6714-5291-4598-8125-ab116843849d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" Apr 20 16:26:10.566040 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.565965 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-466bg\" (UniqueName: \"kubernetes.io/projected/598a6714-5291-4598-8125-ab116843849d-kube-api-access-466bg\") pod \"prometheus-operator-5676c8c784-czmjr\" (UID: \"598a6714-5291-4598-8125-ab116843849d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" Apr 20 16:26:10.566040 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.566002 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/598a6714-5291-4598-8125-ab116843849d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-czmjr\" (UID: \"598a6714-5291-4598-8125-ab116843849d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" Apr 20 16:26:10.566426 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:26:10.566399 2571 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 20 16:26:10.566544 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:26:10.566479 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/598a6714-5291-4598-8125-ab116843849d-prometheus-operator-tls podName:598a6714-5291-4598-8125-ab116843849d nodeName:}" failed. No retries permitted until 2026-04-20 16:26:11.066457012 +0000 UTC m=+174.186633512 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/598a6714-5291-4598-8125-ab116843849d-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-czmjr" (UID: "598a6714-5291-4598-8125-ab116843849d") : secret "prometheus-operator-tls" not found Apr 20 16:26:10.566676 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.566654 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/598a6714-5291-4598-8125-ab116843849d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-czmjr\" (UID: \"598a6714-5291-4598-8125-ab116843849d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" Apr 20 16:26:10.569220 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.569193 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/598a6714-5291-4598-8125-ab116843849d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-czmjr\" (UID: \"598a6714-5291-4598-8125-ab116843849d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" Apr 20 16:26:10.577583 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:10.575895 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-466bg\" (UniqueName: \"kubernetes.io/projected/598a6714-5291-4598-8125-ab116843849d-kube-api-access-466bg\") pod \"prometheus-operator-5676c8c784-czmjr\" (UID: \"598a6714-5291-4598-8125-ab116843849d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" Apr 20 16:26:11.069408 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:11.069367 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/598a6714-5291-4598-8125-ab116843849d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-czmjr\" (UID: \"598a6714-5291-4598-8125-ab116843849d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" Apr 20 16:26:11.072839 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:11.072812 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/598a6714-5291-4598-8125-ab116843849d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-czmjr\" (UID: \"598a6714-5291-4598-8125-ab116843849d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" Apr 20 16:26:11.309426 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:11.309383 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" Apr 20 16:26:11.452076 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:11.452034 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-czmjr"] Apr 20 16:26:11.455430 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:26:11.455394 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod598a6714_5291_4598_8125_ab116843849d.slice/crio-7824ee703f69a976bb4dce57d36333c2e4eb663d8d208db6f9c05306212a9cab WatchSource:0}: Error finding container 7824ee703f69a976bb4dce57d36333c2e4eb663d8d208db6f9c05306212a9cab: Status 404 returned error can't find the container with id 7824ee703f69a976bb4dce57d36333c2e4eb663d8d208db6f9c05306212a9cab Apr 20 16:26:12.016323 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:12.016238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8464968566-g6xz9" event={"ID":"a20be14d-583a-4f25-950e-23a88f6e512a","Type":"ContainerStarted","Data":"f4ae8d75dc2e5ed93568ffde26515493a731a7488fc12c97a6146c31d38ae297"} Apr 20 16:26:12.017545 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:12.017498 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" event={"ID":"598a6714-5291-4598-8125-ab116843849d","Type":"ContainerStarted","Data":"7824ee703f69a976bb4dce57d36333c2e4eb663d8d208db6f9c05306212a9cab"} Apr 20 16:26:12.035063 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:12.035003 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8464968566-g6xz9" podStartSLOduration=2.773173827 podStartE2EDuration="6.034986231s" podCreationTimestamp="2026-04-20 16:26:06 +0000 UTC" firstStartedPulling="2026-04-20 16:26:07.927611298 +0000 UTC m=+171.047787796" lastFinishedPulling="2026-04-20 16:26:11.189423689 +0000 UTC m=+174.309600200" observedRunningTime="2026-04-20 16:26:12.03347867 +0000 UTC m=+175.153655191" watchObservedRunningTime="2026-04-20 16:26:12.034986231 +0000 UTC m=+175.155162754" Apr 20 16:26:16.776983 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:16.776943 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:16.776983 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:16.776990 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:16.783244 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:16.783215 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:17.036638 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:17.036556 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:20.042414 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:20.042366 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-c886g" event={"ID":"671503df-d3e2-439c-8805-f38f49057176","Type":"ContainerStarted","Data":"8d78b2d2e373fb0239c1e3da2274f03fb177647ff98d26556c466b5300c13773"} Apr 20 16:26:20.042865 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:20.042707 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-c886g" Apr 20 
16:26:20.044474 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:20.044418 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" event={"ID":"598a6714-5291-4598-8125-ab116843849d","Type":"ContainerStarted","Data":"768dfd65778782aadb8dd96c7d7c0979ba166a01a45535e9a7fa5b534788ccd0"} Apr 20 16:26:20.044474 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:20.044453 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" event={"ID":"598a6714-5291-4598-8125-ab116843849d","Type":"ContainerStarted","Data":"b50e629f87a921b7c9330c1d4ca30ead8f691703d2317186485eeb2e46879392"} Apr 20 16:26:20.055780 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:20.055756 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-c886g" Apr 20 16:26:20.065737 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:20.065694 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-c886g" podStartSLOduration=1.264407611 podStartE2EDuration="19.065680203s" podCreationTimestamp="2026-04-20 16:26:01 +0000 UTC" firstStartedPulling="2026-04-20 16:26:01.728977419 +0000 UTC m=+164.849153916" lastFinishedPulling="2026-04-20 16:26:19.530250007 +0000 UTC m=+182.650426508" observedRunningTime="2026-04-20 16:26:20.063528945 +0000 UTC m=+183.183705466" watchObservedRunningTime="2026-04-20 16:26:20.065680203 +0000 UTC m=+183.185856735" Apr 20 16:26:20.101291 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:20.101227 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-czmjr" podStartSLOduration=2.076617795 podStartE2EDuration="10.101207427s" podCreationTimestamp="2026-04-20 16:26:10 +0000 UTC" firstStartedPulling="2026-04-20 16:26:11.458818212 +0000 UTC m=+174.578994709" lastFinishedPulling="2026-04-20 16:26:19.483407839 +0000 UTC m=+182.603584341" observedRunningTime="2026-04-20 16:26:20.099004858 +0000 UTC m=+183.219181396" watchObservedRunningTime="2026-04-20 16:26:20.101207427 +0000 UTC m=+183.221383947" Apr 20 16:26:21.756982 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.755417 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-kh54r"] Apr 20 16:26:21.782003 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.781966 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-wgtbk"] Apr 20 16:26:21.799815 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.799781 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-wgtbk"] Apr 20 16:26:21.800045 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.799984 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.800045 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.799994 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.803066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.803032 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 20 16:26:21.804961 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.804446 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 16:26:21.804961 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.804487 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 20 16:26:21.804961 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.804511 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pd26p\"" Apr 20 16:26:21.804961 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.804661 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 20 16:26:21.804961 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.804685 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 16:26:21.804961 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.804801 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 16:26:21.804961 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.804871 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-vx69j\"" Apr 20 16:26:21.862708 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.862672 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/455b95ec-d6c4-4986-99be-0bf8c2e95935-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.862861 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.862714 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ecdbf87a-e49b-4914-9b96-abd064658c90-metrics-client-ca\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.862861 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.862735 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfk8w\" (UniqueName: \"kubernetes.io/projected/ecdbf87a-e49b-4914-9b96-abd064658c90-kube-api-access-pfk8w\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.862861 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.862755 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-textfile\") pod \"node-exporter-kh54r\" (UID: 
\"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.862861 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.862784 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/455b95ec-d6c4-4986-99be-0bf8c2e95935-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.862861 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.862803 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ecdbf87a-e49b-4914-9b96-abd064658c90-root\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.862861 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.862827 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/455b95ec-d6c4-4986-99be-0bf8c2e95935-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.862861 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.862859 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ecdbf87a-e49b-4914-9b96-abd064658c90-sys\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.863291 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.862948 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-tls\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.863291 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.862996 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-accelerators-collector-config\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.863291 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.863102 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-wtmp\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.863291 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.863180 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/455b95ec-d6c4-4986-99be-0bf8c2e95935-volume-directive-shadow\") pod 
\"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.863291 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.863210 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/455b95ec-d6c4-4986-99be-0bf8c2e95935-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.863291 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.863237 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68dsn\" (UniqueName: \"kubernetes.io/projected/455b95ec-d6c4-4986-99be-0bf8c2e95935-kube-api-access-68dsn\") pod \"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.863291 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.863267 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.964389 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.964353 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/455b95ec-d6c4-4986-99be-0bf8c2e95935-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.964552 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.964407 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ecdbf87a-e49b-4914-9b96-abd064658c90-metrics-client-ca\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.964552 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.964433 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pfk8w\" (UniqueName: \"kubernetes.io/projected/ecdbf87a-e49b-4914-9b96-abd064658c90-kube-api-access-pfk8w\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.964552 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.964540 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-textfile\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.964699 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.964597 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/455b95ec-d6c4-4986-99be-0bf8c2e95935-kube-state-metrics-kube-rbac-proxy-config\") pod 
\"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.964699 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.964629 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ecdbf87a-e49b-4914-9b96-abd064658c90-root\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.964699 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.964661 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/455b95ec-d6c4-4986-99be-0bf8c2e95935-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.964699 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.964691 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ecdbf87a-e49b-4914-9b96-abd064658c90-sys\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.964893 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.964745 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ecdbf87a-e49b-4914-9b96-abd064658c90-sys\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.964893 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.964793 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ecdbf87a-e49b-4914-9b96-abd064658c90-root\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.964893 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.964840 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-tls\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.964893 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.964876 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-accelerators-collector-config\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.965077 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.964919 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-wtmp\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.965077 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.964955 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/455b95ec-d6c4-4986-99be-0bf8c2e95935-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.965077 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.964983 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/455b95ec-d6c4-4986-99be-0bf8c2e95935-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.965077 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.964982 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-textfile\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.965077 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.965009 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68dsn\" (UniqueName: \"kubernetes.io/projected/455b95ec-d6c4-4986-99be-0bf8c2e95935-kube-api-access-68dsn\") pod \"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.965077 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.965041 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.965391 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.965125 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-wtmp\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.965391 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.965237 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/455b95ec-d6c4-4986-99be-0bf8c2e95935-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.965391 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:26:21.965275 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 16:26:21.965391 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:26:21.965339 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-tls podName:ecdbf87a-e49b-4914-9b96-abd064658c90 nodeName:}" failed. No retries permitted until 2026-04-20 16:26:22.465316587 +0000 UTC m=+185.585493102 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-tls") pod "node-exporter-kh54r" (UID: "ecdbf87a-e49b-4914-9b96-abd064658c90") : secret "node-exporter-tls" not found Apr 20 16:26:21.965391 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.965374 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/455b95ec-d6c4-4986-99be-0bf8c2e95935-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.965739 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.965711 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-accelerators-collector-config\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.965960 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.965937 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/455b95ec-d6c4-4986-99be-0bf8c2e95935-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.968622 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.968598 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.968830 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.968808 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/455b95ec-d6c4-4986-99be-0bf8c2e95935-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.969732 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.969608 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/455b95ec-d6c4-4986-99be-0bf8c2e95935-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.972217 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.972192 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ecdbf87a-e49b-4914-9b96-abd064658c90-metrics-client-ca\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:21.980204 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.980144 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-68dsn\" (UniqueName: \"kubernetes.io/projected/455b95ec-d6c4-4986-99be-0bf8c2e95935-kube-api-access-68dsn\") pod \"kube-state-metrics-69db897b98-wgtbk\" (UID: \"455b95ec-d6c4-4986-99be-0bf8c2e95935\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:21.981682 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:21.981640 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfk8w\" (UniqueName: \"kubernetes.io/projected/ecdbf87a-e49b-4914-9b96-abd064658c90-kube-api-access-pfk8w\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:22.124102 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.124069 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" Apr 20 16:26:22.291091 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.291040 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-wgtbk"] Apr 20 16:26:22.293925 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:26:22.293888 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod455b95ec_d6c4_4986_99be_0bf8c2e95935.slice/crio-69018ca04459aa595ae7142ff638bd83071c2da2b003a6b8a577b265f1d0d4de WatchSource:0}: Error finding container 69018ca04459aa595ae7142ff638bd83071c2da2b003a6b8a577b265f1d0d4de: Status 404 returned error can't find the container with id 69018ca04459aa595ae7142ff638bd83071c2da2b003a6b8a577b265f1d0d4de Apr 20 16:26:22.469534 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.469436 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-tls\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:22.472388 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.472358 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ecdbf87a-e49b-4914-9b96-abd064658c90-node-exporter-tls\") pod \"node-exporter-kh54r\" (UID: \"ecdbf87a-e49b-4914-9b96-abd064658c90\") " pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:22.737839 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.737761 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-kh54r" Apr 20 16:26:22.753923 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:26:22.753884 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecdbf87a_e49b_4914_9b96_abd064658c90.slice/crio-efd8079498bbc00ce568177c55a7253f9951c1fd55198dad08073c03ca73d07e WatchSource:0}: Error finding container efd8079498bbc00ce568177c55a7253f9951c1fd55198dad08073c03ca73d07e: Status 404 returned error can't find the container with id efd8079498bbc00ce568177c55a7253f9951c1fd55198dad08073c03ca73d07e Apr 20 16:26:22.855184 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.855124 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 16:26:22.877503 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.877457 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 16:26:22.877683 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.877642 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:22.880670 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.880508 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 16:26:22.880670 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.880520 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 16:26:22.880670 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.880612 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 16:26:22.880943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.880727 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 16:26:22.880943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.880830 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 16:26:22.881557 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.881272 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-gxmzj\"" Apr 20 16:26:22.881557 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.881296 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 16:26:22.881557 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.881371 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 16:26:22.881557 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.881446 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 16:26:22.881557 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.881553 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 16:26:22.974632 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.974590 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55422fe6-0d03-4e20-95b2-44fa62edfdba-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:22.974841 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.974648 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/55422fe6-0d03-4e20-95b2-44fa62edfdba-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:22.974841 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.974725 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/55422fe6-0d03-4e20-95b2-44fa62edfdba-tls-assets\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:22.974841 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.974799 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:22.974841 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.974829 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-config-volume\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:22.975050 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.974870 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/55422fe6-0d03-4e20-95b2-44fa62edfdba-config-out\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:22.975050 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.974892 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55422fe6-0d03-4e20-95b2-44fa62edfdba-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:22.975050 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.974914 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvd2j\" (UniqueName: \"kubernetes.io/projected/55422fe6-0d03-4e20-95b2-44fa62edfdba-kube-api-access-nvd2j\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:22.975050 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.974933 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:22.975050 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.974965 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:22.975050 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.975001 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:22.975360 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.975075 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:22.975360 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:22.975093 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-web-config\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.054865 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.054780 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kh54r" event={"ID":"ecdbf87a-e49b-4914-9b96-abd064658c90","Type":"ContainerStarted","Data":"efd8079498bbc00ce568177c55a7253f9951c1fd55198dad08073c03ca73d07e"} Apr 20 16:26:23.056309 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.056286 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" event={"ID":"455b95ec-d6c4-4986-99be-0bf8c2e95935","Type":"ContainerStarted","Data":"69018ca04459aa595ae7142ff638bd83071c2da2b003a6b8a577b265f1d0d4de"} Apr 20 16:26:23.076386 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.075933 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/55422fe6-0d03-4e20-95b2-44fa62edfdba-config-out\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.076386 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.075974 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55422fe6-0d03-4e20-95b2-44fa62edfdba-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.076386 
ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.076017 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvd2j\" (UniqueName: \"kubernetes.io/projected/55422fe6-0d03-4e20-95b2-44fa62edfdba-kube-api-access-nvd2j\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.076386 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.076052 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.076386 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.076076 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.076386 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.076103 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.076386 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.076159 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.076386 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.076206 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-web-config\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.076386 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.076241 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55422fe6-0d03-4e20-95b2-44fa62edfdba-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.076908 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.076428 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/55422fe6-0d03-4e20-95b2-44fa62edfdba-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.076908 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.076472 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/55422fe6-0d03-4e20-95b2-44fa62edfdba-tls-assets\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.076908 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.076526 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.076908 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.076638 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-config-volume\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.078210 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.077836 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55422fe6-0d03-4e20-95b2-44fa62edfdba-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.080230 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.079673 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/55422fe6-0d03-4e20-95b2-44fa62edfdba-config-out\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.080230 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:26:23.079801 2571 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 20 16:26:23.080230 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:26:23.079861 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-secret-alertmanager-main-tls podName:55422fe6-0d03-4e20-95b2-44fa62edfdba nodeName:}" failed. No retries permitted until 2026-04-20 16:26:23.579840209 +0000 UTC m=+186.700016713 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "55422fe6-0d03-4e20-95b2-44fa62edfdba") : secret "alertmanager-main-tls" not found Apr 20 16:26:23.081131 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.080576 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-config-volume\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.081131 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.080688 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/55422fe6-0d03-4e20-95b2-44fa62edfdba-tls-assets\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.081131 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.080817 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/55422fe6-0d03-4e20-95b2-44fa62edfdba-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.081131 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.081068 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55422fe6-0d03-4e20-95b2-44fa62edfdba-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.083398 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.083214 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.084005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.083639 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.089547 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.089404 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.093087 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.092996 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.097196 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.097129 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvd2j\" (UniqueName: \"kubernetes.io/projected/55422fe6-0d03-4e20-95b2-44fa62edfdba-kube-api-access-nvd2j\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.097702 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.097685 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-web-config\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.581100 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.580902 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.584225 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.584199 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/55422fe6-0d03-4e20-95b2-44fa62edfdba-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"55422fe6-0d03-4e20-95b2-44fa62edfdba\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:23.794111 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:23.794063 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 16:26:24.880296 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:24.880254 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 16:26:24.885742 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:26:24.885709 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55422fe6_0d03_4e20_95b2_44fa62edfdba.slice/crio-eb03a0d835647d66dcfe1cbaa1771a9ed413b75540424ac990a7e63fe12520fa WatchSource:0}: Error finding container eb03a0d835647d66dcfe1cbaa1771a9ed413b75540424ac990a7e63fe12520fa: Status 404 returned error can't find the container with id eb03a0d835647d66dcfe1cbaa1771a9ed413b75540424ac990a7e63fe12520fa Apr 20 16:26:25.067319 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:25.067282 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" event={"ID":"455b95ec-d6c4-4986-99be-0bf8c2e95935","Type":"ContainerStarted","Data":"8870c7345304f143a43fb9c2ab03406be4d66f84a858bcc442ae55ff89fa981f"} Apr 20 16:26:25.067506 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:25.067489 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" event={"ID":"455b95ec-d6c4-4986-99be-0bf8c2e95935","Type":"ContainerStarted","Data":"9357cf8124c265cb308468da73345e654e12adfe39c2cf779e62585966ef3aa8"} Apr 20 16:26:25.069789 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:25.069035 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kh54r" event={"ID":"ecdbf87a-e49b-4914-9b96-abd064658c90","Type":"ContainerStarted","Data":"051a6a29cc33a0d198218f39ea7e67ae452a92b9cf1e4d5a39ad5e97946897d9"} Apr 20 16:26:25.074512 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:25.074146 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"55422fe6-0d03-4e20-95b2-44fa62edfdba","Type":"ContainerStarted","Data":"eb03a0d835647d66dcfe1cbaa1771a9ed413b75540424ac990a7e63fe12520fa"} Apr 20 16:26:26.079388 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.079291 2571 generic.go:358] "Generic (PLEG): container finished" podID="ecdbf87a-e49b-4914-9b96-abd064658c90" containerID="051a6a29cc33a0d198218f39ea7e67ae452a92b9cf1e4d5a39ad5e97946897d9" exitCode=0 Apr 20 16:26:26.080011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.079398 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kh54r" event={"ID":"ecdbf87a-e49b-4914-9b96-abd064658c90","Type":"ContainerDied","Data":"051a6a29cc33a0d198218f39ea7e67ae452a92b9cf1e4d5a39ad5e97946897d9"} Apr 20 16:26:26.083145 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.083109 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" event={"ID":"455b95ec-d6c4-4986-99be-0bf8c2e95935","Type":"ContainerStarted","Data":"df60104856ffe806d2cf8b0e698aaba2c77a675841879dfed49d2fd65f9391f0"} Apr 20 16:26:26.116399 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.116334 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-wgtbk" podStartSLOduration=2.698245849 podStartE2EDuration="5.116318256s" podCreationTimestamp="2026-04-20 16:26:21 +0000 UTC" firstStartedPulling="2026-04-20 16:26:22.29624087 +0000 UTC 
m=+185.416417369" lastFinishedPulling="2026-04-20 16:26:24.714313272 +0000 UTC m=+187.834489776" observedRunningTime="2026-04-20 16:26:26.11521572 +0000 UTC m=+189.235392240" watchObservedRunningTime="2026-04-20 16:26:26.116318256 +0000 UTC m=+189.236494775" Apr 20 16:26:26.196761 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.196721 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-78d6dfc49-4vn24"] Apr 20 16:26:26.228496 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.228405 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-78d6dfc49-4vn24"] Apr 20 16:26:26.228688 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.228553 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.232814 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.232231 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 20 16:26:26.232814 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.232679 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-6tzj9\"" Apr 20 16:26:26.232993 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.232868 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-ajph8ujktucia\"" Apr 20 16:26:26.232993 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.232913 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 16:26:26.233152 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.233136 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 20 16:26:26.233576 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.233350 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 20 16:26:26.307722 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.307683 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5qzm\" (UniqueName: \"kubernetes.io/projected/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-kube-api-access-v5qzm\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.307896 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.307844 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-metrics-server-audit-profiles\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.307896 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.307888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-audit-log\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.308005 
ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.307924 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-secret-metrics-server-client-certs\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.308005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.307951 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.308091 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.308005 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-secret-metrics-server-tls\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.308091 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.308029 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-client-ca-bundle\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.408966 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.408934 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-metrics-server-audit-profiles\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.409079 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.408985 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-audit-log\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.409079 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.409022 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-secret-metrics-server-client-certs\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.409079 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.409050 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: 
\"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.409265 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.409090 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-secret-metrics-server-tls\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.409265 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.409116 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-client-ca-bundle\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.409265 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.409200 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5qzm\" (UniqueName: \"kubernetes.io/projected/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-kube-api-access-v5qzm\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.409496 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.409472 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-audit-log\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.409824 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.409797 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.410077 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.410054 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-metrics-server-audit-profiles\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.412234 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.412189 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-secret-metrics-server-client-certs\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.412329 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.412314 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-client-ca-bundle\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " 
pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.412382 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.412359 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-secret-metrics-server-tls\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.418281 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.418262 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5qzm\" (UniqueName: \"kubernetes.io/projected/259deb39-6cd0-4be8-bc17-e4bd29b87f5d-kube-api-access-v5qzm\") pod \"metrics-server-78d6dfc49-4vn24\" (UID: \"259deb39-6cd0-4be8-bc17-e4bd29b87f5d\") " pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.541552 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.541520 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:26.698415 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:26.698381 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-78d6dfc49-4vn24"] Apr 20 16:26:27.026060 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.026025 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-55569768f8-sx8nf"] Apr 20 16:26:27.043496 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.043473 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.046506 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.046482 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 20 16:26:27.046832 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.046813 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 20 16:26:27.047613 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.047596 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 20 16:26:27.047711 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.047678 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 20 16:26:27.048638 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.048619 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-58nj8\"" Apr 20 16:26:27.048759 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.048743 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 20 16:26:27.055113 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.055084 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-55569768f8-sx8nf"] Apr 20 16:26:27.073117 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.073089 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 20 16:26:27.090642 ip-10-0-130-72 kubenswrapper[2571]: I0420 
16:26:27.090611 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kh54r" event={"ID":"ecdbf87a-e49b-4914-9b96-abd064658c90","Type":"ContainerStarted","Data":"0da12d5aa8ffc02b3b06c6fd81b71cc6c6354a7e69ce75f94f576a2913ac57a9"} Apr 20 16:26:27.092456 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.091062 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kh54r" event={"ID":"ecdbf87a-e49b-4914-9b96-abd064658c90","Type":"ContainerStarted","Data":"81231e9fd23006ecd3849b286074fedd98aabc28875a096ba867ac7f170c4ced"} Apr 20 16:26:27.094495 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.094462 2571 generic.go:358] "Generic (PLEG): container finished" podID="55422fe6-0d03-4e20-95b2-44fa62edfdba" containerID="5ba78efda93e510810f044e321da2c226921e7f4638ec85071219ebead69cbc4" exitCode=0 Apr 20 16:26:27.094597 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.094543 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"55422fe6-0d03-4e20-95b2-44fa62edfdba","Type":"ContainerDied","Data":"5ba78efda93e510810f044e321da2c226921e7f4638ec85071219ebead69cbc4"} Apr 20 16:26:27.095927 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.095901 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" event={"ID":"259deb39-6cd0-4be8-bc17-e4bd29b87f5d","Type":"ContainerStarted","Data":"2e4ad47ffb64b8341d5d57d15f4e271122c5e94f6d3ee004c2eae144395e20ba"} Apr 20 16:26:27.109624 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.109579 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-kh54r" podStartSLOduration=4.161569025 podStartE2EDuration="6.109563709s" podCreationTimestamp="2026-04-20 16:26:21 +0000 UTC" firstStartedPulling="2026-04-20 16:26:22.756286854 +0000 UTC m=+185.876463363" lastFinishedPulling="2026-04-20 16:26:24.704281542 +0000 UTC m=+187.824458047" observedRunningTime="2026-04-20 16:26:27.108819254 +0000 UTC m=+190.228995788" watchObservedRunningTime="2026-04-20 16:26:27.109563709 +0000 UTC m=+190.229740277" Apr 20 16:26:27.118724 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.118691 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2d13ef38-5530-4514-98a7-486963962908-federate-client-tls\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.118874 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.118742 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2d13ef38-5530-4514-98a7-486963962908-secret-telemeter-client\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.118874 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.118771 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpx4r\" (UniqueName: \"kubernetes.io/projected/2d13ef38-5530-4514-98a7-486963962908-kube-api-access-hpx4r\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " 
pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.118976 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.118838 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2d13ef38-5530-4514-98a7-486963962908-telemeter-client-tls\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.118976 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.118949 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d13ef38-5530-4514-98a7-486963962908-serving-certs-ca-bundle\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.119071 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.118980 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d13ef38-5530-4514-98a7-486963962908-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.119125 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.119107 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d13ef38-5530-4514-98a7-486963962908-metrics-client-ca\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.119245 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.119224 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d13ef38-5530-4514-98a7-486963962908-telemeter-trusted-ca-bundle\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.219905 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.219821 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d13ef38-5530-4514-98a7-486963962908-telemeter-trusted-ca-bundle\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.220084 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.220038 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2d13ef38-5530-4514-98a7-486963962908-federate-client-tls\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.220149 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.220098 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: 
\"kubernetes.io/secret/2d13ef38-5530-4514-98a7-486963962908-secret-telemeter-client\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.220245 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.220213 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpx4r\" (UniqueName: \"kubernetes.io/projected/2d13ef38-5530-4514-98a7-486963962908-kube-api-access-hpx4r\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.220301 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.220271 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2d13ef38-5530-4514-98a7-486963962908-telemeter-client-tls\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.220459 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.220435 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d13ef38-5530-4514-98a7-486963962908-serving-certs-ca-bundle\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.220550 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.220479 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d13ef38-5530-4514-98a7-486963962908-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.220651 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.220632 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d13ef38-5530-4514-98a7-486963962908-metrics-client-ca\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.220755 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.220731 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d13ef38-5530-4514-98a7-486963962908-telemeter-trusted-ca-bundle\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.221062 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.221031 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d13ef38-5530-4514-98a7-486963962908-serving-certs-ca-bundle\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.221835 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.221810 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/2d13ef38-5530-4514-98a7-486963962908-metrics-client-ca\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.223145 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.223120 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2d13ef38-5530-4514-98a7-486963962908-federate-client-tls\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.223421 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.223398 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2d13ef38-5530-4514-98a7-486963962908-telemeter-client-tls\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.223693 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.223674 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d13ef38-5530-4514-98a7-486963962908-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.223974 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.223956 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2d13ef38-5530-4514-98a7-486963962908-secret-telemeter-client\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.227898 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.227873 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpx4r\" (UniqueName: \"kubernetes.io/projected/2d13ef38-5530-4514-98a7-486963962908-kube-api-access-hpx4r\") pod \"telemeter-client-55569768f8-sx8nf\" (UID: \"2d13ef38-5530-4514-98a7-486963962908\") " pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.355476 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.355439 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" Apr 20 16:26:27.507471 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.507391 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-55569768f8-sx8nf"] Apr 20 16:26:27.515562 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:26:27.515133 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d13ef38_5530_4514_98a7_486963962908.slice/crio-4d9d772d0de50dbe3064feee57ecb2b8b0efd44ac39148c4b41a2b1aa66bd9bd WatchSource:0}: Error finding container 4d9d772d0de50dbe3064feee57ecb2b8b0efd44ac39148c4b41a2b1aa66bd9bd: Status 404 returned error can't find the container with id 4d9d772d0de50dbe3064feee57ecb2b8b0efd44ac39148c4b41a2b1aa66bd9bd Apr 20 16:26:27.609936 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:27.609894 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8464968566-g6xz9"] Apr 20 16:26:28.104155 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.104123 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" event={"ID":"2d13ef38-5530-4514-98a7-486963962908","Type":"ContainerStarted","Data":"4d9d772d0de50dbe3064feee57ecb2b8b0efd44ac39148c4b41a2b1aa66bd9bd"} Apr 20 16:26:28.454265 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.454197 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d746dbf9d-jn56b"] Apr 20 16:26:28.488649 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.488613 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d746dbf9d-jn56b"] Apr 20 16:26:28.488823 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.488798 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.498764 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.498732 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 16:26:28.639107 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.639068 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g45t\" (UniqueName: \"kubernetes.io/projected/4a9daf40-b1c7-4761-9a50-e872ac559162-kube-api-access-9g45t\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.639316 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.639151 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-trusted-ca-bundle\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.639316 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.639212 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9daf40-b1c7-4761-9a50-e872ac559162-console-serving-cert\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.639316 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.639284 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-service-ca\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.639476 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.639341 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-oauth-serving-cert\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.639520 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.639469 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a9daf40-b1c7-4761-9a50-e872ac559162-console-oauth-config\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.639520 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.639502 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-console-config\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.740659 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.740571 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/4a9daf40-b1c7-4761-9a50-e872ac559162-console-oauth-config\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.740659 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.740621 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-console-config\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.740879 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.740679 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9g45t\" (UniqueName: \"kubernetes.io/projected/4a9daf40-b1c7-4761-9a50-e872ac559162-kube-api-access-9g45t\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.740879 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.740731 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-trusted-ca-bundle\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.740879 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.740760 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9daf40-b1c7-4761-9a50-e872ac559162-console-serving-cert\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.740879 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.740803 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-service-ca\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.740879 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.740830 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-oauth-serving-cert\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.741655 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.741624 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-oauth-serving-cert\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.743986 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.743089 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-trusted-ca-bundle\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.743986 ip-10-0-130-72 
kubenswrapper[2571]: I0420 16:26:28.743645 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-console-config\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.744208 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.744187 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-service-ca\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.746079 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.746055 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a9daf40-b1c7-4761-9a50-e872ac559162-console-oauth-config\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.747801 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.747776 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9daf40-b1c7-4761-9a50-e872ac559162-console-serving-cert\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.752604 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.752562 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g45t\" (UniqueName: \"kubernetes.io/projected/4a9daf40-b1c7-4761-9a50-e872ac559162-kube-api-access-9g45t\") pod \"console-5d746dbf9d-jn56b\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:28.803761 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:28.803641 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:30.666903 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:30.666879 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d746dbf9d-jn56b"] Apr 20 16:26:30.669084 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:26:30.669056 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a9daf40_b1c7_4761_9a50_e872ac559162.slice/crio-74e686d050861aa033741d3ddb572c6c5791292a0373c7a815a660ea8ad485ef WatchSource:0}: Error finding container 74e686d050861aa033741d3ddb572c6c5791292a0373c7a815a660ea8ad485ef: Status 404 returned error can't find the container with id 74e686d050861aa033741d3ddb572c6c5791292a0373c7a815a660ea8ad485ef Apr 20 16:26:31.115317 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:31.115239 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" event={"ID":"2d13ef38-5530-4514-98a7-486963962908","Type":"ContainerStarted","Data":"1319accc9aea53bdd72c47e9181f13b2dac0ae9b7a77a022baa0cbc3d97aa5ea"} Apr 20 16:26:31.115317 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:31.115281 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" event={"ID":"2d13ef38-5530-4514-98a7-486963962908","Type":"ContainerStarted","Data":"355777f26a3ca50232936d5cfb5b2e87dcfbb7c8ec2416207f78508674a320e3"} Apr 20 16:26:31.115317 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:31.115296 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" event={"ID":"2d13ef38-5530-4514-98a7-486963962908","Type":"ContainerStarted","Data":"439162065c4237ca6b8ed2426108856fbc49fbf2a698f776fb3ee0c0e6839924"} Apr 20 16:26:31.118242 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:31.118208 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"55422fe6-0d03-4e20-95b2-44fa62edfdba","Type":"ContainerStarted","Data":"4fbe4e80caf3bc0568f96f2222dc27f03f0c84f9524b2d0af3078193832fe16b"} Apr 20 16:26:31.118375 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:31.118250 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"55422fe6-0d03-4e20-95b2-44fa62edfdba","Type":"ContainerStarted","Data":"bdd8e2e022b0c741ff521a8396a95625ce0c2208bc5960d1d408eaae7259b0ef"} Apr 20 16:26:31.118375 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:31.118266 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"55422fe6-0d03-4e20-95b2-44fa62edfdba","Type":"ContainerStarted","Data":"23bfe5d7b75a44a382719a0ccbfab5894dc49ff1a26f6ca4b0d54c227f5ff8c3"} Apr 20 16:26:31.119961 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:31.119901 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" event={"ID":"259deb39-6cd0-4be8-bc17-e4bd29b87f5d","Type":"ContainerStarted","Data":"7444d48e553a4344d0e0e8f0e9b6b146c632c143c03935c459cf651a4a6836fa"} Apr 20 16:26:31.121385 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:31.121360 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d746dbf9d-jn56b" event={"ID":"4a9daf40-b1c7-4761-9a50-e872ac559162","Type":"ContainerStarted","Data":"236182bc87fd7dee70803c7903336e51048359938e403d5e778317566fe5ebe0"} 
Apr 20 16:26:31.121492 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:31.121392 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d746dbf9d-jn56b" event={"ID":"4a9daf40-b1c7-4761-9a50-e872ac559162","Type":"ContainerStarted","Data":"74e686d050861aa033741d3ddb572c6c5791292a0373c7a815a660ea8ad485ef"} Apr 20 16:26:31.140090 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:31.140044 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-55569768f8-sx8nf" podStartSLOduration=1.084199874 podStartE2EDuration="4.140028547s" podCreationTimestamp="2026-04-20 16:26:27 +0000 UTC" firstStartedPulling="2026-04-20 16:26:27.517840063 +0000 UTC m=+190.638016568" lastFinishedPulling="2026-04-20 16:26:30.573668733 +0000 UTC m=+193.693845241" observedRunningTime="2026-04-20 16:26:31.138650049 +0000 UTC m=+194.258826580" watchObservedRunningTime="2026-04-20 16:26:31.140028547 +0000 UTC m=+194.260205061" Apr 20 16:26:31.157571 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:31.157530 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" podStartSLOduration=1.459631882 podStartE2EDuration="5.15751751s" podCreationTimestamp="2026-04-20 16:26:26 +0000 UTC" firstStartedPulling="2026-04-20 16:26:26.870045593 +0000 UTC m=+189.990222090" lastFinishedPulling="2026-04-20 16:26:30.567931216 +0000 UTC m=+193.688107718" observedRunningTime="2026-04-20 16:26:31.156870969 +0000 UTC m=+194.277047490" watchObservedRunningTime="2026-04-20 16:26:31.15751751 +0000 UTC m=+194.277694007" Apr 20 16:26:32.128905 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.128851 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"55422fe6-0d03-4e20-95b2-44fa62edfdba","Type":"ContainerStarted","Data":"67f962b0aba16db0f731d4448ac3553ea1856b53b7fa56456709d2f99cb4d3b5"} Apr 20 16:26:32.129404 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.128914 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"55422fe6-0d03-4e20-95b2-44fa62edfdba","Type":"ContainerStarted","Data":"f24cba510c91737a96d464a0b27329ce06e953dda2954042371fbb028d72e1ca"} Apr 20 16:26:32.200278 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.200222 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d746dbf9d-jn56b" podStartSLOduration=4.200203846 podStartE2EDuration="4.200203846s" podCreationTimestamp="2026-04-20 16:26:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:26:31.173761615 +0000 UTC m=+194.293938136" watchObservedRunningTime="2026-04-20 16:26:32.200203846 +0000 UTC m=+195.320380366" Apr 20 16:26:32.201401 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.201345 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d746dbf9d-jn56b"] Apr 20 16:26:32.246827 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.246793 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-57c6c8c9b7-nqlr4"] Apr 20 16:26:32.257652 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.257621 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.262636 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.262608 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57c6c8c9b7-nqlr4"] Apr 20 16:26:32.379237 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.379135 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01ce0516-60e6-43c2-9944-e860a52dcce2-console-serving-cert\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.379237 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.379197 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-console-config\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.379495 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.379248 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-oauth-serving-cert\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.379495 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.379273 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-trusted-ca-bundle\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.379495 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.379319 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljx7l\" (UniqueName: \"kubernetes.io/projected/01ce0516-60e6-43c2-9944-e860a52dcce2-kube-api-access-ljx7l\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.379495 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.379344 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01ce0516-60e6-43c2-9944-e860a52dcce2-console-oauth-config\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.379495 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.379438 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-service-ca\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.479828 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.479804 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-oauth-serving-cert\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.479906 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.479838 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-trusted-ca-bundle\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.479906 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.479880 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljx7l\" (UniqueName: \"kubernetes.io/projected/01ce0516-60e6-43c2-9944-e860a52dcce2-kube-api-access-ljx7l\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.479974 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.479908 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01ce0516-60e6-43c2-9944-e860a52dcce2-console-oauth-config\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.479974 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.479963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-service-ca\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.480061 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.480032 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01ce0516-60e6-43c2-9944-e860a52dcce2-console-serving-cert\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.480111 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.480064 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-console-config\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.480615 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.480586 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-oauth-serving-cert\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.480735 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.480692 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-trusted-ca-bundle\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.480906 ip-10-0-130-72 
kubenswrapper[2571]: I0420 16:26:32.480889 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-console-config\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.481065 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.481042 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-service-ca\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.482651 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.482630 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01ce0516-60e6-43c2-9944-e860a52dcce2-console-oauth-config\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.482708 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.482680 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01ce0516-60e6-43c2-9944-e860a52dcce2-console-serving-cert\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.488070 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.488047 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljx7l\" (UniqueName: \"kubernetes.io/projected/01ce0516-60e6-43c2-9944-e860a52dcce2-kube-api-access-ljx7l\") pod \"console-57c6c8c9b7-nqlr4\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.568317 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.568285 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:32.697075 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:32.697041 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57c6c8c9b7-nqlr4"] Apr 20 16:26:32.699870 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:26:32.699840 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01ce0516_60e6_43c2_9944_e860a52dcce2.slice/crio-b04308d631cea87b0f0d092a23aaa55094462b2e7a072a4e6f96e7b672926585 WatchSource:0}: Error finding container b04308d631cea87b0f0d092a23aaa55094462b2e7a072a4e6f96e7b672926585: Status 404 returned error can't find the container with id b04308d631cea87b0f0d092a23aaa55094462b2e7a072a4e6f96e7b672926585 Apr 20 16:26:33.133353 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:33.133314 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57c6c8c9b7-nqlr4" event={"ID":"01ce0516-60e6-43c2-9944-e860a52dcce2","Type":"ContainerStarted","Data":"0adf40bf9a1544d55871862e9d25157fb091b885aa37767b67f5548f2acd2662"} Apr 20 16:26:33.133353 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:33.133355 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57c6c8c9b7-nqlr4" event={"ID":"01ce0516-60e6-43c2-9944-e860a52dcce2","Type":"ContainerStarted","Data":"b04308d631cea87b0f0d092a23aaa55094462b2e7a072a4e6f96e7b672926585"} Apr 20 16:26:33.136100 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:33.136069 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"55422fe6-0d03-4e20-95b2-44fa62edfdba","Type":"ContainerStarted","Data":"707eed7ea7e27667f0f0fbbc5ca3459d46f217b43e5824c932fa9afd04e5168f"} Apr 20 16:26:33.150036 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:33.149988 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57c6c8c9b7-nqlr4" podStartSLOduration=1.149973035 podStartE2EDuration="1.149973035s" podCreationTimestamp="2026-04-20 16:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:26:33.149089303 +0000 UTC m=+196.269265823" watchObservedRunningTime="2026-04-20 16:26:33.149973035 +0000 UTC m=+196.270149557" Apr 20 16:26:33.179276 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:33.179223 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.593981262 podStartE2EDuration="11.179209575s" podCreationTimestamp="2026-04-20 16:26:22 +0000 UTC" firstStartedPulling="2026-04-20 16:26:24.888613399 +0000 UTC m=+188.008789898" lastFinishedPulling="2026-04-20 16:26:32.473841706 +0000 UTC m=+195.594018211" observedRunningTime="2026-04-20 16:26:33.176953547 +0000 UTC m=+196.297130062" watchObservedRunningTime="2026-04-20 16:26:33.179209575 +0000 UTC m=+196.299386094" Apr 20 16:26:38.804630 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:38.804592 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:42.568947 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:42.568908 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:42.568947 ip-10-0-130-72 kubenswrapper[2571]: I0420 
16:26:42.568952 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:42.573845 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:42.573823 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:43.173604 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:43.173573 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:26:46.542280 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:46.542240 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:46.542280 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:46.542284 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:26:52.638782 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:52.638720 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8464968566-g6xz9" podUID="a20be14d-583a-4f25-950e-23a88f6e512a" containerName="console" containerID="cri-o://f4ae8d75dc2e5ed93568ffde26515493a731a7488fc12c97a6146c31d38ae297" gracePeriod=15 Apr 20 16:26:52.895011 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:52.894957 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8464968566-g6xz9_a20be14d-583a-4f25-950e-23a88f6e512a/console/0.log" Apr 20 16:26:52.895114 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:52.895031 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:52.971791 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:52.971753 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a20be14d-583a-4f25-950e-23a88f6e512a-console-oauth-config\") pod \"a20be14d-583a-4f25-950e-23a88f6e512a\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " Apr 20 16:26:52.971981 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:52.971838 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a20be14d-583a-4f25-950e-23a88f6e512a-console-serving-cert\") pod \"a20be14d-583a-4f25-950e-23a88f6e512a\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " Apr 20 16:26:52.971981 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:52.971875 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-service-ca\") pod \"a20be14d-583a-4f25-950e-23a88f6e512a\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " Apr 20 16:26:52.972101 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:52.972040 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn629\" (UniqueName: \"kubernetes.io/projected/a20be14d-583a-4f25-950e-23a88f6e512a-kube-api-access-pn629\") pod \"a20be14d-583a-4f25-950e-23a88f6e512a\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " Apr 20 16:26:52.972180 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:52.972098 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-oauth-serving-cert\") pod \"a20be14d-583a-4f25-950e-23a88f6e512a\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " Apr 20 16:26:52.972180 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:52.972125 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-console-config\") pod \"a20be14d-583a-4f25-950e-23a88f6e512a\" (UID: \"a20be14d-583a-4f25-950e-23a88f6e512a\") " Apr 20 16:26:52.972287 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:52.972265 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-service-ca" (OuterVolumeSpecName: "service-ca") pod "a20be14d-583a-4f25-950e-23a88f6e512a" (UID: "a20be14d-583a-4f25-950e-23a88f6e512a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:26:52.972497 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:52.972470 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-service-ca\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:26:52.972497 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:52.972463 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a20be14d-583a-4f25-950e-23a88f6e512a" (UID: "a20be14d-583a-4f25-950e-23a88f6e512a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:26:52.972819 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:52.972542 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-console-config" (OuterVolumeSpecName: "console-config") pod "a20be14d-583a-4f25-950e-23a88f6e512a" (UID: "a20be14d-583a-4f25-950e-23a88f6e512a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:26:52.974362 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:52.974328 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a20be14d-583a-4f25-950e-23a88f6e512a-kube-api-access-pn629" (OuterVolumeSpecName: "kube-api-access-pn629") pod "a20be14d-583a-4f25-950e-23a88f6e512a" (UID: "a20be14d-583a-4f25-950e-23a88f6e512a"). InnerVolumeSpecName "kube-api-access-pn629". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:26:52.974715 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:52.974692 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a20be14d-583a-4f25-950e-23a88f6e512a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a20be14d-583a-4f25-950e-23a88f6e512a" (UID: "a20be14d-583a-4f25-950e-23a88f6e512a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 16:26:52.974807 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:52.974712 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a20be14d-583a-4f25-950e-23a88f6e512a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a20be14d-583a-4f25-950e-23a88f6e512a" (UID: "a20be14d-583a-4f25-950e-23a88f6e512a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 16:26:53.073541 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:53.073488 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pn629\" (UniqueName: \"kubernetes.io/projected/a20be14d-583a-4f25-950e-23a88f6e512a-kube-api-access-pn629\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:26:53.073541 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:53.073533 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-oauth-serving-cert\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:26:53.073541 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:53.073543 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a20be14d-583a-4f25-950e-23a88f6e512a-console-config\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:26:53.073761 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:53.073554 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a20be14d-583a-4f25-950e-23a88f6e512a-console-oauth-config\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:26:53.073761 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:53.073562 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a20be14d-583a-4f25-950e-23a88f6e512a-console-serving-cert\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:26:53.198859 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:53.198781 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8464968566-g6xz9_a20be14d-583a-4f25-950e-23a88f6e512a/console/0.log" Apr 20 16:26:53.198859 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:53.198818 2571 generic.go:358] "Generic (PLEG): container finished" podID="a20be14d-583a-4f25-950e-23a88f6e512a" containerID="f4ae8d75dc2e5ed93568ffde26515493a731a7488fc12c97a6146c31d38ae297" exitCode=2 Apr 20 16:26:53.199036 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:53.198870 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8464968566-g6xz9" event={"ID":"a20be14d-583a-4f25-950e-23a88f6e512a","Type":"ContainerDied","Data":"f4ae8d75dc2e5ed93568ffde26515493a731a7488fc12c97a6146c31d38ae297"} Apr 20 16:26:53.199036 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:53.198879 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8464968566-g6xz9" Apr 20 16:26:53.199036 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:53.198894 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8464968566-g6xz9" event={"ID":"a20be14d-583a-4f25-950e-23a88f6e512a","Type":"ContainerDied","Data":"c391c95661066edffc34d42414746830e8121b580094685343741a54c9b3bfe9"} Apr 20 16:26:53.199036 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:53.198908 2571 scope.go:117] "RemoveContainer" containerID="f4ae8d75dc2e5ed93568ffde26515493a731a7488fc12c97a6146c31d38ae297" Apr 20 16:26:53.210726 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:53.210704 2571 scope.go:117] "RemoveContainer" containerID="f4ae8d75dc2e5ed93568ffde26515493a731a7488fc12c97a6146c31d38ae297" Apr 20 16:26:53.211016 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:26:53.210992 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4ae8d75dc2e5ed93568ffde26515493a731a7488fc12c97a6146c31d38ae297\": container with ID starting with f4ae8d75dc2e5ed93568ffde26515493a731a7488fc12c97a6146c31d38ae297 not found: ID does not exist" containerID="f4ae8d75dc2e5ed93568ffde26515493a731a7488fc12c97a6146c31d38ae297" Apr 20 16:26:53.211088 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:53.211022 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4ae8d75dc2e5ed93568ffde26515493a731a7488fc12c97a6146c31d38ae297"} err="failed to get container status \"f4ae8d75dc2e5ed93568ffde26515493a731a7488fc12c97a6146c31d38ae297\": rpc error: code = NotFound desc = could not find container \"f4ae8d75dc2e5ed93568ffde26515493a731a7488fc12c97a6146c31d38ae297\": container with ID starting with f4ae8d75dc2e5ed93568ffde26515493a731a7488fc12c97a6146c31d38ae297 not found: ID does not exist" Apr 20 16:26:53.221109 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:53.221081 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8464968566-g6xz9"] Apr 20 16:26:53.228691 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:53.228670 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8464968566-g6xz9"] Apr 20 16:26:53.421992 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:53.421956 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a20be14d-583a-4f25-950e-23a88f6e512a" path="/var/lib/kubelet/pods/a20be14d-583a-4f25-950e-23a88f6e512a/volumes" Apr 20 16:26:56.208936 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:56.208901 2571 generic.go:358] "Generic (PLEG): container finished" podID="164b3066-3171-4ff5-b023-f49f644b1d28" containerID="eb3d075aacf7f6400164f0122f8792c74e49a6149c18dee608eef6b4b0f1bfde" exitCode=0 Apr 20 16:26:56.209360 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:56.208986 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh" event={"ID":"164b3066-3171-4ff5-b023-f49f644b1d28","Type":"ContainerDied","Data":"eb3d075aacf7f6400164f0122f8792c74e49a6149c18dee608eef6b4b0f1bfde"} Apr 20 16:26:56.209430 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:56.209388 2571 scope.go:117] "RemoveContainer" containerID="eb3d075aacf7f6400164f0122f8792c74e49a6149c18dee608eef6b4b0f1bfde" Apr 20 16:26:57.213743 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:57.213705 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-26lqh" event={"ID":"164b3066-3171-4ff5-b023-f49f644b1d28","Type":"ContainerStarted","Data":"a00af41e49aa545bb5436b73e22a25d6ab72e37209e38cce43a7e489153afd62"} Apr 20 16:26:58.156248 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.156157 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5d746dbf9d-jn56b" podUID="4a9daf40-b1c7-4761-9a50-e872ac559162" containerName="console" containerID="cri-o://236182bc87fd7dee70803c7903336e51048359938e403d5e778317566fe5ebe0" gracePeriod=15 Apr 20 16:26:58.415059 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.414992 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d746dbf9d-jn56b_4a9daf40-b1c7-4761-9a50-e872ac559162/console/0.log" Apr 20 16:26:58.415443 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.415067 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:58.520048 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.520015 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9daf40-b1c7-4761-9a50-e872ac559162-console-serving-cert\") pod \"4a9daf40-b1c7-4761-9a50-e872ac559162\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " Apr 20 16:26:58.520233 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.520073 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-trusted-ca-bundle\") pod \"4a9daf40-b1c7-4761-9a50-e872ac559162\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " Apr 20 16:26:58.520233 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.520095 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g45t\" (UniqueName: \"kubernetes.io/projected/4a9daf40-b1c7-4761-9a50-e872ac559162-kube-api-access-9g45t\") pod \"4a9daf40-b1c7-4761-9a50-e872ac559162\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " Apr 20 16:26:58.520328 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.520237 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-console-config\") pod \"4a9daf40-b1c7-4761-9a50-e872ac559162\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " Apr 20 16:26:58.520328 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.520280 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-oauth-serving-cert\") pod \"4a9daf40-b1c7-4761-9a50-e872ac559162\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " Apr 20 16:26:58.520428 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.520325 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-service-ca\") pod \"4a9daf40-b1c7-4761-9a50-e872ac559162\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " Apr 20 16:26:58.520428 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.520371 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/4a9daf40-b1c7-4761-9a50-e872ac559162-console-oauth-config\") pod \"4a9daf40-b1c7-4761-9a50-e872ac559162\" (UID: \"4a9daf40-b1c7-4761-9a50-e872ac559162\") " Apr 20 16:26:58.520613 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.520544 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4a9daf40-b1c7-4761-9a50-e872ac559162" (UID: "4a9daf40-b1c7-4761-9a50-e872ac559162"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:26:58.520720 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.520692 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-console-config" (OuterVolumeSpecName: "console-config") pod "4a9daf40-b1c7-4761-9a50-e872ac559162" (UID: "4a9daf40-b1c7-4761-9a50-e872ac559162"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:26:58.520838 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.520789 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-console-config\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:26:58.520838 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.520808 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-trusted-ca-bundle\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:26:58.520838 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.520817 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4a9daf40-b1c7-4761-9a50-e872ac559162" (UID: "4a9daf40-b1c7-4761-9a50-e872ac559162"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:26:58.520838 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.520820 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-service-ca" (OuterVolumeSpecName: "service-ca") pod "4a9daf40-b1c7-4761-9a50-e872ac559162" (UID: "4a9daf40-b1c7-4761-9a50-e872ac559162"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:26:58.522552 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.522531 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a9daf40-b1c7-4761-9a50-e872ac559162-kube-api-access-9g45t" (OuterVolumeSpecName: "kube-api-access-9g45t") pod "4a9daf40-b1c7-4761-9a50-e872ac559162" (UID: "4a9daf40-b1c7-4761-9a50-e872ac559162"). InnerVolumeSpecName "kube-api-access-9g45t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:26:58.523031 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.523008 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9daf40-b1c7-4761-9a50-e872ac559162-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4a9daf40-b1c7-4761-9a50-e872ac559162" (UID: "4a9daf40-b1c7-4761-9a50-e872ac559162"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 16:26:58.523102 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.523032 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9daf40-b1c7-4761-9a50-e872ac559162-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4a9daf40-b1c7-4761-9a50-e872ac559162" (UID: "4a9daf40-b1c7-4761-9a50-e872ac559162"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 16:26:58.622189 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.622123 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9g45t\" (UniqueName: \"kubernetes.io/projected/4a9daf40-b1c7-4761-9a50-e872ac559162-kube-api-access-9g45t\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:26:58.622189 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.622159 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-oauth-serving-cert\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:26:58.622394 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.622202 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a9daf40-b1c7-4761-9a50-e872ac559162-service-ca\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:26:58.622394 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.622216 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a9daf40-b1c7-4761-9a50-e872ac559162-console-oauth-config\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:26:58.622394 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:58.622229 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9daf40-b1c7-4761-9a50-e872ac559162-console-serving-cert\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:26:59.220479 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:59.220451 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d746dbf9d-jn56b_4a9daf40-b1c7-4761-9a50-e872ac559162/console/0.log" Apr 20 16:26:59.220642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:59.220503 2571 generic.go:358] "Generic (PLEG): container finished" podID="4a9daf40-b1c7-4761-9a50-e872ac559162" containerID="236182bc87fd7dee70803c7903336e51048359938e403d5e778317566fe5ebe0" exitCode=2 Apr 20 16:26:59.220642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:59.220538 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d746dbf9d-jn56b" event={"ID":"4a9daf40-b1c7-4761-9a50-e872ac559162","Type":"ContainerDied","Data":"236182bc87fd7dee70803c7903336e51048359938e403d5e778317566fe5ebe0"} Apr 20 16:26:59.220642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:59.220579 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d746dbf9d-jn56b" event={"ID":"4a9daf40-b1c7-4761-9a50-e872ac559162","Type":"ContainerDied","Data":"74e686d050861aa033741d3ddb572c6c5791292a0373c7a815a660ea8ad485ef"} Apr 20 16:26:59.220642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:59.220575 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d746dbf9d-jn56b" Apr 20 16:26:59.220642 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:59.220592 2571 scope.go:117] "RemoveContainer" containerID="236182bc87fd7dee70803c7903336e51048359938e403d5e778317566fe5ebe0" Apr 20 16:26:59.228831 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:59.228815 2571 scope.go:117] "RemoveContainer" containerID="236182bc87fd7dee70803c7903336e51048359938e403d5e778317566fe5ebe0" Apr 20 16:26:59.229079 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:26:59.229062 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"236182bc87fd7dee70803c7903336e51048359938e403d5e778317566fe5ebe0\": container with ID starting with 236182bc87fd7dee70803c7903336e51048359938e403d5e778317566fe5ebe0 not found: ID does not exist" containerID="236182bc87fd7dee70803c7903336e51048359938e403d5e778317566fe5ebe0" Apr 20 16:26:59.229119 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:59.229088 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236182bc87fd7dee70803c7903336e51048359938e403d5e778317566fe5ebe0"} err="failed to get container status \"236182bc87fd7dee70803c7903336e51048359938e403d5e778317566fe5ebe0\": rpc error: code = NotFound desc = could not find container \"236182bc87fd7dee70803c7903336e51048359938e403d5e778317566fe5ebe0\": container with ID starting with 236182bc87fd7dee70803c7903336e51048359938e403d5e778317566fe5ebe0 not found: ID does not exist" Apr 20 16:26:59.241799 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:59.241772 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d746dbf9d-jn56b"] Apr 20 16:26:59.247650 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:59.247630 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5d746dbf9d-jn56b"] Apr 20 16:26:59.422520 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:26:59.422487 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a9daf40-b1c7-4761-9a50-e872ac559162" path="/var/lib/kubelet/pods/4a9daf40-b1c7-4761-9a50-e872ac559162/volumes" Apr 20 16:27:06.548415 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:06.548380 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:27:06.552401 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:06.552377 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-78d6dfc49-4vn24" Apr 20 16:27:28.276763 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:28.276711 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs\") pod \"network-metrics-daemon-rxwd9\" (UID: \"ff512ace-f73c-4265-890e-b43c9ecc782d\") " pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:27:28.279207 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:28.279186 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff512ace-f73c-4265-890e-b43c9ecc782d-metrics-certs\") pod \"network-metrics-daemon-rxwd9\" (UID: \"ff512ace-f73c-4265-890e-b43c9ecc782d\") " pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:27:28.521399 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:28.521366 2571 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xjxmn\"" Apr 20 16:27:28.529627 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:28.529566 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxwd9" Apr 20 16:27:28.655182 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:28.655069 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rxwd9"] Apr 20 16:27:28.657947 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:27:28.657922 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff512ace_f73c_4265_890e_b43c9ecc782d.slice/crio-cd40e09e8381a16064fb2696a8115f9ebbd6b47c3a8d4251dec5757498bc2907 WatchSource:0}: Error finding container cd40e09e8381a16064fb2696a8115f9ebbd6b47c3a8d4251dec5757498bc2907: Status 404 returned error can't find the container with id cd40e09e8381a16064fb2696a8115f9ebbd6b47c3a8d4251dec5757498bc2907 Apr 20 16:27:29.310414 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:29.310370 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rxwd9" event={"ID":"ff512ace-f73c-4265-890e-b43c9ecc782d","Type":"ContainerStarted","Data":"cd40e09e8381a16064fb2696a8115f9ebbd6b47c3a8d4251dec5757498bc2907"} Apr 20 16:27:30.314668 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:30.314633 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rxwd9" event={"ID":"ff512ace-f73c-4265-890e-b43c9ecc782d","Type":"ContainerStarted","Data":"150b0838acf467a8eb37e555fa7cdcb2e85f730bcb18efc501e6fa173b0b0bce"} Apr 20 16:27:30.314668 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:30.314676 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rxwd9" event={"ID":"ff512ace-f73c-4265-890e-b43c9ecc782d","Type":"ContainerStarted","Data":"9863dcd61e3b0fe26cd2d64e85cad4468c3767100df9c25a95f2ec4a4357288d"} Apr 20 16:27:30.332545 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:30.332497 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rxwd9" podStartSLOduration=252.495514538 podStartE2EDuration="4m13.332482754s" podCreationTimestamp="2026-04-20 16:23:17 +0000 UTC" firstStartedPulling="2026-04-20 16:27:28.659703028 +0000 UTC m=+251.779879526" lastFinishedPulling="2026-04-20 16:27:29.496671243 +0000 UTC m=+252.616847742" observedRunningTime="2026-04-20 16:27:30.330371158 +0000 UTC m=+253.450547680" watchObservedRunningTime="2026-04-20 16:27:30.332482754 +0000 UTC m=+253.452659273" Apr 20 16:27:38.092957 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.092866 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-dfd8478dd-8ng6t"] Apr 20 16:27:38.093401 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.093211 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a20be14d-583a-4f25-950e-23a88f6e512a" containerName="console" Apr 20 16:27:38.093401 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.093224 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a20be14d-583a-4f25-950e-23a88f6e512a" containerName="console" Apr 20 16:27:38.093401 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.093236 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a9daf40-b1c7-4761-9a50-e872ac559162" 
containerName="console" Apr 20 16:27:38.093401 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.093242 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9daf40-b1c7-4761-9a50-e872ac559162" containerName="console" Apr 20 16:27:38.093401 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.093291 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a20be14d-583a-4f25-950e-23a88f6e512a" containerName="console" Apr 20 16:27:38.093401 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.093299 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a9daf40-b1c7-4761-9a50-e872ac559162" containerName="console" Apr 20 16:27:38.096305 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.096281 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.107186 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.107140 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dfd8478dd-8ng6t"] Apr 20 16:27:38.157253 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.157216 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-console-config\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.157428 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.157260 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sz4b\" (UniqueName: \"kubernetes.io/projected/f8980c74-431a-4289-b5e9-24435034c04a-kube-api-access-6sz4b\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.157428 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.157299 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8980c74-431a-4289-b5e9-24435034c04a-console-oauth-config\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.157428 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.157334 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-trusted-ca-bundle\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.157606 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.157425 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-oauth-serving-cert\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.157606 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.157469 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8980c74-431a-4289-b5e9-24435034c04a-console-serving-cert\") pod \"console-dfd8478dd-8ng6t\" (UID: 
\"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.157606 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.157527 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-service-ca\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.258531 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.258486 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-service-ca\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.258723 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.258551 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-console-config\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.258723 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.258575 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6sz4b\" (UniqueName: \"kubernetes.io/projected/f8980c74-431a-4289-b5e9-24435034c04a-kube-api-access-6sz4b\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.258723 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.258610 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8980c74-431a-4289-b5e9-24435034c04a-console-oauth-config\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.258723 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.258640 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-trusted-ca-bundle\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.258723 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.258693 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-oauth-serving-cert\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.258723 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.258717 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8980c74-431a-4289-b5e9-24435034c04a-console-serving-cert\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.259273 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.259247 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-service-ca\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.259577 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.259539 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-console-config\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.259577 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.259553 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-oauth-serving-cert\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.259698 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.259610 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-trusted-ca-bundle\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.261238 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.261212 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8980c74-431a-4289-b5e9-24435034c04a-console-oauth-config\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.261392 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.261378 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8980c74-431a-4289-b5e9-24435034c04a-console-serving-cert\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.266147 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.266130 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sz4b\" (UniqueName: \"kubernetes.io/projected/f8980c74-431a-4289-b5e9-24435034c04a-kube-api-access-6sz4b\") pod \"console-dfd8478dd-8ng6t\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.405962 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.405925 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:38.550181 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:38.550135 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dfd8478dd-8ng6t"] Apr 20 16:27:38.552588 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:27:38.552556 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8980c74_431a_4289_b5e9_24435034c04a.slice/crio-df76a8dcf474da98b376fabb7931e0bb0faccdff6389a28d908ae862e9ddfa8f WatchSource:0}: Error finding container df76a8dcf474da98b376fabb7931e0bb0faccdff6389a28d908ae862e9ddfa8f: Status 404 returned error can't find the container with id df76a8dcf474da98b376fabb7931e0bb0faccdff6389a28d908ae862e9ddfa8f Apr 20 16:27:39.341564 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:39.341525 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dfd8478dd-8ng6t" event={"ID":"f8980c74-431a-4289-b5e9-24435034c04a","Type":"ContainerStarted","Data":"e5c40353fe80bbe85d23f44b2a6d5c072012aff25796704f7a898184edd4b127"} Apr 20 16:27:39.341564 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:39.341563 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dfd8478dd-8ng6t" event={"ID":"f8980c74-431a-4289-b5e9-24435034c04a","Type":"ContainerStarted","Data":"df76a8dcf474da98b376fabb7931e0bb0faccdff6389a28d908ae862e9ddfa8f"} Apr 20 16:27:39.360945 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:39.360895 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-dfd8478dd-8ng6t" podStartSLOduration=1.360882489 podStartE2EDuration="1.360882489s" podCreationTimestamp="2026-04-20 16:27:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:27:39.358739613 +0000 UTC m=+262.478916132" watchObservedRunningTime="2026-04-20 16:27:39.360882489 +0000 UTC m=+262.481059010" Apr 20 16:27:48.406605 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:48.406563 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:48.407077 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:48.406641 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:48.411480 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:48.411460 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:49.376629 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:49.376594 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:27:49.430104 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:27:49.430039 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57c6c8c9b7-nqlr4"] Apr 20 16:28:14.452485 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.452422 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-57c6c8c9b7-nqlr4" podUID="01ce0516-60e6-43c2-9944-e860a52dcce2" containerName="console" containerID="cri-o://0adf40bf9a1544d55871862e9d25157fb091b885aa37767b67f5548f2acd2662" gracePeriod=15 Apr 20 16:28:14.694837 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.694814 2571 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console_console-57c6c8c9b7-nqlr4_01ce0516-60e6-43c2-9944-e860a52dcce2/console/0.log" Apr 20 16:28:14.694963 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.694874 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:28:14.783421 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.783341 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01ce0516-60e6-43c2-9944-e860a52dcce2-console-oauth-config\") pod \"01ce0516-60e6-43c2-9944-e860a52dcce2\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " Apr 20 16:28:14.783421 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.783374 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-trusted-ca-bundle\") pod \"01ce0516-60e6-43c2-9944-e860a52dcce2\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " Apr 20 16:28:14.783421 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.783396 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01ce0516-60e6-43c2-9944-e860a52dcce2-console-serving-cert\") pod \"01ce0516-60e6-43c2-9944-e860a52dcce2\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " Apr 20 16:28:14.783685 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.783437 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljx7l\" (UniqueName: \"kubernetes.io/projected/01ce0516-60e6-43c2-9944-e860a52dcce2-kube-api-access-ljx7l\") pod \"01ce0516-60e6-43c2-9944-e860a52dcce2\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " Apr 20 16:28:14.783685 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.783458 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-console-config\") pod \"01ce0516-60e6-43c2-9944-e860a52dcce2\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " Apr 20 16:28:14.783685 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.783476 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-service-ca\") pod \"01ce0516-60e6-43c2-9944-e860a52dcce2\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " Apr 20 16:28:14.783685 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.783533 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-oauth-serving-cert\") pod \"01ce0516-60e6-43c2-9944-e860a52dcce2\" (UID: \"01ce0516-60e6-43c2-9944-e860a52dcce2\") " Apr 20 16:28:14.783885 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.783826 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "01ce0516-60e6-43c2-9944-e860a52dcce2" (UID: "01ce0516-60e6-43c2-9944-e860a52dcce2"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:28:14.783951 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.783919 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-service-ca" (OuterVolumeSpecName: "service-ca") pod "01ce0516-60e6-43c2-9944-e860a52dcce2" (UID: "01ce0516-60e6-43c2-9944-e860a52dcce2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:28:14.784005 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.783944 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-console-config" (OuterVolumeSpecName: "console-config") pod "01ce0516-60e6-43c2-9944-e860a52dcce2" (UID: "01ce0516-60e6-43c2-9944-e860a52dcce2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:28:14.784057 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.784006 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "01ce0516-60e6-43c2-9944-e860a52dcce2" (UID: "01ce0516-60e6-43c2-9944-e860a52dcce2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:28:14.785696 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.785665 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ce0516-60e6-43c2-9944-e860a52dcce2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "01ce0516-60e6-43c2-9944-e860a52dcce2" (UID: "01ce0516-60e6-43c2-9944-e860a52dcce2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 16:28:14.786152 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.786133 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ce0516-60e6-43c2-9944-e860a52dcce2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "01ce0516-60e6-43c2-9944-e860a52dcce2" (UID: "01ce0516-60e6-43c2-9944-e860a52dcce2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 16:28:14.786152 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.786136 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ce0516-60e6-43c2-9944-e860a52dcce2-kube-api-access-ljx7l" (OuterVolumeSpecName: "kube-api-access-ljx7l") pod "01ce0516-60e6-43c2-9944-e860a52dcce2" (UID: "01ce0516-60e6-43c2-9944-e860a52dcce2"). InnerVolumeSpecName "kube-api-access-ljx7l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:28:14.884258 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.884205 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ljx7l\" (UniqueName: \"kubernetes.io/projected/01ce0516-60e6-43c2-9944-e860a52dcce2-kube-api-access-ljx7l\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:28:14.884258 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.884253 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-console-config\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:28:14.884258 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.884263 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-service-ca\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:28:14.884258 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.884272 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-oauth-serving-cert\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:28:14.884525 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.884282 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01ce0516-60e6-43c2-9944-e860a52dcce2-console-oauth-config\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:28:14.884525 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.884291 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01ce0516-60e6-43c2-9944-e860a52dcce2-trusted-ca-bundle\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:28:14.884525 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:14.884300 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01ce0516-60e6-43c2-9944-e860a52dcce2-console-serving-cert\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:28:15.454795 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.454771 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57c6c8c9b7-nqlr4_01ce0516-60e6-43c2-9944-e860a52dcce2/console/0.log" Apr 20 16:28:15.455210 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.454816 2571 generic.go:358] "Generic (PLEG): container finished" podID="01ce0516-60e6-43c2-9944-e860a52dcce2" containerID="0adf40bf9a1544d55871862e9d25157fb091b885aa37767b67f5548f2acd2662" exitCode=2 Apr 20 16:28:15.455210 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.454851 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57c6c8c9b7-nqlr4" event={"ID":"01ce0516-60e6-43c2-9944-e860a52dcce2","Type":"ContainerDied","Data":"0adf40bf9a1544d55871862e9d25157fb091b885aa37767b67f5548f2acd2662"} Apr 20 16:28:15.455210 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.454878 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57c6c8c9b7-nqlr4" event={"ID":"01ce0516-60e6-43c2-9944-e860a52dcce2","Type":"ContainerDied","Data":"b04308d631cea87b0f0d092a23aaa55094462b2e7a072a4e6f96e7b672926585"} Apr 20 16:28:15.455210 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.454897 2571 scope.go:117] 
"RemoveContainer" containerID="0adf40bf9a1544d55871862e9d25157fb091b885aa37767b67f5548f2acd2662" Apr 20 16:28:15.455210 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.454894 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57c6c8c9b7-nqlr4" Apr 20 16:28:15.462958 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.462941 2571 scope.go:117] "RemoveContainer" containerID="0adf40bf9a1544d55871862e9d25157fb091b885aa37767b67f5548f2acd2662" Apr 20 16:28:15.463247 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:28:15.463224 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0adf40bf9a1544d55871862e9d25157fb091b885aa37767b67f5548f2acd2662\": container with ID starting with 0adf40bf9a1544d55871862e9d25157fb091b885aa37767b67f5548f2acd2662 not found: ID does not exist" containerID="0adf40bf9a1544d55871862e9d25157fb091b885aa37767b67f5548f2acd2662" Apr 20 16:28:15.463353 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.463254 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0adf40bf9a1544d55871862e9d25157fb091b885aa37767b67f5548f2acd2662"} err="failed to get container status \"0adf40bf9a1544d55871862e9d25157fb091b885aa37767b67f5548f2acd2662\": rpc error: code = NotFound desc = could not find container \"0adf40bf9a1544d55871862e9d25157fb091b885aa37767b67f5548f2acd2662\": container with ID starting with 0adf40bf9a1544d55871862e9d25157fb091b885aa37767b67f5548f2acd2662 not found: ID does not exist" Apr 20 16:28:15.473190 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.473148 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57c6c8c9b7-nqlr4"] Apr 20 16:28:15.476493 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.476463 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-57c6c8c9b7-nqlr4"] Apr 20 16:28:15.948781 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.948743 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-x9ldf"] Apr 20 16:28:15.949115 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.949100 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01ce0516-60e6-43c2-9944-e860a52dcce2" containerName="console" Apr 20 16:28:15.949183 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.949118 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ce0516-60e6-43c2-9944-e860a52dcce2" containerName="console" Apr 20 16:28:15.949225 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.949194 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="01ce0516-60e6-43c2-9944-e860a52dcce2" containerName="console" Apr 20 16:28:15.953894 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.953876 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9ldf" Apr 20 16:28:15.956846 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.956818 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 16:28:15.958463 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.958441 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x9ldf"] Apr 20 16:28:15.995316 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.995286 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1a2e82d-6f85-471f-a08a-55a114f41ec6-original-pull-secret\") pod \"global-pull-secret-syncer-x9ldf\" (UID: \"d1a2e82d-6f85-471f-a08a-55a114f41ec6\") " pod="kube-system/global-pull-secret-syncer-x9ldf" Apr 20 16:28:15.995477 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.995335 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d1a2e82d-6f85-471f-a08a-55a114f41ec6-kubelet-config\") pod \"global-pull-secret-syncer-x9ldf\" (UID: \"d1a2e82d-6f85-471f-a08a-55a114f41ec6\") " pod="kube-system/global-pull-secret-syncer-x9ldf" Apr 20 16:28:15.995477 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:15.995454 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d1a2e82d-6f85-471f-a08a-55a114f41ec6-dbus\") pod \"global-pull-secret-syncer-x9ldf\" (UID: \"d1a2e82d-6f85-471f-a08a-55a114f41ec6\") " pod="kube-system/global-pull-secret-syncer-x9ldf" Apr 20 16:28:16.096828 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:16.096795 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d1a2e82d-6f85-471f-a08a-55a114f41ec6-kubelet-config\") pod \"global-pull-secret-syncer-x9ldf\" (UID: \"d1a2e82d-6f85-471f-a08a-55a114f41ec6\") " pod="kube-system/global-pull-secret-syncer-x9ldf" Apr 20 16:28:16.096953 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:16.096855 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d1a2e82d-6f85-471f-a08a-55a114f41ec6-dbus\") pod \"global-pull-secret-syncer-x9ldf\" (UID: \"d1a2e82d-6f85-471f-a08a-55a114f41ec6\") " pod="kube-system/global-pull-secret-syncer-x9ldf" Apr 20 16:28:16.096953 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:16.096885 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1a2e82d-6f85-471f-a08a-55a114f41ec6-original-pull-secret\") pod \"global-pull-secret-syncer-x9ldf\" (UID: \"d1a2e82d-6f85-471f-a08a-55a114f41ec6\") " pod="kube-system/global-pull-secret-syncer-x9ldf" Apr 20 16:28:16.096953 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:16.096922 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d1a2e82d-6f85-471f-a08a-55a114f41ec6-kubelet-config\") pod \"global-pull-secret-syncer-x9ldf\" (UID: \"d1a2e82d-6f85-471f-a08a-55a114f41ec6\") " pod="kube-system/global-pull-secret-syncer-x9ldf" Apr 20 16:28:16.097070 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:16.097036 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"dbus\" (UniqueName: \"kubernetes.io/host-path/d1a2e82d-6f85-471f-a08a-55a114f41ec6-dbus\") pod \"global-pull-secret-syncer-x9ldf\" (UID: \"d1a2e82d-6f85-471f-a08a-55a114f41ec6\") " pod="kube-system/global-pull-secret-syncer-x9ldf" Apr 20 16:28:16.099372 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:16.099348 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1a2e82d-6f85-471f-a08a-55a114f41ec6-original-pull-secret\") pod \"global-pull-secret-syncer-x9ldf\" (UID: \"d1a2e82d-6f85-471f-a08a-55a114f41ec6\") " pod="kube-system/global-pull-secret-syncer-x9ldf" Apr 20 16:28:16.263826 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:16.263749 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9ldf" Apr 20 16:28:16.382699 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:16.382625 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x9ldf"] Apr 20 16:28:16.385419 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:28:16.385391 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1a2e82d_6f85_471f_a08a_55a114f41ec6.slice/crio-ff98e58774f214c4db95bd10f01fa971ee2c3466b67d50c83a1ed0f7b097e8fe WatchSource:0}: Error finding container ff98e58774f214c4db95bd10f01fa971ee2c3466b67d50c83a1ed0f7b097e8fe: Status 404 returned error can't find the container with id ff98e58774f214c4db95bd10f01fa971ee2c3466b67d50c83a1ed0f7b097e8fe Apr 20 16:28:16.459946 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:16.459911 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x9ldf" event={"ID":"d1a2e82d-6f85-471f-a08a-55a114f41ec6","Type":"ContainerStarted","Data":"ff98e58774f214c4db95bd10f01fa971ee2c3466b67d50c83a1ed0f7b097e8fe"} Apr 20 16:28:17.312318 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:17.312285 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-45msc_a0a53203-f6d4-43f0-a422-5ae876b369f1/ovn-acl-logging/0.log" Apr 20 16:28:17.313812 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:17.313497 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-45msc_a0a53203-f6d4-43f0-a422-5ae876b369f1/ovn-acl-logging/0.log" Apr 20 16:28:17.318543 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:17.318525 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 16:28:17.422623 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:17.422581 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ce0516-60e6-43c2-9944-e860a52dcce2" path="/var/lib/kubelet/pods/01ce0516-60e6-43c2-9944-e860a52dcce2/volumes" Apr 20 16:28:20.473314 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:20.473276 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x9ldf" event={"ID":"d1a2e82d-6f85-471f-a08a-55a114f41ec6","Type":"ContainerStarted","Data":"61b6f6d05de147391131e2f530b7f8fad44a3c32565c8ff52faad89ac4ee84b1"} Apr 20 16:28:20.488481 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:20.488435 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-x9ldf" podStartSLOduration=1.837816519 podStartE2EDuration="5.488419992s" podCreationTimestamp="2026-04-20 16:28:15 +0000 UTC" firstStartedPulling="2026-04-20 
16:28:16.387454764 +0000 UTC m=+299.507631262" lastFinishedPulling="2026-04-20 16:28:20.038058217 +0000 UTC m=+303.158234735" observedRunningTime="2026-04-20 16:28:20.487845979 +0000 UTC m=+303.608022500" watchObservedRunningTime="2026-04-20 16:28:20.488419992 +0000 UTC m=+303.608596514" Apr 20 16:28:30.298650 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.298610 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt"] Apr 20 16:28:30.302422 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.302403 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" Apr 20 16:28:30.305094 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.305068 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 16:28:30.305246 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.305198 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7rn5j\"" Apr 20 16:28:30.305246 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.305220 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 16:28:30.312461 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.312437 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt"] Apr 20 16:28:30.416147 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.416108 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt\" (UID: \"b30d32d6-32df-4a67-8cb4-7b9db9152ee7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" Apr 20 16:28:30.416358 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.416225 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt\" (UID: \"b30d32d6-32df-4a67-8cb4-7b9db9152ee7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" Apr 20 16:28:30.416358 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.416275 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc95h\" (UniqueName: \"kubernetes.io/projected/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-kube-api-access-xc95h\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt\" (UID: \"b30d32d6-32df-4a67-8cb4-7b9db9152ee7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" Apr 20 16:28:30.517541 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.517503 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt\" (UID: \"b30d32d6-32df-4a67-8cb4-7b9db9152ee7\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" Apr 20 16:28:30.517733 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.517571 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xc95h\" (UniqueName: \"kubernetes.io/projected/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-kube-api-access-xc95h\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt\" (UID: \"b30d32d6-32df-4a67-8cb4-7b9db9152ee7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" Apr 20 16:28:30.517733 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.517604 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt\" (UID: \"b30d32d6-32df-4a67-8cb4-7b9db9152ee7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" Apr 20 16:28:30.517978 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.517952 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt\" (UID: \"b30d32d6-32df-4a67-8cb4-7b9db9152ee7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" Apr 20 16:28:30.518053 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.517966 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt\" (UID: \"b30d32d6-32df-4a67-8cb4-7b9db9152ee7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" Apr 20 16:28:30.527479 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.527451 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc95h\" (UniqueName: \"kubernetes.io/projected/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-kube-api-access-xc95h\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt\" (UID: \"b30d32d6-32df-4a67-8cb4-7b9db9152ee7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" Apr 20 16:28:30.612517 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.612490 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" Apr 20 16:28:30.736258 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.736134 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt"] Apr 20 16:28:30.738890 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:28:30.738860 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb30d32d6_32df_4a67_8cb4_7b9db9152ee7.slice/crio-d3b7a07f2f01b95b76c7cb8f38a25c2da1670c1d9aa74e0900ba2fbbc3397f78 WatchSource:0}: Error finding container d3b7a07f2f01b95b76c7cb8f38a25c2da1670c1d9aa74e0900ba2fbbc3397f78: Status 404 returned error can't find the container with id d3b7a07f2f01b95b76c7cb8f38a25c2da1670c1d9aa74e0900ba2fbbc3397f78 Apr 20 16:28:30.740670 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:30.740652 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 16:28:31.514567 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:31.514524 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" event={"ID":"b30d32d6-32df-4a67-8cb4-7b9db9152ee7","Type":"ContainerStarted","Data":"d3b7a07f2f01b95b76c7cb8f38a25c2da1670c1d9aa74e0900ba2fbbc3397f78"} Apr 20 16:28:37.535303 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:37.535262 2571 generic.go:358] "Generic (PLEG): container finished" podID="b30d32d6-32df-4a67-8cb4-7b9db9152ee7" containerID="db625eeb059dffe9ea3c50d863e796265d10f065f770af4220775e1e9640d9ef" exitCode=0 Apr 20 16:28:37.535689 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:37.535344 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" event={"ID":"b30d32d6-32df-4a67-8cb4-7b9db9152ee7","Type":"ContainerDied","Data":"db625eeb059dffe9ea3c50d863e796265d10f065f770af4220775e1e9640d9ef"} Apr 20 16:28:40.546292 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:40.546255 2571 generic.go:358] "Generic (PLEG): container finished" podID="b30d32d6-32df-4a67-8cb4-7b9db9152ee7" containerID="30ce063340d465c03861167cfc7058dc7e72a78b015eacf4557fc721efbd4761" exitCode=0 Apr 20 16:28:40.546681 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:40.546315 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" event={"ID":"b30d32d6-32df-4a67-8cb4-7b9db9152ee7","Type":"ContainerDied","Data":"30ce063340d465c03861167cfc7058dc7e72a78b015eacf4557fc721efbd4761"} Apr 20 16:28:48.573895 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:48.573804 2571 generic.go:358] "Generic (PLEG): container finished" podID="b30d32d6-32df-4a67-8cb4-7b9db9152ee7" containerID="fcdbeb56896b55479ba6570108fbe8570889bb54a4e9a4461445ca32cad47d24" exitCode=0 Apr 20 16:28:48.573895 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:48.573843 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" event={"ID":"b30d32d6-32df-4a67-8cb4-7b9db9152ee7","Type":"ContainerDied","Data":"fcdbeb56896b55479ba6570108fbe8570889bb54a4e9a4461445ca32cad47d24"} Apr 20 16:28:49.704640 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:49.704616 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" Apr 20 16:28:49.796760 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:49.796726 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc95h\" (UniqueName: \"kubernetes.io/projected/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-kube-api-access-xc95h\") pod \"b30d32d6-32df-4a67-8cb4-7b9db9152ee7\" (UID: \"b30d32d6-32df-4a67-8cb4-7b9db9152ee7\") " Apr 20 16:28:49.796943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:49.796775 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-bundle\") pod \"b30d32d6-32df-4a67-8cb4-7b9db9152ee7\" (UID: \"b30d32d6-32df-4a67-8cb4-7b9db9152ee7\") " Apr 20 16:28:49.797009 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:49.796957 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-util\") pod \"b30d32d6-32df-4a67-8cb4-7b9db9152ee7\" (UID: \"b30d32d6-32df-4a67-8cb4-7b9db9152ee7\") " Apr 20 16:28:49.797465 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:49.797431 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-bundle" (OuterVolumeSpecName: "bundle") pod "b30d32d6-32df-4a67-8cb4-7b9db9152ee7" (UID: "b30d32d6-32df-4a67-8cb4-7b9db9152ee7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:28:49.799127 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:49.799102 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-kube-api-access-xc95h" (OuterVolumeSpecName: "kube-api-access-xc95h") pod "b30d32d6-32df-4a67-8cb4-7b9db9152ee7" (UID: "b30d32d6-32df-4a67-8cb4-7b9db9152ee7"). InnerVolumeSpecName "kube-api-access-xc95h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:28:49.801520 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:49.801487 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-util" (OuterVolumeSpecName: "util") pod "b30d32d6-32df-4a67-8cb4-7b9db9152ee7" (UID: "b30d32d6-32df-4a67-8cb4-7b9db9152ee7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:28:49.897709 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:49.897668 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xc95h\" (UniqueName: \"kubernetes.io/projected/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-kube-api-access-xc95h\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:28:49.897709 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:49.897706 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-bundle\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:28:49.897899 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:49.897720 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b30d32d6-32df-4a67-8cb4-7b9db9152ee7-util\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:28:50.581601 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:50.581568 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" Apr 20 16:28:50.581792 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:50.581567 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sj8gt" event={"ID":"b30d32d6-32df-4a67-8cb4-7b9db9152ee7","Type":"ContainerDied","Data":"d3b7a07f2f01b95b76c7cb8f38a25c2da1670c1d9aa74e0900ba2fbbc3397f78"} Apr 20 16:28:50.581792 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:50.581675 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3b7a07f2f01b95b76c7cb8f38a25c2da1670c1d9aa74e0900ba2fbbc3397f78" Apr 20 16:28:57.756680 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.756647 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf"] Apr 20 16:28:57.757080 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.757011 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b30d32d6-32df-4a67-8cb4-7b9db9152ee7" containerName="util" Apr 20 16:28:57.757080 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.757022 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30d32d6-32df-4a67-8cb4-7b9db9152ee7" containerName="util" Apr 20 16:28:57.757080 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.757033 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b30d32d6-32df-4a67-8cb4-7b9db9152ee7" containerName="extract" Apr 20 16:28:57.757080 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.757041 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30d32d6-32df-4a67-8cb4-7b9db9152ee7" containerName="extract" Apr 20 16:28:57.757080 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.757053 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b30d32d6-32df-4a67-8cb4-7b9db9152ee7" containerName="pull" Apr 20 16:28:57.757080 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.757061 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30d32d6-32df-4a67-8cb4-7b9db9152ee7" containerName="pull" Apr 20 16:28:57.757312 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.757115 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b30d32d6-32df-4a67-8cb4-7b9db9152ee7" containerName="extract" Apr 20 
16:28:57.760358 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.760341 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf" Apr 20 16:28:57.763518 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.763491 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 20 16:28:57.765566 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.765549 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 20 16:28:57.765921 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.765895 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-czkbg\"" Apr 20 16:28:57.795943 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.795915 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf"] Apr 20 16:28:57.859278 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.859250 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dbd35da9-b962-49b2-87f7-6d80e50b90ea-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-54qdf\" (UID: \"dbd35da9-b962-49b2-87f7-6d80e50b90ea\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf" Apr 20 16:28:57.859435 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.859294 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvvxs\" (UniqueName: \"kubernetes.io/projected/dbd35da9-b962-49b2-87f7-6d80e50b90ea-kube-api-access-zvvxs\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-54qdf\" (UID: \"dbd35da9-b962-49b2-87f7-6d80e50b90ea\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf" Apr 20 16:28:57.960692 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.960646 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dbd35da9-b962-49b2-87f7-6d80e50b90ea-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-54qdf\" (UID: \"dbd35da9-b962-49b2-87f7-6d80e50b90ea\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf" Apr 20 16:28:57.960863 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.960709 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvvxs\" (UniqueName: \"kubernetes.io/projected/dbd35da9-b962-49b2-87f7-6d80e50b90ea-kube-api-access-zvvxs\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-54qdf\" (UID: \"dbd35da9-b962-49b2-87f7-6d80e50b90ea\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf" Apr 20 16:28:57.961251 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:57.961230 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dbd35da9-b962-49b2-87f7-6d80e50b90ea-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-54qdf\" (UID: \"dbd35da9-b962-49b2-87f7-6d80e50b90ea\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf" Apr 20 16:28:57.970871 ip-10-0-130-72 
kubenswrapper[2571]: I0420 16:28:57.970837 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvvxs\" (UniqueName: \"kubernetes.io/projected/dbd35da9-b962-49b2-87f7-6d80e50b90ea-kube-api-access-zvvxs\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-54qdf\" (UID: \"dbd35da9-b962-49b2-87f7-6d80e50b90ea\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf" Apr 20 16:28:58.069974 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:58.069887 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf" Apr 20 16:28:58.203873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:58.203845 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf"] Apr 20 16:28:58.206856 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:28:58.206824 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbd35da9_b962_49b2_87f7_6d80e50b90ea.slice/crio-da783a7a5e0ad8bb41b457027c0c6b0b0d853b5ea4351e9ef1f6fe73d75fcdbb WatchSource:0}: Error finding container da783a7a5e0ad8bb41b457027c0c6b0b0d853b5ea4351e9ef1f6fe73d75fcdbb: Status 404 returned error can't find the container with id da783a7a5e0ad8bb41b457027c0c6b0b0d853b5ea4351e9ef1f6fe73d75fcdbb Apr 20 16:28:58.608776 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:28:58.608741 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf" event={"ID":"dbd35da9-b962-49b2-87f7-6d80e50b90ea","Type":"ContainerStarted","Data":"da783a7a5e0ad8bb41b457027c0c6b0b0d853b5ea4351e9ef1f6fe73d75fcdbb"} Apr 20 16:29:00.620968 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:00.620928 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf" event={"ID":"dbd35da9-b962-49b2-87f7-6d80e50b90ea","Type":"ContainerStarted","Data":"4d1f2d1b687237d47900c128937abced0d19eb0030f50fb95a0450ff0321be59"} Apr 20 16:29:00.642481 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:00.642421 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf" podStartSLOduration=1.6322758309999998 podStartE2EDuration="3.642403016s" podCreationTimestamp="2026-04-20 16:28:57 +0000 UTC" firstStartedPulling="2026-04-20 16:28:58.209457002 +0000 UTC m=+341.329633499" lastFinishedPulling="2026-04-20 16:29:00.219584176 +0000 UTC m=+343.339760684" observedRunningTime="2026-04-20 16:29:00.640307227 +0000 UTC m=+343.760483746" watchObservedRunningTime="2026-04-20 16:29:00.642403016 +0000 UTC m=+343.762579538" Apr 20 16:29:01.589921 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:01.589881 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh"] Apr 20 16:29:01.594156 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:01.594131 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" Apr 20 16:29:01.596931 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:01.596907 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7rn5j\"" Apr 20 16:29:01.597193 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:01.597157 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 16:29:01.597866 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:01.597848 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 16:29:01.601559 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:01.601512 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh"] Apr 20 16:29:01.695440 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:01.695399 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73598773-5748-4bf2-81fe-6669a9f8a48b-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh\" (UID: \"73598773-5748-4bf2-81fe-6669a9f8a48b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" Apr 20 16:29:01.695879 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:01.695444 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h9nv\" (UniqueName: \"kubernetes.io/projected/73598773-5748-4bf2-81fe-6669a9f8a48b-kube-api-access-4h9nv\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh\" (UID: \"73598773-5748-4bf2-81fe-6669a9f8a48b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" Apr 20 16:29:01.695879 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:01.695493 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73598773-5748-4bf2-81fe-6669a9f8a48b-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh\" (UID: \"73598773-5748-4bf2-81fe-6669a9f8a48b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" Apr 20 16:29:01.796103 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:01.796060 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73598773-5748-4bf2-81fe-6669a9f8a48b-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh\" (UID: \"73598773-5748-4bf2-81fe-6669a9f8a48b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" Apr 20 16:29:01.796312 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:01.796186 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73598773-5748-4bf2-81fe-6669a9f8a48b-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh\" (UID: \"73598773-5748-4bf2-81fe-6669a9f8a48b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" Apr 20 16:29:01.796312 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:01.796219 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4h9nv\" (UniqueName: \"kubernetes.io/projected/73598773-5748-4bf2-81fe-6669a9f8a48b-kube-api-access-4h9nv\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh\" (UID: \"73598773-5748-4bf2-81fe-6669a9f8a48b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" Apr 20 16:29:01.796559 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:01.796537 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73598773-5748-4bf2-81fe-6669a9f8a48b-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh\" (UID: \"73598773-5748-4bf2-81fe-6669a9f8a48b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" Apr 20 16:29:01.796601 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:01.796547 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73598773-5748-4bf2-81fe-6669a9f8a48b-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh\" (UID: \"73598773-5748-4bf2-81fe-6669a9f8a48b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" Apr 20 16:29:01.814460 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:01.814434 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h9nv\" (UniqueName: \"kubernetes.io/projected/73598773-5748-4bf2-81fe-6669a9f8a48b-kube-api-access-4h9nv\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh\" (UID: \"73598773-5748-4bf2-81fe-6669a9f8a48b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" Apr 20 16:29:01.905951 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:01.905921 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" Apr 20 16:29:02.036494 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:02.036468 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh"] Apr 20 16:29:02.039011 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:29:02.038983 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73598773_5748_4bf2_81fe_6669a9f8a48b.slice/crio-5a84a84126345f8da8523af69d63633257e2dbadbcd58dcb10410302be079a8b WatchSource:0}: Error finding container 5a84a84126345f8da8523af69d63633257e2dbadbcd58dcb10410302be079a8b: Status 404 returned error can't find the container with id 5a84a84126345f8da8523af69d63633257e2dbadbcd58dcb10410302be079a8b Apr 20 16:29:02.630070 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:02.630030 2571 generic.go:358] "Generic (PLEG): container finished" podID="73598773-5748-4bf2-81fe-6669a9f8a48b" containerID="fb3140359c1325ea874f13c9cb6359739fd5da92da44c5b632041aadf2d7008f" exitCode=0 Apr 20 16:29:02.630267 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:02.630143 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" event={"ID":"73598773-5748-4bf2-81fe-6669a9f8a48b","Type":"ContainerDied","Data":"fb3140359c1325ea874f13c9cb6359739fd5da92da44c5b632041aadf2d7008f"} Apr 20 16:29:02.630267 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:02.630186 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" event={"ID":"73598773-5748-4bf2-81fe-6669a9f8a48b","Type":"ContainerStarted","Data":"5a84a84126345f8da8523af69d63633257e2dbadbcd58dcb10410302be079a8b"} Apr 20 16:29:03.798613 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:03.798574 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-qgzqq"] Apr 20 16:29:03.801263 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:03.801237 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" Apr 20 16:29:03.803761 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:03.803733 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 16:29:03.803910 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:03.803889 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 16:29:03.804969 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:03.804945 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-98xnj\"" Apr 20 16:29:03.812232 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:03.812208 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-qgzqq"] Apr 20 16:29:03.915360 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:03.915214 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/395c99df-a1dd-44da-a0aa-fa7e2b9c411e-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-qgzqq\" (UID: \"395c99df-a1dd-44da-a0aa-fa7e2b9c411e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" Apr 20 16:29:03.915360 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:03.915280 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbfj7\" (UniqueName: \"kubernetes.io/projected/395c99df-a1dd-44da-a0aa-fa7e2b9c411e-kube-api-access-cbfj7\") pod \"cert-manager-webhook-597b96b99b-qgzqq\" (UID: \"395c99df-a1dd-44da-a0aa-fa7e2b9c411e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" Apr 20 16:29:04.016637 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:04.016604 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/395c99df-a1dd-44da-a0aa-fa7e2b9c411e-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-qgzqq\" (UID: \"395c99df-a1dd-44da-a0aa-fa7e2b9c411e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" Apr 20 16:29:04.016808 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:04.016651 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbfj7\" (UniqueName: \"kubernetes.io/projected/395c99df-a1dd-44da-a0aa-fa7e2b9c411e-kube-api-access-cbfj7\") pod \"cert-manager-webhook-597b96b99b-qgzqq\" (UID: \"395c99df-a1dd-44da-a0aa-fa7e2b9c411e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" Apr 20 16:29:04.025309 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:04.025273 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/395c99df-a1dd-44da-a0aa-fa7e2b9c411e-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-qgzqq\" (UID: \"395c99df-a1dd-44da-a0aa-fa7e2b9c411e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" Apr 20 16:29:04.025537 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:04.025335 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbfj7\" (UniqueName: \"kubernetes.io/projected/395c99df-a1dd-44da-a0aa-fa7e2b9c411e-kube-api-access-cbfj7\") pod \"cert-manager-webhook-597b96b99b-qgzqq\" (UID: \"395c99df-a1dd-44da-a0aa-fa7e2b9c411e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" Apr 20 
16:29:04.121276 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:04.121235 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" Apr 20 16:29:04.715955 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:04.715921 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-qgzqq"] Apr 20 16:29:04.719197 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:29:04.719133 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod395c99df_a1dd_44da_a0aa_fa7e2b9c411e.slice/crio-01255f051356196dba9236b303fcc7bfe0472114ddd920c37765daecd95a17f2 WatchSource:0}: Error finding container 01255f051356196dba9236b303fcc7bfe0472114ddd920c37765daecd95a17f2: Status 404 returned error can't find the container with id 01255f051356196dba9236b303fcc7bfe0472114ddd920c37765daecd95a17f2 Apr 20 16:29:05.313963 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:05.313873 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-tx5ll"] Apr 20 16:29:05.316332 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:05.316308 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-tx5ll" Apr 20 16:29:05.318929 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:05.318909 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-2qn7s\"" Apr 20 16:29:05.327043 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:05.326914 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-tx5ll\" (UID: \"00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-tx5ll" Apr 20 16:29:05.327142 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:05.327062 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6zds\" (UniqueName: \"kubernetes.io/projected/00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1-kube-api-access-z6zds\") pod \"cert-manager-cainjector-8966b78d4-tx5ll\" (UID: \"00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-tx5ll" Apr 20 16:29:05.329257 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:05.329238 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-tx5ll"] Apr 20 16:29:05.427976 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:05.427939 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-tx5ll\" (UID: \"00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-tx5ll" Apr 20 16:29:05.428195 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:05.428052 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6zds\" (UniqueName: \"kubernetes.io/projected/00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1-kube-api-access-z6zds\") pod \"cert-manager-cainjector-8966b78d4-tx5ll\" (UID: \"00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1\") " 
pod="cert-manager/cert-manager-cainjector-8966b78d4-tx5ll" Apr 20 16:29:05.436211 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:05.436158 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-tx5ll\" (UID: \"00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-tx5ll" Apr 20 16:29:05.436418 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:05.436397 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6zds\" (UniqueName: \"kubernetes.io/projected/00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1-kube-api-access-z6zds\") pod \"cert-manager-cainjector-8966b78d4-tx5ll\" (UID: \"00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-tx5ll" Apr 20 16:29:05.625712 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:05.625683 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-tx5ll" Apr 20 16:29:05.645808 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:05.645134 2571 generic.go:358] "Generic (PLEG): container finished" podID="73598773-5748-4bf2-81fe-6669a9f8a48b" containerID="f8372eb6563e59ea44c70529e97ed001c4b6d2cbb5b67ac6fcd24f4def6a6d6a" exitCode=0 Apr 20 16:29:05.645808 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:05.645277 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" event={"ID":"73598773-5748-4bf2-81fe-6669a9f8a48b","Type":"ContainerDied","Data":"f8372eb6563e59ea44c70529e97ed001c4b6d2cbb5b67ac6fcd24f4def6a6d6a"} Apr 20 16:29:05.646873 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:05.646849 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" event={"ID":"395c99df-a1dd-44da-a0aa-fa7e2b9c411e","Type":"ContainerStarted","Data":"01255f051356196dba9236b303fcc7bfe0472114ddd920c37765daecd95a17f2"} Apr 20 16:29:05.782397 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:05.782364 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-tx5ll"] Apr 20 16:29:05.786263 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:29:05.786228 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00eb42b0_e5c0_4e3e_a2d2_2658145ac2a1.slice/crio-19ac8885be9663861f6d174a9fb98de6c788a50185d5b53edefc5569efe3c95b WatchSource:0}: Error finding container 19ac8885be9663861f6d174a9fb98de6c788a50185d5b53edefc5569efe3c95b: Status 404 returned error can't find the container with id 19ac8885be9663861f6d174a9fb98de6c788a50185d5b53edefc5569efe3c95b Apr 20 16:29:06.653273 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:06.652960 2571 generic.go:358] "Generic (PLEG): container finished" podID="73598773-5748-4bf2-81fe-6669a9f8a48b" containerID="210246b1f5e356a3f6e10798d82cc499225f1c6871b8bffe4b7a94487eb1d32d" exitCode=0 Apr 20 16:29:06.653273 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:06.653052 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" event={"ID":"73598773-5748-4bf2-81fe-6669a9f8a48b","Type":"ContainerDied","Data":"210246b1f5e356a3f6e10798d82cc499225f1c6871b8bffe4b7a94487eb1d32d"} Apr 20 
16:29:06.654822 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:06.654794 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-tx5ll" event={"ID":"00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1","Type":"ContainerStarted","Data":"19ac8885be9663861f6d174a9fb98de6c788a50185d5b53edefc5569efe3c95b"} Apr 20 16:29:07.847447 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:07.847422 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" Apr 20 16:29:07.947890 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:07.947864 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73598773-5748-4bf2-81fe-6669a9f8a48b-util\") pod \"73598773-5748-4bf2-81fe-6669a9f8a48b\" (UID: \"73598773-5748-4bf2-81fe-6669a9f8a48b\") " Apr 20 16:29:07.947990 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:07.947920 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h9nv\" (UniqueName: \"kubernetes.io/projected/73598773-5748-4bf2-81fe-6669a9f8a48b-kube-api-access-4h9nv\") pod \"73598773-5748-4bf2-81fe-6669a9f8a48b\" (UID: \"73598773-5748-4bf2-81fe-6669a9f8a48b\") " Apr 20 16:29:07.947990 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:07.947961 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73598773-5748-4bf2-81fe-6669a9f8a48b-bundle\") pod \"73598773-5748-4bf2-81fe-6669a9f8a48b\" (UID: \"73598773-5748-4bf2-81fe-6669a9f8a48b\") " Apr 20 16:29:07.948420 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:07.948381 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73598773-5748-4bf2-81fe-6669a9f8a48b-bundle" (OuterVolumeSpecName: "bundle") pod "73598773-5748-4bf2-81fe-6669a9f8a48b" (UID: "73598773-5748-4bf2-81fe-6669a9f8a48b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:29:07.950434 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:07.950406 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73598773-5748-4bf2-81fe-6669a9f8a48b-kube-api-access-4h9nv" (OuterVolumeSpecName: "kube-api-access-4h9nv") pod "73598773-5748-4bf2-81fe-6669a9f8a48b" (UID: "73598773-5748-4bf2-81fe-6669a9f8a48b"). InnerVolumeSpecName "kube-api-access-4h9nv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:29:07.955495 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:07.955464 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73598773-5748-4bf2-81fe-6669a9f8a48b-util" (OuterVolumeSpecName: "util") pod "73598773-5748-4bf2-81fe-6669a9f8a48b" (UID: "73598773-5748-4bf2-81fe-6669a9f8a48b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:29:08.049009 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:08.048970 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73598773-5748-4bf2-81fe-6669a9f8a48b-util\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:29:08.049009 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:08.049006 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4h9nv\" (UniqueName: \"kubernetes.io/projected/73598773-5748-4bf2-81fe-6669a9f8a48b-kube-api-access-4h9nv\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:29:08.049240 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:08.049020 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73598773-5748-4bf2-81fe-6669a9f8a48b-bundle\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:29:08.664956 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:08.664931 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" Apr 20 16:29:08.665123 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:08.664929 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnkjxh" event={"ID":"73598773-5748-4bf2-81fe-6669a9f8a48b","Type":"ContainerDied","Data":"5a84a84126345f8da8523af69d63633257e2dbadbcd58dcb10410302be079a8b"} Apr 20 16:29:08.665123 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:08.665044 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a84a84126345f8da8523af69d63633257e2dbadbcd58dcb10410302be079a8b" Apr 20 16:29:08.666424 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:08.666397 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-tx5ll" event={"ID":"00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1","Type":"ContainerStarted","Data":"835186f8996e95fbcddb4bae083c93bd55c1a69109b3c64eca71b31c06491bf6"} Apr 20 16:29:08.667732 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:08.667708 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" event={"ID":"395c99df-a1dd-44da-a0aa-fa7e2b9c411e","Type":"ContainerStarted","Data":"41d680e43e056ce7dc6abcfc0b3a474110bd6b977821f554e87915691cbf110c"} Apr 20 16:29:08.667871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:08.667800 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" Apr 20 16:29:08.681247 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:08.681201 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-tx5ll" podStartSLOduration=1.5955255689999999 podStartE2EDuration="3.681184784s" podCreationTimestamp="2026-04-20 16:29:05 +0000 UTC" firstStartedPulling="2026-04-20 16:29:05.789969277 +0000 UTC m=+348.910145774" lastFinishedPulling="2026-04-20 16:29:07.875628491 +0000 UTC m=+350.995804989" observedRunningTime="2026-04-20 16:29:08.680670789 +0000 UTC m=+351.800847311" watchObservedRunningTime="2026-04-20 16:29:08.681184784 +0000 UTC m=+351.801361300" Apr 20 16:29:08.702746 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:08.702696 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" podStartSLOduration=2.538757646 podStartE2EDuration="5.702680698s" podCreationTimestamp="2026-04-20 16:29:03 +0000 UTC" firstStartedPulling="2026-04-20 16:29:04.721006753 +0000 UTC m=+347.841183251" lastFinishedPulling="2026-04-20 16:29:07.884929806 +0000 UTC m=+351.005106303" observedRunningTime="2026-04-20 16:29:08.701624245 +0000 UTC m=+351.821800770" watchObservedRunningTime="2026-04-20 16:29:08.702680698 +0000 UTC m=+351.822857217" Apr 20 16:29:14.673927 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:14.673898 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" Apr 20 16:29:22.072903 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.072860 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt"] Apr 20 16:29:22.073422 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.073344 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73598773-5748-4bf2-81fe-6669a9f8a48b" containerName="util" Apr 20 16:29:22.073422 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.073360 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="73598773-5748-4bf2-81fe-6669a9f8a48b" containerName="util" Apr 20 16:29:22.073422 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.073372 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73598773-5748-4bf2-81fe-6669a9f8a48b" containerName="pull" Apr 20 16:29:22.073422 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.073380 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="73598773-5748-4bf2-81fe-6669a9f8a48b" containerName="pull" Apr 20 16:29:22.073422 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.073402 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73598773-5748-4bf2-81fe-6669a9f8a48b" containerName="extract" Apr 20 16:29:22.073422 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.073410 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="73598773-5748-4bf2-81fe-6669a9f8a48b" containerName="extract" Apr 20 16:29:22.073745 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.073494 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="73598773-5748-4bf2-81fe-6669a9f8a48b" containerName="extract" Apr 20 16:29:22.079915 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.079882 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" Apr 20 16:29:22.084186 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.084142 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7rn5j\"" Apr 20 16:29:22.084345 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.084316 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 16:29:22.084455 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.084436 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 16:29:22.093763 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.093736 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt"] Apr 20 16:29:22.162824 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.162785 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt\" (UID: \"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" Apr 20 16:29:22.162992 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.162840 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt\" (UID: \"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" Apr 20 16:29:22.162992 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.162915 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8zxx\" (UniqueName: \"kubernetes.io/projected/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-kube-api-access-p8zxx\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt\" (UID: \"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" Apr 20 16:29:22.263416 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.263371 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8zxx\" (UniqueName: \"kubernetes.io/projected/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-kube-api-access-p8zxx\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt\" (UID: \"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" Apr 20 16:29:22.263416 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.263424 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt\" (UID: \"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" Apr 20 16:29:22.263625 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.263460 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt\" (UID: \"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" Apr 20 16:29:22.263836 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.263819 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt\" (UID: \"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" Apr 20 16:29:22.263904 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.263864 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt\" (UID: \"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" Apr 20 16:29:22.272089 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.272064 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8zxx\" (UniqueName: \"kubernetes.io/projected/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-kube-api-access-p8zxx\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt\" (UID: \"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" Apr 20 16:29:22.392461 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.392426 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" Apr 20 16:29:22.516494 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.516464 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt"] Apr 20 16:29:22.517737 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:29:22.517710 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod234bfb1c_b6b4_4c08_858f_07d6e4b4e23e.slice/crio-8eb42ec5704303b1fb1d605503aed3a760b71d0a92acfa5d181a6916ce2c194a WatchSource:0}: Error finding container 8eb42ec5704303b1fb1d605503aed3a760b71d0a92acfa5d181a6916ce2c194a: Status 404 returned error can't find the container with id 8eb42ec5704303b1fb1d605503aed3a760b71d0a92acfa5d181a6916ce2c194a Apr 20 16:29:22.720565 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.720454 2571 generic.go:358] "Generic (PLEG): container finished" podID="234bfb1c-b6b4-4c08-858f-07d6e4b4e23e" containerID="d09c7b8aa4784c9ab65cd2f4f5504e6c01432b13ee415dde448ebbfed7133177" exitCode=0 Apr 20 16:29:22.720565 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.720550 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" event={"ID":"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e","Type":"ContainerDied","Data":"d09c7b8aa4784c9ab65cd2f4f5504e6c01432b13ee415dde448ebbfed7133177"} Apr 20 16:29:22.720773 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:22.720586 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" event={"ID":"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e","Type":"ContainerStarted","Data":"8eb42ec5704303b1fb1d605503aed3a760b71d0a92acfa5d181a6916ce2c194a"} Apr 20 16:29:23.725492 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:23.725456 2571 generic.go:358] "Generic (PLEG): container finished" podID="234bfb1c-b6b4-4c08-858f-07d6e4b4e23e" containerID="0598ab23b4067e7abd1d8adaf2e194c9d05b3748c4b63e11a2f27044bcd9c20a" exitCode=0 Apr 20 16:29:23.725958 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:23.725552 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" event={"ID":"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e","Type":"ContainerDied","Data":"0598ab23b4067e7abd1d8adaf2e194c9d05b3748c4b63e11a2f27044bcd9c20a"} Apr 20 16:29:24.730903 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:24.730864 2571 generic.go:358] "Generic (PLEG): container finished" podID="234bfb1c-b6b4-4c08-858f-07d6e4b4e23e" containerID="09b091963d794e3053936c7b74318033bee5e93e8a7f9316fab17f223953c0b4" exitCode=0 Apr 20 16:29:24.731306 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:24.730915 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" event={"ID":"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e","Type":"ContainerDied","Data":"09b091963d794e3053936c7b74318033bee5e93e8a7f9316fab17f223953c0b4"} Apr 20 16:29:25.853236 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:25.853212 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" Apr 20 16:29:25.894734 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:25.894709 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-util\") pod \"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e\" (UID: \"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e\") " Apr 20 16:29:25.894854 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:25.894763 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-bundle\") pod \"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e\" (UID: \"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e\") " Apr 20 16:29:25.894854 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:25.894840 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8zxx\" (UniqueName: \"kubernetes.io/projected/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-kube-api-access-p8zxx\") pod \"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e\" (UID: \"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e\") " Apr 20 16:29:25.895525 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:25.895490 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-bundle" (OuterVolumeSpecName: "bundle") pod "234bfb1c-b6b4-4c08-858f-07d6e4b4e23e" (UID: "234bfb1c-b6b4-4c08-858f-07d6e4b4e23e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:29:25.897006 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:25.896982 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-kube-api-access-p8zxx" (OuterVolumeSpecName: "kube-api-access-p8zxx") pod "234bfb1c-b6b4-4c08-858f-07d6e4b4e23e" (UID: "234bfb1c-b6b4-4c08-858f-07d6e4b4e23e"). InnerVolumeSpecName "kube-api-access-p8zxx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:29:25.902843 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:25.902817 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-util" (OuterVolumeSpecName: "util") pod "234bfb1c-b6b4-4c08-858f-07d6e4b4e23e" (UID: "234bfb1c-b6b4-4c08-858f-07d6e4b4e23e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:29:25.995658 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:25.995583 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-util\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:29:25.995658 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:25.995610 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-bundle\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:29:25.995658 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:25.995620 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p8zxx\" (UniqueName: \"kubernetes.io/projected/234bfb1c-b6b4-4c08-858f-07d6e4b4e23e-kube-api-access-p8zxx\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:29:26.739291 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:26.739251 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" event={"ID":"234bfb1c-b6b4-4c08-858f-07d6e4b4e23e","Type":"ContainerDied","Data":"8eb42ec5704303b1fb1d605503aed3a760b71d0a92acfa5d181a6916ce2c194a"} Apr 20 16:29:26.739291 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:26.739293 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8eb42ec5704303b1fb1d605503aed3a760b71d0a92acfa5d181a6916ce2c194a" Apr 20 16:29:26.739508 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:26.739313 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xg5jt" Apr 20 16:29:33.088122 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.088083 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k"] Apr 20 16:29:33.088637 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.088621 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="234bfb1c-b6b4-4c08-858f-07d6e4b4e23e" containerName="pull" Apr 20 16:29:33.088707 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.088642 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="234bfb1c-b6b4-4c08-858f-07d6e4b4e23e" containerName="pull" Apr 20 16:29:33.088707 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.088662 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="234bfb1c-b6b4-4c08-858f-07d6e4b4e23e" containerName="util" Apr 20 16:29:33.088707 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.088670 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="234bfb1c-b6b4-4c08-858f-07d6e4b4e23e" containerName="util" Apr 20 16:29:33.088707 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.088679 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="234bfb1c-b6b4-4c08-858f-07d6e4b4e23e" containerName="extract" Apr 20 16:29:33.088707 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.088689 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="234bfb1c-b6b4-4c08-858f-07d6e4b4e23e" containerName="extract" Apr 20 16:29:33.088942 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.088785 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="234bfb1c-b6b4-4c08-858f-07d6e4b4e23e" containerName="extract" 
Apr 20 16:29:33.090854 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.090833 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" Apr 20 16:29:33.093643 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.093617 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 16:29:33.093740 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.093628 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 16:29:33.093740 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.093666 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7rn5j\"" Apr 20 16:29:33.099896 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.099874 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k"] Apr 20 16:29:33.157693 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.157666 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/027de419-7fd4-486c-91b2-0a7208f030a9-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k\" (UID: \"027de419-7fd4-486c-91b2-0a7208f030a9\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" Apr 20 16:29:33.157863 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.157713 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkrkl\" (UniqueName: \"kubernetes.io/projected/027de419-7fd4-486c-91b2-0a7208f030a9-kube-api-access-fkrkl\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k\" (UID: \"027de419-7fd4-486c-91b2-0a7208f030a9\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" Apr 20 16:29:33.157863 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.157737 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/027de419-7fd4-486c-91b2-0a7208f030a9-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k\" (UID: \"027de419-7fd4-486c-91b2-0a7208f030a9\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" Apr 20 16:29:33.259103 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.259068 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/027de419-7fd4-486c-91b2-0a7208f030a9-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k\" (UID: \"027de419-7fd4-486c-91b2-0a7208f030a9\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" Apr 20 16:29:33.259312 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.259135 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkrkl\" (UniqueName: \"kubernetes.io/projected/027de419-7fd4-486c-91b2-0a7208f030a9-kube-api-access-fkrkl\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k\" (UID: \"027de419-7fd4-486c-91b2-0a7208f030a9\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" Apr 20 16:29:33.259312 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.259198 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/027de419-7fd4-486c-91b2-0a7208f030a9-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k\" (UID: \"027de419-7fd4-486c-91b2-0a7208f030a9\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" Apr 20 16:29:33.259500 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.259483 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/027de419-7fd4-486c-91b2-0a7208f030a9-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k\" (UID: \"027de419-7fd4-486c-91b2-0a7208f030a9\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" Apr 20 16:29:33.259586 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.259562 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/027de419-7fd4-486c-91b2-0a7208f030a9-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k\" (UID: \"027de419-7fd4-486c-91b2-0a7208f030a9\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" Apr 20 16:29:33.268002 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.267976 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkrkl\" (UniqueName: \"kubernetes.io/projected/027de419-7fd4-486c-91b2-0a7208f030a9-kube-api-access-fkrkl\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k\" (UID: \"027de419-7fd4-486c-91b2-0a7208f030a9\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" Apr 20 16:29:33.400700 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.400665 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" Apr 20 16:29:33.527276 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.527203 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k"] Apr 20 16:29:33.529870 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:29:33.529838 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod027de419_7fd4_486c_91b2_0a7208f030a9.slice/crio-a57c252eb17e851d1671761dd50b6044f743de717d6a79f857588c2c38f57be7 WatchSource:0}: Error finding container a57c252eb17e851d1671761dd50b6044f743de717d6a79f857588c2c38f57be7: Status 404 returned error can't find the container with id a57c252eb17e851d1671761dd50b6044f743de717d6a79f857588c2c38f57be7 Apr 20 16:29:33.762891 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.762802 2571 generic.go:358] "Generic (PLEG): container finished" podID="027de419-7fd4-486c-91b2-0a7208f030a9" containerID="233080ae10442ea17182c7a0d87968e9486e9eac9b4c87998190db17e5944706" exitCode=0 Apr 20 16:29:33.762891 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.762857 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" event={"ID":"027de419-7fd4-486c-91b2-0a7208f030a9","Type":"ContainerDied","Data":"233080ae10442ea17182c7a0d87968e9486e9eac9b4c87998190db17e5944706"} Apr 20 16:29:33.763075 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:33.762901 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" event={"ID":"027de419-7fd4-486c-91b2-0a7208f030a9","Type":"ContainerStarted","Data":"a57c252eb17e851d1671761dd50b6044f743de717d6a79f857588c2c38f57be7"} Apr 20 16:29:34.767880 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:34.767848 2571 generic.go:358] "Generic (PLEG): container finished" podID="027de419-7fd4-486c-91b2-0a7208f030a9" containerID="5cd310ae0b26bc6d8a6acf829228d1ed9f5200c7b70a6d0f97c757e2d50a190b" exitCode=0 Apr 20 16:29:34.768307 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:34.767935 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" event={"ID":"027de419-7fd4-486c-91b2-0a7208f030a9","Type":"ContainerDied","Data":"5cd310ae0b26bc6d8a6acf829228d1ed9f5200c7b70a6d0f97c757e2d50a190b"} Apr 20 16:29:34.956182 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:34.956136 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8"] Apr 20 16:29:34.958487 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:34.958467 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8" Apr 20 16:29:34.961337 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:34.961316 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 16:29:34.961440 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:34.961414 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 16:29:34.961965 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:34.961951 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-kwhtg\"" Apr 20 16:29:34.962590 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:34.962573 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 16:29:34.962657 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:34.962603 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 16:29:34.974993 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:34.974969 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8"] Apr 20 16:29:35.073334 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:35.073245 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6011a6ba-7e4c-4e6b-be9c-31101383f90d-webhook-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-vdbl8\" (UID: \"6011a6ba-7e4c-4e6b-be9c-31101383f90d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8" Apr 20 16:29:35.073334 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:35.073292 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42f5n\" (UniqueName: \"kubernetes.io/projected/6011a6ba-7e4c-4e6b-be9c-31101383f90d-kube-api-access-42f5n\") pod \"opendatahub-operator-controller-manager-59c64b9875-vdbl8\" (UID: \"6011a6ba-7e4c-4e6b-be9c-31101383f90d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8" Apr 20 16:29:35.073525 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:35.073374 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6011a6ba-7e4c-4e6b-be9c-31101383f90d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-vdbl8\" (UID: \"6011a6ba-7e4c-4e6b-be9c-31101383f90d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8" Apr 20 16:29:35.174029 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:35.173990 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6011a6ba-7e4c-4e6b-be9c-31101383f90d-webhook-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-vdbl8\" (UID: \"6011a6ba-7e4c-4e6b-be9c-31101383f90d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8" Apr 20 16:29:35.174029 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:35.174034 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42f5n\" (UniqueName: 
\"kubernetes.io/projected/6011a6ba-7e4c-4e6b-be9c-31101383f90d-kube-api-access-42f5n\") pod \"opendatahub-operator-controller-manager-59c64b9875-vdbl8\" (UID: \"6011a6ba-7e4c-4e6b-be9c-31101383f90d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8" Apr 20 16:29:35.174260 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:35.174062 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6011a6ba-7e4c-4e6b-be9c-31101383f90d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-vdbl8\" (UID: \"6011a6ba-7e4c-4e6b-be9c-31101383f90d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8" Apr 20 16:29:35.176562 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:35.176532 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6011a6ba-7e4c-4e6b-be9c-31101383f90d-webhook-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-vdbl8\" (UID: \"6011a6ba-7e4c-4e6b-be9c-31101383f90d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8" Apr 20 16:29:35.176674 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:35.176569 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6011a6ba-7e4c-4e6b-be9c-31101383f90d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-vdbl8\" (UID: \"6011a6ba-7e4c-4e6b-be9c-31101383f90d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8" Apr 20 16:29:35.184955 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:35.184930 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42f5n\" (UniqueName: \"kubernetes.io/projected/6011a6ba-7e4c-4e6b-be9c-31101383f90d-kube-api-access-42f5n\") pod \"opendatahub-operator-controller-manager-59c64b9875-vdbl8\" (UID: \"6011a6ba-7e4c-4e6b-be9c-31101383f90d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8" Apr 20 16:29:35.292904 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:35.292871 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8" Apr 20 16:29:35.425010 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:35.424988 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8"] Apr 20 16:29:35.427232 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:29:35.427200 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6011a6ba_7e4c_4e6b_be9c_31101383f90d.slice/crio-a7a54be012f9ab4a9c39a08000ba66477386436e87695b72c2f4cb102e04d4a6 WatchSource:0}: Error finding container a7a54be012f9ab4a9c39a08000ba66477386436e87695b72c2f4cb102e04d4a6: Status 404 returned error can't find the container with id a7a54be012f9ab4a9c39a08000ba66477386436e87695b72c2f4cb102e04d4a6 Apr 20 16:29:35.773808 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:35.773768 2571 generic.go:358] "Generic (PLEG): container finished" podID="027de419-7fd4-486c-91b2-0a7208f030a9" containerID="03f386029780d4dc464cda90c7adf055931ff4deddf1da2337bdea923108f86d" exitCode=0 Apr 20 16:29:35.774278 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:35.773850 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" event={"ID":"027de419-7fd4-486c-91b2-0a7208f030a9","Type":"ContainerDied","Data":"03f386029780d4dc464cda90c7adf055931ff4deddf1da2337bdea923108f86d"} Apr 20 16:29:35.780719 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:35.780686 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8" event={"ID":"6011a6ba-7e4c-4e6b-be9c-31101383f90d","Type":"ContainerStarted","Data":"a7a54be012f9ab4a9c39a08000ba66477386436e87695b72c2f4cb102e04d4a6"} Apr 20 16:29:37.260443 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:37.260416 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" Apr 20 16:29:37.397586 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:37.397551 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkrkl\" (UniqueName: \"kubernetes.io/projected/027de419-7fd4-486c-91b2-0a7208f030a9-kube-api-access-fkrkl\") pod \"027de419-7fd4-486c-91b2-0a7208f030a9\" (UID: \"027de419-7fd4-486c-91b2-0a7208f030a9\") " Apr 20 16:29:37.397741 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:37.397609 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/027de419-7fd4-486c-91b2-0a7208f030a9-util\") pod \"027de419-7fd4-486c-91b2-0a7208f030a9\" (UID: \"027de419-7fd4-486c-91b2-0a7208f030a9\") " Apr 20 16:29:37.397741 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:37.397663 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/027de419-7fd4-486c-91b2-0a7208f030a9-bundle\") pod \"027de419-7fd4-486c-91b2-0a7208f030a9\" (UID: \"027de419-7fd4-486c-91b2-0a7208f030a9\") " Apr 20 16:29:37.398601 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:37.398556 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/027de419-7fd4-486c-91b2-0a7208f030a9-bundle" (OuterVolumeSpecName: "bundle") pod "027de419-7fd4-486c-91b2-0a7208f030a9" (UID: "027de419-7fd4-486c-91b2-0a7208f030a9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:29:37.400128 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:37.400103 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/027de419-7fd4-486c-91b2-0a7208f030a9-kube-api-access-fkrkl" (OuterVolumeSpecName: "kube-api-access-fkrkl") pod "027de419-7fd4-486c-91b2-0a7208f030a9" (UID: "027de419-7fd4-486c-91b2-0a7208f030a9"). InnerVolumeSpecName "kube-api-access-fkrkl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:29:37.404096 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:37.404060 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/027de419-7fd4-486c-91b2-0a7208f030a9-util" (OuterVolumeSpecName: "util") pod "027de419-7fd4-486c-91b2-0a7208f030a9" (UID: "027de419-7fd4-486c-91b2-0a7208f030a9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:29:37.499156 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:37.499067 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fkrkl\" (UniqueName: \"kubernetes.io/projected/027de419-7fd4-486c-91b2-0a7208f030a9-kube-api-access-fkrkl\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:29:37.499156 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:37.499109 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/027de419-7fd4-486c-91b2-0a7208f030a9-util\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:29:37.499156 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:37.499125 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/027de419-7fd4-486c-91b2-0a7208f030a9-bundle\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:29:37.791781 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:37.791690 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" event={"ID":"027de419-7fd4-486c-91b2-0a7208f030a9","Type":"ContainerDied","Data":"a57c252eb17e851d1671761dd50b6044f743de717d6a79f857588c2c38f57be7"} Apr 20 16:29:37.791781 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:37.791734 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a57c252eb17e851d1671761dd50b6044f743de717d6a79f857588c2c38f57be7" Apr 20 16:29:37.791781 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:37.791766 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c98sx2k" Apr 20 16:29:38.796589 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:38.796553 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8" event={"ID":"6011a6ba-7e4c-4e6b-be9c-31101383f90d","Type":"ContainerStarted","Data":"b605944c65f25757362b95b3e98881911ce67f72832687c353607aeee4f60d5c"} Apr 20 16:29:38.797037 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:38.796765 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8" Apr 20 16:29:38.824848 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:38.824787 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8" podStartSLOduration=2.386880199 podStartE2EDuration="4.824770352s" podCreationTimestamp="2026-04-20 16:29:34 +0000 UTC" firstStartedPulling="2026-04-20 16:29:35.428961006 +0000 UTC m=+378.549137504" lastFinishedPulling="2026-04-20 16:29:37.86685116 +0000 UTC m=+380.987027657" observedRunningTime="2026-04-20 16:29:38.824043763 +0000 UTC m=+381.944220283" watchObservedRunningTime="2026-04-20 16:29:38.824770352 +0000 UTC m=+381.944946872" Apr 20 16:29:44.795721 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.795676 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb"] Apr 20 16:29:44.796317 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.796286 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="027de419-7fd4-486c-91b2-0a7208f030a9" containerName="pull" Apr 20 16:29:44.796317 
ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.796309 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="027de419-7fd4-486c-91b2-0a7208f030a9" containerName="pull" Apr 20 16:29:44.796457 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.796390 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="027de419-7fd4-486c-91b2-0a7208f030a9" containerName="util" Apr 20 16:29:44.796457 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.796403 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="027de419-7fd4-486c-91b2-0a7208f030a9" containerName="util" Apr 20 16:29:44.796457 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.796413 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="027de419-7fd4-486c-91b2-0a7208f030a9" containerName="extract" Apr 20 16:29:44.796457 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.796422 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="027de419-7fd4-486c-91b2-0a7208f030a9" containerName="extract" Apr 20 16:29:44.796639 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.796539 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="027de419-7fd4-486c-91b2-0a7208f030a9" containerName="extract" Apr 20 16:29:44.799378 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.799355 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" Apr 20 16:29:44.803622 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.803600 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 16:29:44.803622 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.803610 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 16:29:44.803783 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.803630 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 16:29:44.803783 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.803611 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-qs2rf\"" Apr 20 16:29:44.803783 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.803661 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 16:29:44.803783 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.803733 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 16:29:44.808618 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.808598 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb"] Apr 20 16:29:44.859259 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.859218 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/300ed2fb-cfbe-42fd-907a-bc5cfd2dff10-manager-config\") pod \"lws-controller-manager-d6fdb785c-jbqsb\" (UID: \"300ed2fb-cfbe-42fd-907a-bc5cfd2dff10\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" Apr 20 16:29:44.859415 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.859264 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/300ed2fb-cfbe-42fd-907a-bc5cfd2dff10-cert\") pod \"lws-controller-manager-d6fdb785c-jbqsb\" (UID: \"300ed2fb-cfbe-42fd-907a-bc5cfd2dff10\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" Apr 20 16:29:44.859415 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.859389 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zkdk\" (UniqueName: \"kubernetes.io/projected/300ed2fb-cfbe-42fd-907a-bc5cfd2dff10-kube-api-access-4zkdk\") pod \"lws-controller-manager-d6fdb785c-jbqsb\" (UID: \"300ed2fb-cfbe-42fd-907a-bc5cfd2dff10\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" Apr 20 16:29:44.859493 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.859434 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/300ed2fb-cfbe-42fd-907a-bc5cfd2dff10-metrics-cert\") pod \"lws-controller-manager-d6fdb785c-jbqsb\" (UID: \"300ed2fb-cfbe-42fd-907a-bc5cfd2dff10\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" Apr 20 16:29:44.960274 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.960238 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zkdk\" (UniqueName: \"kubernetes.io/projected/300ed2fb-cfbe-42fd-907a-bc5cfd2dff10-kube-api-access-4zkdk\") pod \"lws-controller-manager-d6fdb785c-jbqsb\" (UID: \"300ed2fb-cfbe-42fd-907a-bc5cfd2dff10\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" Apr 20 16:29:44.960410 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.960284 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/300ed2fb-cfbe-42fd-907a-bc5cfd2dff10-metrics-cert\") pod \"lws-controller-manager-d6fdb785c-jbqsb\" (UID: \"300ed2fb-cfbe-42fd-907a-bc5cfd2dff10\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" Apr 20 16:29:44.960410 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.960319 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/300ed2fb-cfbe-42fd-907a-bc5cfd2dff10-manager-config\") pod \"lws-controller-manager-d6fdb785c-jbqsb\" (UID: \"300ed2fb-cfbe-42fd-907a-bc5cfd2dff10\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" Apr 20 16:29:44.960410 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.960334 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/300ed2fb-cfbe-42fd-907a-bc5cfd2dff10-cert\") pod \"lws-controller-manager-d6fdb785c-jbqsb\" (UID: \"300ed2fb-cfbe-42fd-907a-bc5cfd2dff10\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" Apr 20 16:29:44.960954 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.960924 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/300ed2fb-cfbe-42fd-907a-bc5cfd2dff10-manager-config\") pod \"lws-controller-manager-d6fdb785c-jbqsb\" (UID: \"300ed2fb-cfbe-42fd-907a-bc5cfd2dff10\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" Apr 20 16:29:44.962844 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.962823 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/300ed2fb-cfbe-42fd-907a-bc5cfd2dff10-cert\") pod \"lws-controller-manager-d6fdb785c-jbqsb\" (UID: \"300ed2fb-cfbe-42fd-907a-bc5cfd2dff10\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" Apr 20 16:29:44.962944 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.962884 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/300ed2fb-cfbe-42fd-907a-bc5cfd2dff10-metrics-cert\") pod \"lws-controller-manager-d6fdb785c-jbqsb\" (UID: \"300ed2fb-cfbe-42fd-907a-bc5cfd2dff10\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" Apr 20 16:29:44.969365 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:44.969344 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zkdk\" (UniqueName: \"kubernetes.io/projected/300ed2fb-cfbe-42fd-907a-bc5cfd2dff10-kube-api-access-4zkdk\") pod \"lws-controller-manager-d6fdb785c-jbqsb\" (UID: \"300ed2fb-cfbe-42fd-907a-bc5cfd2dff10\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" Apr 20 16:29:45.109697 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:45.109664 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" Apr 20 16:29:45.235814 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:45.235785 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb"] Apr 20 16:29:45.237742 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:29:45.237710 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod300ed2fb_cfbe_42fd_907a_bc5cfd2dff10.slice/crio-77f40566481e53195891cc5a6cb6bc3b8b2850e429c6470cefa45cd6690f4ccc WatchSource:0}: Error finding container 77f40566481e53195891cc5a6cb6bc3b8b2850e429c6470cefa45cd6690f4ccc: Status 404 returned error can't find the container with id 77f40566481e53195891cc5a6cb6bc3b8b2850e429c6470cefa45cd6690f4ccc Apr 20 16:29:45.823852 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:45.823813 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" event={"ID":"300ed2fb-cfbe-42fd-907a-bc5cfd2dff10","Type":"ContainerStarted","Data":"77f40566481e53195891cc5a6cb6bc3b8b2850e429c6470cefa45cd6690f4ccc"} Apr 20 16:29:47.835305 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:47.835262 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" event={"ID":"300ed2fb-cfbe-42fd-907a-bc5cfd2dff10","Type":"ContainerStarted","Data":"affb551548637f11057e8ab18830c2563b9c866c34198e8ff41282a64c28dc87"} Apr 20 16:29:47.835681 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:47.835354 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" Apr 20 16:29:47.853191 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:47.853126 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" podStartSLOduration=2.152184835 podStartE2EDuration="3.853110735s" podCreationTimestamp="2026-04-20 16:29:44 +0000 UTC" firstStartedPulling="2026-04-20 16:29:45.239537634 +0000 UTC m=+388.359714131" lastFinishedPulling="2026-04-20 
16:29:46.940463525 +0000 UTC m=+390.060640031" observedRunningTime="2026-04-20 16:29:47.851349435 +0000 UTC m=+390.971525959" watchObservedRunningTime="2026-04-20 16:29:47.853110735 +0000 UTC m=+390.973287254" Apr 20 16:29:49.803301 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:49.803272 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-vdbl8" Apr 20 16:29:51.820584 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:51.820544 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7"] Apr 20 16:29:51.823279 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:51.823260 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" Apr 20 16:29:51.826073 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:51.826049 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 16:29:51.827437 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:51.827388 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 16:29:51.827437 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:51.827399 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7rn5j\"" Apr 20 16:29:51.834839 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:51.834820 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7"] Apr 20 16:29:51.922382 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:51.922349 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmx4t\" (UniqueName: \"kubernetes.io/projected/4b207d99-ee15-448d-bf75-2e54b400dcfc-kube-api-access-gmx4t\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7\" (UID: \"4b207d99-ee15-448d-bf75-2e54b400dcfc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" Apr 20 16:29:51.922561 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:51.922421 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b207d99-ee15-448d-bf75-2e54b400dcfc-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7\" (UID: \"4b207d99-ee15-448d-bf75-2e54b400dcfc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" Apr 20 16:29:51.922561 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:51.922458 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b207d99-ee15-448d-bf75-2e54b400dcfc-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7\" (UID: \"4b207d99-ee15-448d-bf75-2e54b400dcfc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" Apr 20 16:29:52.023064 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.023030 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmx4t\" (UniqueName: 
\"kubernetes.io/projected/4b207d99-ee15-448d-bf75-2e54b400dcfc-kube-api-access-gmx4t\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7\" (UID: \"4b207d99-ee15-448d-bf75-2e54b400dcfc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" Apr 20 16:29:52.023299 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.023130 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b207d99-ee15-448d-bf75-2e54b400dcfc-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7\" (UID: \"4b207d99-ee15-448d-bf75-2e54b400dcfc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" Apr 20 16:29:52.023299 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.023187 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b207d99-ee15-448d-bf75-2e54b400dcfc-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7\" (UID: \"4b207d99-ee15-448d-bf75-2e54b400dcfc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" Apr 20 16:29:52.023533 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.023510 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b207d99-ee15-448d-bf75-2e54b400dcfc-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7\" (UID: \"4b207d99-ee15-448d-bf75-2e54b400dcfc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" Apr 20 16:29:52.023598 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.023547 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b207d99-ee15-448d-bf75-2e54b400dcfc-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7\" (UID: \"4b207d99-ee15-448d-bf75-2e54b400dcfc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" Apr 20 16:29:52.032801 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.032772 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmx4t\" (UniqueName: \"kubernetes.io/projected/4b207d99-ee15-448d-bf75-2e54b400dcfc-kube-api-access-gmx4t\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7\" (UID: \"4b207d99-ee15-448d-bf75-2e54b400dcfc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" Apr 20 16:29:52.133428 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.133397 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" Apr 20 16:29:52.267315 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.267285 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7"] Apr 20 16:29:52.269743 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:29:52.269717 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b207d99_ee15_448d_bf75_2e54b400dcfc.slice/crio-7e78b8e495a172bf6f31d01528522c9c71c83ef56794d68086a0d60d783b193b WatchSource:0}: Error finding container 7e78b8e495a172bf6f31d01528522c9c71c83ef56794d68086a0d60d783b193b: Status 404 returned error can't find the container with id 7e78b8e495a172bf6f31d01528522c9c71c83ef56794d68086a0d60d783b193b Apr 20 16:29:52.430056 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.429977 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-7b9c9c888c-nlwxb"] Apr 20 16:29:52.432934 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.432908 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-nlwxb" Apr 20 16:29:52.435676 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.435653 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 16:29:52.435676 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.435669 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-8vzjl\"" Apr 20 16:29:52.435904 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.435661 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 16:29:52.442271 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.442250 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7b9c9c888c-nlwxb"] Apr 20 16:29:52.527819 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.527788 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/19c4538a-4617-43a8-ac59-14dda186c360-tmp\") pod \"kube-auth-proxy-7b9c9c888c-nlwxb\" (UID: \"19c4538a-4617-43a8-ac59-14dda186c360\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-nlwxb" Apr 20 16:29:52.527995 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.527843 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/19c4538a-4617-43a8-ac59-14dda186c360-tls-certs\") pod \"kube-auth-proxy-7b9c9c888c-nlwxb\" (UID: \"19c4538a-4617-43a8-ac59-14dda186c360\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-nlwxb" Apr 20 16:29:52.527995 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.527872 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzwkk\" (UniqueName: \"kubernetes.io/projected/19c4538a-4617-43a8-ac59-14dda186c360-kube-api-access-hzwkk\") pod \"kube-auth-proxy-7b9c9c888c-nlwxb\" (UID: \"19c4538a-4617-43a8-ac59-14dda186c360\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-nlwxb" Apr 20 16:29:52.628836 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.628792 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/19c4538a-4617-43a8-ac59-14dda186c360-tmp\") pod \"kube-auth-proxy-7b9c9c888c-nlwxb\" (UID: \"19c4538a-4617-43a8-ac59-14dda186c360\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-nlwxb" Apr 20 16:29:52.629014 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.628861 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/19c4538a-4617-43a8-ac59-14dda186c360-tls-certs\") pod \"kube-auth-proxy-7b9c9c888c-nlwxb\" (UID: \"19c4538a-4617-43a8-ac59-14dda186c360\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-nlwxb" Apr 20 16:29:52.629014 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.628890 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzwkk\" (UniqueName: \"kubernetes.io/projected/19c4538a-4617-43a8-ac59-14dda186c360-kube-api-access-hzwkk\") pod \"kube-auth-proxy-7b9c9c888c-nlwxb\" (UID: \"19c4538a-4617-43a8-ac59-14dda186c360\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-nlwxb" Apr 20 16:29:52.631146 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.631119 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/19c4538a-4617-43a8-ac59-14dda186c360-tmp\") pod \"kube-auth-proxy-7b9c9c888c-nlwxb\" (UID: \"19c4538a-4617-43a8-ac59-14dda186c360\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-nlwxb" Apr 20 16:29:52.631378 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.631360 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/19c4538a-4617-43a8-ac59-14dda186c360-tls-certs\") pod \"kube-auth-proxy-7b9c9c888c-nlwxb\" (UID: \"19c4538a-4617-43a8-ac59-14dda186c360\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-nlwxb" Apr 20 16:29:52.637805 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.637786 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzwkk\" (UniqueName: \"kubernetes.io/projected/19c4538a-4617-43a8-ac59-14dda186c360-kube-api-access-hzwkk\") pod \"kube-auth-proxy-7b9c9c888c-nlwxb\" (UID: \"19c4538a-4617-43a8-ac59-14dda186c360\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-nlwxb" Apr 20 16:29:52.743831 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.743743 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-nlwxb" Apr 20 16:29:52.856285 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.856248 2571 generic.go:358] "Generic (PLEG): container finished" podID="4b207d99-ee15-448d-bf75-2e54b400dcfc" containerID="b6a4f15f4dacca10b0d2b187aeecebe2b5489d5e5c57f8b84cccf288fae36168" exitCode=0 Apr 20 16:29:52.856624 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.856324 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" event={"ID":"4b207d99-ee15-448d-bf75-2e54b400dcfc","Type":"ContainerDied","Data":"b6a4f15f4dacca10b0d2b187aeecebe2b5489d5e5c57f8b84cccf288fae36168"} Apr 20 16:29:52.856624 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.856363 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" event={"ID":"4b207d99-ee15-448d-bf75-2e54b400dcfc","Type":"ContainerStarted","Data":"7e78b8e495a172bf6f31d01528522c9c71c83ef56794d68086a0d60d783b193b"} Apr 20 16:29:52.868378 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:52.868358 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7b9c9c888c-nlwxb"] Apr 20 16:29:52.870622 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:29:52.870591 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19c4538a_4617_43a8_ac59_14dda186c360.slice/crio-ea920a775fac2cc0500ba3c6ccb894f0baddfa63ad80153fc86a9e2ccbc44c01 WatchSource:0}: Error finding container ea920a775fac2cc0500ba3c6ccb894f0baddfa63ad80153fc86a9e2ccbc44c01: Status 404 returned error can't find the container with id ea920a775fac2cc0500ba3c6ccb894f0baddfa63ad80153fc86a9e2ccbc44c01 Apr 20 16:29:53.862907 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:53.862874 2571 generic.go:358] "Generic (PLEG): container finished" podID="4b207d99-ee15-448d-bf75-2e54b400dcfc" containerID="ac58d4b53557bbcb6a02c2bc6bf5835b1f206a98819b1ec8ec5e1d815dc3b2ae" exitCode=0 Apr 20 16:29:53.863339 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:53.862981 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" event={"ID":"4b207d99-ee15-448d-bf75-2e54b400dcfc","Type":"ContainerDied","Data":"ac58d4b53557bbcb6a02c2bc6bf5835b1f206a98819b1ec8ec5e1d815dc3b2ae"} Apr 20 16:29:53.865032 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:53.864982 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-nlwxb" event={"ID":"19c4538a-4617-43a8-ac59-14dda186c360","Type":"ContainerStarted","Data":"ea920a775fac2cc0500ba3c6ccb894f0baddfa63ad80153fc86a9e2ccbc44c01"} Apr 20 16:29:54.870431 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:54.870397 2571 generic.go:358] "Generic (PLEG): container finished" podID="4b207d99-ee15-448d-bf75-2e54b400dcfc" containerID="cb5a31509be00a1e32261b86ec10adfa4e722eb58aef6eeb161aab8fdd908c0e" exitCode=0 Apr 20 16:29:54.870811 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:54.870490 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" event={"ID":"4b207d99-ee15-448d-bf75-2e54b400dcfc","Type":"ContainerDied","Data":"cb5a31509be00a1e32261b86ec10adfa4e722eb58aef6eeb161aab8fdd908c0e"} Apr 20 16:29:56.242364 ip-10-0-130-72 
kubenswrapper[2571]: I0420 16:29:56.242335 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" Apr 20 16:29:56.362719 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:56.362686 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b207d99-ee15-448d-bf75-2e54b400dcfc-bundle\") pod \"4b207d99-ee15-448d-bf75-2e54b400dcfc\" (UID: \"4b207d99-ee15-448d-bf75-2e54b400dcfc\") " Apr 20 16:29:56.362876 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:56.362736 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmx4t\" (UniqueName: \"kubernetes.io/projected/4b207d99-ee15-448d-bf75-2e54b400dcfc-kube-api-access-gmx4t\") pod \"4b207d99-ee15-448d-bf75-2e54b400dcfc\" (UID: \"4b207d99-ee15-448d-bf75-2e54b400dcfc\") " Apr 20 16:29:56.362876 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:56.362753 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b207d99-ee15-448d-bf75-2e54b400dcfc-util\") pod \"4b207d99-ee15-448d-bf75-2e54b400dcfc\" (UID: \"4b207d99-ee15-448d-bf75-2e54b400dcfc\") " Apr 20 16:29:56.363866 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:56.363821 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b207d99-ee15-448d-bf75-2e54b400dcfc-bundle" (OuterVolumeSpecName: "bundle") pod "4b207d99-ee15-448d-bf75-2e54b400dcfc" (UID: "4b207d99-ee15-448d-bf75-2e54b400dcfc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:29:56.365041 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:56.365012 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b207d99-ee15-448d-bf75-2e54b400dcfc-kube-api-access-gmx4t" (OuterVolumeSpecName: "kube-api-access-gmx4t") pod "4b207d99-ee15-448d-bf75-2e54b400dcfc" (UID: "4b207d99-ee15-448d-bf75-2e54b400dcfc"). InnerVolumeSpecName "kube-api-access-gmx4t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:29:56.367441 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:56.367403 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b207d99-ee15-448d-bf75-2e54b400dcfc-util" (OuterVolumeSpecName: "util") pod "4b207d99-ee15-448d-bf75-2e54b400dcfc" (UID: "4b207d99-ee15-448d-bf75-2e54b400dcfc"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:29:56.463692 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:56.463656 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b207d99-ee15-448d-bf75-2e54b400dcfc-bundle\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:29:56.463692 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:56.463687 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gmx4t\" (UniqueName: \"kubernetes.io/projected/4b207d99-ee15-448d-bf75-2e54b400dcfc-kube-api-access-gmx4t\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:29:56.463887 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:56.463700 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b207d99-ee15-448d-bf75-2e54b400dcfc-util\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:29:56.881439 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:56.881404 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" event={"ID":"4b207d99-ee15-448d-bf75-2e54b400dcfc","Type":"ContainerDied","Data":"7e78b8e495a172bf6f31d01528522c9c71c83ef56794d68086a0d60d783b193b"} Apr 20 16:29:56.881439 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:56.881442 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e78b8e495a172bf6f31d01528522c9c71c83ef56794d68086a0d60d783b193b" Apr 20 16:29:56.881439 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:56.881448 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835q9kh7" Apr 20 16:29:56.882840 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:56.882812 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-nlwxb" event={"ID":"19c4538a-4617-43a8-ac59-14dda186c360","Type":"ContainerStarted","Data":"3f2623f20d9be0a10ec32a863772a9ea0e73d228663df9510abb5d697bc13e5a"} Apr 20 16:29:57.282851 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:57.282753 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-nlwxb" podStartSLOduration=1.8962271670000002 podStartE2EDuration="5.282737938s" podCreationTimestamp="2026-04-20 16:29:52 +0000 UTC" firstStartedPulling="2026-04-20 16:29:52.872430415 +0000 UTC m=+395.992606915" lastFinishedPulling="2026-04-20 16:29:56.258941188 +0000 UTC m=+399.379117686" observedRunningTime="2026-04-20 16:29:56.898561368 +0000 UTC m=+400.018737888" watchObservedRunningTime="2026-04-20 16:29:57.282737938 +0000 UTC m=+400.402914457" Apr 20 16:29:58.842003 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:29:58.841973 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-jbqsb" Apr 20 16:30:05.570717 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.570681 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv"] Apr 20 16:30:05.571219 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.571197 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b207d99-ee15-448d-bf75-2e54b400dcfc" containerName="util" Apr 20 16:30:05.571219 ip-10-0-130-72 
kubenswrapper[2571]: I0420 16:30:05.571215 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b207d99-ee15-448d-bf75-2e54b400dcfc" containerName="util" Apr 20 16:30:05.571349 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.571230 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b207d99-ee15-448d-bf75-2e54b400dcfc" containerName="extract" Apr 20 16:30:05.571349 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.571239 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b207d99-ee15-448d-bf75-2e54b400dcfc" containerName="extract" Apr 20 16:30:05.571349 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.571261 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b207d99-ee15-448d-bf75-2e54b400dcfc" containerName="pull" Apr 20 16:30:05.571349 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.571273 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b207d99-ee15-448d-bf75-2e54b400dcfc" containerName="pull" Apr 20 16:30:05.571526 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.571384 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b207d99-ee15-448d-bf75-2e54b400dcfc" containerName="extract" Apr 20 16:30:05.573611 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.573589 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" Apr 20 16:30:05.577400 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.577377 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 16:30:05.577525 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.577478 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7rn5j\"" Apr 20 16:30:05.578824 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.578809 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 16:30:05.592533 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.592505 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv"] Apr 20 16:30:05.745741 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.745703 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52fv7\" (UniqueName: \"kubernetes.io/projected/3917095c-344c-41f6-8a8f-d553dc3a2838-kube-api-access-52fv7\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv\" (UID: \"3917095c-344c-41f6-8a8f-d553dc3a2838\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" Apr 20 16:30:05.745899 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.745758 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3917095c-344c-41f6-8a8f-d553dc3a2838-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv\" (UID: \"3917095c-344c-41f6-8a8f-d553dc3a2838\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" Apr 20 16:30:05.745899 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.745786 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/3917095c-344c-41f6-8a8f-d553dc3a2838-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv\" (UID: \"3917095c-344c-41f6-8a8f-d553dc3a2838\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" Apr 20 16:30:05.847321 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.847213 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52fv7\" (UniqueName: \"kubernetes.io/projected/3917095c-344c-41f6-8a8f-d553dc3a2838-kube-api-access-52fv7\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv\" (UID: \"3917095c-344c-41f6-8a8f-d553dc3a2838\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" Apr 20 16:30:05.847321 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.847297 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3917095c-344c-41f6-8a8f-d553dc3a2838-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv\" (UID: \"3917095c-344c-41f6-8a8f-d553dc3a2838\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" Apr 20 16:30:05.847321 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.847325 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3917095c-344c-41f6-8a8f-d553dc3a2838-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv\" (UID: \"3917095c-344c-41f6-8a8f-d553dc3a2838\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" Apr 20 16:30:05.847672 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.847655 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3917095c-344c-41f6-8a8f-d553dc3a2838-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv\" (UID: \"3917095c-344c-41f6-8a8f-d553dc3a2838\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" Apr 20 16:30:05.847710 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.847688 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3917095c-344c-41f6-8a8f-d553dc3a2838-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv\" (UID: \"3917095c-344c-41f6-8a8f-d553dc3a2838\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" Apr 20 16:30:05.866920 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.866893 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52fv7\" (UniqueName: \"kubernetes.io/projected/3917095c-344c-41f6-8a8f-d553dc3a2838-kube-api-access-52fv7\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv\" (UID: \"3917095c-344c-41f6-8a8f-d553dc3a2838\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" Apr 20 16:30:05.883698 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:05.883673 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" Apr 20 16:30:06.013498 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:06.013467 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv"] Apr 20 16:30:06.015282 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:30:06.015253 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3917095c_344c_41f6_8a8f_d553dc3a2838.slice/crio-2a5d3235dd3df3aa2c1f7c5a6b137bd96b63a8f7d41fb90b55dbe6c5dcfa6090 WatchSource:0}: Error finding container 2a5d3235dd3df3aa2c1f7c5a6b137bd96b63a8f7d41fb90b55dbe6c5dcfa6090: Status 404 returned error can't find the container with id 2a5d3235dd3df3aa2c1f7c5a6b137bd96b63a8f7d41fb90b55dbe6c5dcfa6090 Apr 20 16:30:06.927956 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:06.927919 2571 generic.go:358] "Generic (PLEG): container finished" podID="3917095c-344c-41f6-8a8f-d553dc3a2838" containerID="231bb60aed87b89f740fc4a550547dd90c4d99479b147eb872b3ab1727e20be5" exitCode=0 Apr 20 16:30:06.928366 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:06.928010 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" event={"ID":"3917095c-344c-41f6-8a8f-d553dc3a2838","Type":"ContainerDied","Data":"231bb60aed87b89f740fc4a550547dd90c4d99479b147eb872b3ab1727e20be5"} Apr 20 16:30:06.928366 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:06.928052 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" event={"ID":"3917095c-344c-41f6-8a8f-d553dc3a2838","Type":"ContainerStarted","Data":"2a5d3235dd3df3aa2c1f7c5a6b137bd96b63a8f7d41fb90b55dbe6c5dcfa6090"} Apr 20 16:30:07.933194 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:07.933140 2571 generic.go:358] "Generic (PLEG): container finished" podID="3917095c-344c-41f6-8a8f-d553dc3a2838" containerID="5148f876fc4701b57de3cbb13936d208eed8dd8fee67bd10f366544225195b44" exitCode=0 Apr 20 16:30:07.933589 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:07.933238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" event={"ID":"3917095c-344c-41f6-8a8f-d553dc3a2838","Type":"ContainerDied","Data":"5148f876fc4701b57de3cbb13936d208eed8dd8fee67bd10f366544225195b44"} Apr 20 16:30:08.939223 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:08.939181 2571 generic.go:358] "Generic (PLEG): container finished" podID="3917095c-344c-41f6-8a8f-d553dc3a2838" containerID="5ca86418a8a0cb8fe5ef1a259373fe024cf544cd95ba21ab3f051c46c2e125a8" exitCode=0 Apr 20 16:30:08.939598 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:08.939264 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" event={"ID":"3917095c-344c-41f6-8a8f-d553dc3a2838","Type":"ContainerDied","Data":"5ca86418a8a0cb8fe5ef1a259373fe024cf544cd95ba21ab3f051c46c2e125a8"} Apr 20 16:30:10.070881 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:10.070854 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" Apr 20 16:30:10.188439 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:10.188404 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3917095c-344c-41f6-8a8f-d553dc3a2838-bundle\") pod \"3917095c-344c-41f6-8a8f-d553dc3a2838\" (UID: \"3917095c-344c-41f6-8a8f-d553dc3a2838\") " Apr 20 16:30:10.188594 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:10.188458 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52fv7\" (UniqueName: \"kubernetes.io/projected/3917095c-344c-41f6-8a8f-d553dc3a2838-kube-api-access-52fv7\") pod \"3917095c-344c-41f6-8a8f-d553dc3a2838\" (UID: \"3917095c-344c-41f6-8a8f-d553dc3a2838\") " Apr 20 16:30:10.188594 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:10.188483 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3917095c-344c-41f6-8a8f-d553dc3a2838-util\") pod \"3917095c-344c-41f6-8a8f-d553dc3a2838\" (UID: \"3917095c-344c-41f6-8a8f-d553dc3a2838\") " Apr 20 16:30:10.189378 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:10.189352 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3917095c-344c-41f6-8a8f-d553dc3a2838-bundle" (OuterVolumeSpecName: "bundle") pod "3917095c-344c-41f6-8a8f-d553dc3a2838" (UID: "3917095c-344c-41f6-8a8f-d553dc3a2838"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:30:10.190688 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:10.190662 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3917095c-344c-41f6-8a8f-d553dc3a2838-kube-api-access-52fv7" (OuterVolumeSpecName: "kube-api-access-52fv7") pod "3917095c-344c-41f6-8a8f-d553dc3a2838" (UID: "3917095c-344c-41f6-8a8f-d553dc3a2838"). InnerVolumeSpecName "kube-api-access-52fv7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:30:10.193709 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:10.193656 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3917095c-344c-41f6-8a8f-d553dc3a2838-util" (OuterVolumeSpecName: "util") pod "3917095c-344c-41f6-8a8f-d553dc3a2838" (UID: "3917095c-344c-41f6-8a8f-d553dc3a2838"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:30:10.289698 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:10.289660 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3917095c-344c-41f6-8a8f-d553dc3a2838-bundle\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:30:10.289698 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:10.289692 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-52fv7\" (UniqueName: \"kubernetes.io/projected/3917095c-344c-41f6-8a8f-d553dc3a2838-kube-api-access-52fv7\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:30:10.289698 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:10.289703 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3917095c-344c-41f6-8a8f-d553dc3a2838-util\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:30:10.948124 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:10.948096 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" Apr 20 16:30:10.948390 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:10.948092 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rhsdv" event={"ID":"3917095c-344c-41f6-8a8f-d553dc3a2838","Type":"ContainerDied","Data":"2a5d3235dd3df3aa2c1f7c5a6b137bd96b63a8f7d41fb90b55dbe6c5dcfa6090"} Apr 20 16:30:10.948390 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:30:10.948204 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a5d3235dd3df3aa2c1f7c5a6b137bd96b63a8f7d41fb90b55dbe6c5dcfa6090" Apr 20 16:31:08.100176 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.100138 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl"] Apr 20 16:31:08.100671 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.100547 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3917095c-344c-41f6-8a8f-d553dc3a2838" containerName="pull" Apr 20 16:31:08.100671 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.100566 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="3917095c-344c-41f6-8a8f-d553dc3a2838" containerName="pull" Apr 20 16:31:08.100671 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.100582 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3917095c-344c-41f6-8a8f-d553dc3a2838" containerName="util" Apr 20 16:31:08.100671 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.100592 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="3917095c-344c-41f6-8a8f-d553dc3a2838" containerName="util" Apr 20 16:31:08.100671 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.100608 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3917095c-344c-41f6-8a8f-d553dc3a2838" containerName="extract" Apr 20 16:31:08.100671 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.100617 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="3917095c-344c-41f6-8a8f-d553dc3a2838" containerName="extract" Apr 20 16:31:08.100865 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.100738 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="3917095c-344c-41f6-8a8f-d553dc3a2838" containerName="extract" Apr 20 
16:31:08.102877 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.102860 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" Apr 20 16:31:08.105311 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.105289 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 16:31:08.106366 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.106351 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-fc88d\"" Apr 20 16:31:08.106593 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.106381 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 16:31:08.109926 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.109895 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl"] Apr 20 16:31:08.279256 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.279219 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtttb\" (UniqueName: \"kubernetes.io/projected/05e709f1-04d6-4de7-b38d-80398e1f8255-kube-api-access-qtttb\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl\" (UID: \"05e709f1-04d6-4de7-b38d-80398e1f8255\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" Apr 20 16:31:08.279425 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.279279 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05e709f1-04d6-4de7-b38d-80398e1f8255-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl\" (UID: \"05e709f1-04d6-4de7-b38d-80398e1f8255\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" Apr 20 16:31:08.279425 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.279329 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05e709f1-04d6-4de7-b38d-80398e1f8255-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl\" (UID: \"05e709f1-04d6-4de7-b38d-80398e1f8255\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" Apr 20 16:31:08.380554 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.380514 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05e709f1-04d6-4de7-b38d-80398e1f8255-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl\" (UID: \"05e709f1-04d6-4de7-b38d-80398e1f8255\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" Apr 20 16:31:08.380747 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.380607 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtttb\" (UniqueName: \"kubernetes.io/projected/05e709f1-04d6-4de7-b38d-80398e1f8255-kube-api-access-qtttb\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl\" (UID: \"05e709f1-04d6-4de7-b38d-80398e1f8255\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" Apr 20 16:31:08.380747 
ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.380662 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05e709f1-04d6-4de7-b38d-80398e1f8255-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl\" (UID: \"05e709f1-04d6-4de7-b38d-80398e1f8255\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" Apr 20 16:31:08.380871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.380851 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05e709f1-04d6-4de7-b38d-80398e1f8255-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl\" (UID: \"05e709f1-04d6-4de7-b38d-80398e1f8255\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" Apr 20 16:31:08.381006 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.380986 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05e709f1-04d6-4de7-b38d-80398e1f8255-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl\" (UID: \"05e709f1-04d6-4de7-b38d-80398e1f8255\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" Apr 20 16:31:08.389966 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.389945 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtttb\" (UniqueName: \"kubernetes.io/projected/05e709f1-04d6-4de7-b38d-80398e1f8255-kube-api-access-qtttb\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl\" (UID: \"05e709f1-04d6-4de7-b38d-80398e1f8255\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" Apr 20 16:31:08.413694 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.413667 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" Apr 20 16:31:08.547590 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.547559 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl"] Apr 20 16:31:08.550677 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:31:08.550646 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05e709f1_04d6_4de7_b38d_80398e1f8255.slice/crio-cf8d177e666845f24445329a0fedb84969ab27fb796f95793e08ad5add35c19d WatchSource:0}: Error finding container cf8d177e666845f24445329a0fedb84969ab27fb796f95793e08ad5add35c19d: Status 404 returned error can't find the container with id cf8d177e666845f24445329a0fedb84969ab27fb796f95793e08ad5add35c19d Apr 20 16:31:08.901004 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.900923 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2"] Apr 20 16:31:08.903545 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.903529 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" Apr 20 16:31:08.912544 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:08.912520 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2"] Apr 20 16:31:09.085602 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.085564 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2\" (UID: \"0f9c4347-3ae4-48ba-9142-9aa5ce8df783\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" Apr 20 16:31:09.085807 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.085616 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j29hn\" (UniqueName: \"kubernetes.io/projected/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-kube-api-access-j29hn\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2\" (UID: \"0f9c4347-3ae4-48ba-9142-9aa5ce8df783\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" Apr 20 16:31:09.085807 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.085690 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2\" (UID: \"0f9c4347-3ae4-48ba-9142-9aa5ce8df783\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" Apr 20 16:31:09.157466 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.157367 2571 generic.go:358] "Generic (PLEG): container finished" podID="05e709f1-04d6-4de7-b38d-80398e1f8255" containerID="3c28fa98248e254e9e03c6e0a1de69bcac7e628f70daf855c88423cde959739a" exitCode=0 Apr 20 16:31:09.157858 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.157461 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" event={"ID":"05e709f1-04d6-4de7-b38d-80398e1f8255","Type":"ContainerDied","Data":"3c28fa98248e254e9e03c6e0a1de69bcac7e628f70daf855c88423cde959739a"} Apr 20 16:31:09.157858 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.157507 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" event={"ID":"05e709f1-04d6-4de7-b38d-80398e1f8255","Type":"ContainerStarted","Data":"cf8d177e666845f24445329a0fedb84969ab27fb796f95793e08ad5add35c19d"} Apr 20 16:31:09.186634 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.186601 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2\" (UID: \"0f9c4347-3ae4-48ba-9142-9aa5ce8df783\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" Apr 20 16:31:09.186788 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.186639 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j29hn\" (UniqueName: 
\"kubernetes.io/projected/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-kube-api-access-j29hn\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2\" (UID: \"0f9c4347-3ae4-48ba-9142-9aa5ce8df783\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" Apr 20 16:31:09.186788 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.186767 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2\" (UID: \"0f9c4347-3ae4-48ba-9142-9aa5ce8df783\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" Apr 20 16:31:09.186981 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.186963 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2\" (UID: \"0f9c4347-3ae4-48ba-9142-9aa5ce8df783\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" Apr 20 16:31:09.187101 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.187083 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2\" (UID: \"0f9c4347-3ae4-48ba-9142-9aa5ce8df783\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" Apr 20 16:31:09.195007 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.194975 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j29hn\" (UniqueName: \"kubernetes.io/projected/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-kube-api-access-j29hn\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2\" (UID: \"0f9c4347-3ae4-48ba-9142-9aa5ce8df783\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" Apr 20 16:31:09.213891 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.213870 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" Apr 20 16:31:09.298245 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.298214 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr"] Apr 20 16:31:09.304286 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.304262 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" Apr 20 16:31:09.308726 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.308700 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr"] Apr 20 16:31:09.341522 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.341501 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2"] Apr 20 16:31:09.343441 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:31:09.343417 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f9c4347_3ae4_48ba_9142_9aa5ce8df783.slice/crio-b7a079033cd2c9e400872a297c06034fb3094ac71c3d171796460301cb19b114 WatchSource:0}: Error finding container b7a079033cd2c9e400872a297c06034fb3094ac71c3d171796460301cb19b114: Status 404 returned error can't find the container with id b7a079033cd2c9e400872a297c06034fb3094ac71c3d171796460301cb19b114 Apr 20 16:31:09.488531 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.488492 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/860a5367-7536-4db2-969c-d079797aa3e3-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr\" (UID: \"860a5367-7536-4db2-969c-d079797aa3e3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" Apr 20 16:31:09.488720 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.488536 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/860a5367-7536-4db2-969c-d079797aa3e3-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr\" (UID: \"860a5367-7536-4db2-969c-d079797aa3e3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" Apr 20 16:31:09.488720 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.488566 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkzhd\" (UniqueName: \"kubernetes.io/projected/860a5367-7536-4db2-969c-d079797aa3e3-kube-api-access-rkzhd\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr\" (UID: \"860a5367-7536-4db2-969c-d079797aa3e3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" Apr 20 16:31:09.589439 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.589398 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/860a5367-7536-4db2-969c-d079797aa3e3-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr\" (UID: \"860a5367-7536-4db2-969c-d079797aa3e3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" Apr 20 16:31:09.589439 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.589441 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/860a5367-7536-4db2-969c-d079797aa3e3-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr\" (UID: \"860a5367-7536-4db2-969c-d079797aa3e3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" Apr 20 16:31:09.589700 
ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.589463 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkzhd\" (UniqueName: \"kubernetes.io/projected/860a5367-7536-4db2-969c-d079797aa3e3-kube-api-access-rkzhd\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr\" (UID: \"860a5367-7536-4db2-969c-d079797aa3e3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" Apr 20 16:31:09.589797 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.589777 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/860a5367-7536-4db2-969c-d079797aa3e3-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr\" (UID: \"860a5367-7536-4db2-969c-d079797aa3e3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" Apr 20 16:31:09.589854 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.589809 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/860a5367-7536-4db2-969c-d079797aa3e3-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr\" (UID: \"860a5367-7536-4db2-969c-d079797aa3e3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" Apr 20 16:31:09.598786 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.598755 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkzhd\" (UniqueName: \"kubernetes.io/projected/860a5367-7536-4db2-969c-d079797aa3e3-kube-api-access-rkzhd\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr\" (UID: \"860a5367-7536-4db2-969c-d079797aa3e3\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" Apr 20 16:31:09.616836 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.616815 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" Apr 20 16:31:09.704641 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.704605 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd"] Apr 20 16:31:09.709637 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.709612 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" Apr 20 16:31:09.714430 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.714403 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd"] Apr 20 16:31:09.757274 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.757232 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr"] Apr 20 16:31:09.758627 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:31:09.758602 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod860a5367_7536_4db2_969c_d079797aa3e3.slice/crio-24bbe4277b5ba60da47dd5f1495e943956934443b5a95661c951f8af90cdb7f6 WatchSource:0}: Error finding container 24bbe4277b5ba60da47dd5f1495e943956934443b5a95661c951f8af90cdb7f6: Status 404 returned error can't find the container with id 24bbe4277b5ba60da47dd5f1495e943956934443b5a95661c951f8af90cdb7f6 Apr 20 16:31:09.892690 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.892658 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1be823da-8d73-4baf-8a47-9c0093d9e2bf-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd\" (UID: \"1be823da-8d73-4baf-8a47-9c0093d9e2bf\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" Apr 20 16:31:09.892818 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.892741 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1be823da-8d73-4baf-8a47-9c0093d9e2bf-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd\" (UID: \"1be823da-8d73-4baf-8a47-9c0093d9e2bf\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" Apr 20 16:31:09.892818 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.892766 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbbsj\" (UniqueName: \"kubernetes.io/projected/1be823da-8d73-4baf-8a47-9c0093d9e2bf-kube-api-access-jbbsj\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd\" (UID: \"1be823da-8d73-4baf-8a47-9c0093d9e2bf\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" Apr 20 16:31:09.993731 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.993656 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1be823da-8d73-4baf-8a47-9c0093d9e2bf-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd\" (UID: \"1be823da-8d73-4baf-8a47-9c0093d9e2bf\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" Apr 20 16:31:09.993856 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.993734 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1be823da-8d73-4baf-8a47-9c0093d9e2bf-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd\" (UID: \"1be823da-8d73-4baf-8a47-9c0093d9e2bf\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" Apr 20 16:31:09.993856 
ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.993758 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbbsj\" (UniqueName: \"kubernetes.io/projected/1be823da-8d73-4baf-8a47-9c0093d9e2bf-kube-api-access-jbbsj\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd\" (UID: \"1be823da-8d73-4baf-8a47-9c0093d9e2bf\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" Apr 20 16:31:09.994146 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.994122 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1be823da-8d73-4baf-8a47-9c0093d9e2bf-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd\" (UID: \"1be823da-8d73-4baf-8a47-9c0093d9e2bf\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" Apr 20 16:31:09.994215 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:09.994132 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1be823da-8d73-4baf-8a47-9c0093d9e2bf-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd\" (UID: \"1be823da-8d73-4baf-8a47-9c0093d9e2bf\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" Apr 20 16:31:10.002852 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:10.002824 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbbsj\" (UniqueName: \"kubernetes.io/projected/1be823da-8d73-4baf-8a47-9c0093d9e2bf-kube-api-access-jbbsj\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd\" (UID: \"1be823da-8d73-4baf-8a47-9c0093d9e2bf\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" Apr 20 16:31:10.023199 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:10.023176 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" Apr 20 16:31:10.162521 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:10.162492 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd"] Apr 20 16:31:10.164237 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:10.164212 2571 generic.go:358] "Generic (PLEG): container finished" podID="05e709f1-04d6-4de7-b38d-80398e1f8255" containerID="c08f65c451b1991cd2fe5953d9d53724d9d78cdd9ba15b3e9f0415ce05c5f362" exitCode=0 Apr 20 16:31:10.164352 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:10.164296 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" event={"ID":"05e709f1-04d6-4de7-b38d-80398e1f8255","Type":"ContainerDied","Data":"c08f65c451b1991cd2fe5953d9d53724d9d78cdd9ba15b3e9f0415ce05c5f362"} Apr 20 16:31:10.165670 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:10.165651 2571 generic.go:358] "Generic (PLEG): container finished" podID="860a5367-7536-4db2-969c-d079797aa3e3" containerID="dfbea2296d0cf728b34d63bd8eaf6a165893d67516c38b2581cabd0f169063c7" exitCode=0 Apr 20 16:31:10.165769 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:10.165736 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" event={"ID":"860a5367-7536-4db2-969c-d079797aa3e3","Type":"ContainerDied","Data":"dfbea2296d0cf728b34d63bd8eaf6a165893d67516c38b2581cabd0f169063c7"} Apr 20 16:31:10.165822 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:10.165770 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" event={"ID":"860a5367-7536-4db2-969c-d079797aa3e3","Type":"ContainerStarted","Data":"24bbe4277b5ba60da47dd5f1495e943956934443b5a95661c951f8af90cdb7f6"} Apr 20 16:31:10.167528 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:10.167503 2571 generic.go:358] "Generic (PLEG): container finished" podID="0f9c4347-3ae4-48ba-9142-9aa5ce8df783" containerID="737cf36fa261000e0d3d3496866ff20b3bd68606dc3f24c5bc75cf5ba535024f" exitCode=0 Apr 20 16:31:10.167616 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:10.167558 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" event={"ID":"0f9c4347-3ae4-48ba-9142-9aa5ce8df783","Type":"ContainerDied","Data":"737cf36fa261000e0d3d3496866ff20b3bd68606dc3f24c5bc75cf5ba535024f"} Apr 20 16:31:10.167616 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:10.167580 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" event={"ID":"0f9c4347-3ae4-48ba-9142-9aa5ce8df783","Type":"ContainerStarted","Data":"b7a079033cd2c9e400872a297c06034fb3094ac71c3d171796460301cb19b114"} Apr 20 16:31:10.173098 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:31:10.173076 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1be823da_8d73_4baf_8a47_9c0093d9e2bf.slice/crio-1373902a9896efdbd439285cc4b3747c0a9e386ccd35e719ebba7c600fc9a0b6 WatchSource:0}: Error finding container 1373902a9896efdbd439285cc4b3747c0a9e386ccd35e719ebba7c600fc9a0b6: Status 404 returned error can't find the container with id 
1373902a9896efdbd439285cc4b3747c0a9e386ccd35e719ebba7c600fc9a0b6 Apr 20 16:31:11.172882 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:11.172850 2571 generic.go:358] "Generic (PLEG): container finished" podID="05e709f1-04d6-4de7-b38d-80398e1f8255" containerID="01c19cb16b0b5c0af235dee5544ef9b66bdc6240b27589dbc545f8bce4e4ade3" exitCode=0 Apr 20 16:31:11.173348 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:11.172937 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" event={"ID":"05e709f1-04d6-4de7-b38d-80398e1f8255","Type":"ContainerDied","Data":"01c19cb16b0b5c0af235dee5544ef9b66bdc6240b27589dbc545f8bce4e4ade3"} Apr 20 16:31:11.174802 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:11.174777 2571 generic.go:358] "Generic (PLEG): container finished" podID="860a5367-7536-4db2-969c-d079797aa3e3" containerID="a4db1760a946f158caadf2f93dd646c059aa54aece1078e70eb02fdb8a33f285" exitCode=0 Apr 20 16:31:11.174901 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:11.174864 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" event={"ID":"860a5367-7536-4db2-969c-d079797aa3e3","Type":"ContainerDied","Data":"a4db1760a946f158caadf2f93dd646c059aa54aece1078e70eb02fdb8a33f285"} Apr 20 16:31:11.176383 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:11.176347 2571 generic.go:358] "Generic (PLEG): container finished" podID="1be823da-8d73-4baf-8a47-9c0093d9e2bf" containerID="35cb29c397f42af335ff1ebece23f536d2b020d54be91d13d53b934c44758b35" exitCode=0 Apr 20 16:31:11.176480 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:11.176377 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" event={"ID":"1be823da-8d73-4baf-8a47-9c0093d9e2bf","Type":"ContainerDied","Data":"35cb29c397f42af335ff1ebece23f536d2b020d54be91d13d53b934c44758b35"} Apr 20 16:31:11.176480 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:11.176408 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" event={"ID":"1be823da-8d73-4baf-8a47-9c0093d9e2bf","Type":"ContainerStarted","Data":"1373902a9896efdbd439285cc4b3747c0a9e386ccd35e719ebba7c600fc9a0b6"} Apr 20 16:31:11.178532 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:11.178511 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" event={"ID":"0f9c4347-3ae4-48ba-9142-9aa5ce8df783","Type":"ContainerStarted","Data":"5bd73ae4fffca2b7bfc79ca29f22d081f85209735aebb41f8a7ce7d6c57b2062"} Apr 20 16:31:12.184131 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:12.184092 2571 generic.go:358] "Generic (PLEG): container finished" podID="0f9c4347-3ae4-48ba-9142-9aa5ce8df783" containerID="5bd73ae4fffca2b7bfc79ca29f22d081f85209735aebb41f8a7ce7d6c57b2062" exitCode=0 Apr 20 16:31:12.184574 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:12.184201 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" event={"ID":"0f9c4347-3ae4-48ba-9142-9aa5ce8df783","Type":"ContainerDied","Data":"5bd73ae4fffca2b7bfc79ca29f22d081f85209735aebb41f8a7ce7d6c57b2062"} Apr 20 16:31:12.186277 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:12.186243 2571 generic.go:358] "Generic (PLEG): container finished" 
podID="860a5367-7536-4db2-969c-d079797aa3e3" containerID="afa6813256ac68f0930408c74acfd523839c673660df7d6c94de0b797c76eaa0" exitCode=0 Apr 20 16:31:12.186378 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:12.186301 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" event={"ID":"860a5367-7536-4db2-969c-d079797aa3e3","Type":"ContainerDied","Data":"afa6813256ac68f0930408c74acfd523839c673660df7d6c94de0b797c76eaa0"} Apr 20 16:31:12.187868 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:12.187842 2571 generic.go:358] "Generic (PLEG): container finished" podID="1be823da-8d73-4baf-8a47-9c0093d9e2bf" containerID="90c418e85005555cf6ba33acd1ea721aa224635c50b862b344c536901a7e8792" exitCode=0 Apr 20 16:31:12.187958 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:12.187927 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" event={"ID":"1be823da-8d73-4baf-8a47-9c0093d9e2bf","Type":"ContainerDied","Data":"90c418e85005555cf6ba33acd1ea721aa224635c50b862b344c536901a7e8792"} Apr 20 16:31:12.320862 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:12.320840 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" Apr 20 16:31:12.418525 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:12.418491 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtttb\" (UniqueName: \"kubernetes.io/projected/05e709f1-04d6-4de7-b38d-80398e1f8255-kube-api-access-qtttb\") pod \"05e709f1-04d6-4de7-b38d-80398e1f8255\" (UID: \"05e709f1-04d6-4de7-b38d-80398e1f8255\") " Apr 20 16:31:12.418653 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:12.418542 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05e709f1-04d6-4de7-b38d-80398e1f8255-util\") pod \"05e709f1-04d6-4de7-b38d-80398e1f8255\" (UID: \"05e709f1-04d6-4de7-b38d-80398e1f8255\") " Apr 20 16:31:12.418653 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:12.418605 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05e709f1-04d6-4de7-b38d-80398e1f8255-bundle\") pod \"05e709f1-04d6-4de7-b38d-80398e1f8255\" (UID: \"05e709f1-04d6-4de7-b38d-80398e1f8255\") " Apr 20 16:31:12.419074 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:12.419036 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05e709f1-04d6-4de7-b38d-80398e1f8255-bundle" (OuterVolumeSpecName: "bundle") pod "05e709f1-04d6-4de7-b38d-80398e1f8255" (UID: "05e709f1-04d6-4de7-b38d-80398e1f8255"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:31:12.420887 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:12.420863 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05e709f1-04d6-4de7-b38d-80398e1f8255-kube-api-access-qtttb" (OuterVolumeSpecName: "kube-api-access-qtttb") pod "05e709f1-04d6-4de7-b38d-80398e1f8255" (UID: "05e709f1-04d6-4de7-b38d-80398e1f8255"). InnerVolumeSpecName "kube-api-access-qtttb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:31:12.424584 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:12.424560 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05e709f1-04d6-4de7-b38d-80398e1f8255-util" (OuterVolumeSpecName: "util") pod "05e709f1-04d6-4de7-b38d-80398e1f8255" (UID: "05e709f1-04d6-4de7-b38d-80398e1f8255"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:31:12.519992 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:12.519899 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05e709f1-04d6-4de7-b38d-80398e1f8255-bundle\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:31:12.519992 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:12.519935 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qtttb\" (UniqueName: \"kubernetes.io/projected/05e709f1-04d6-4de7-b38d-80398e1f8255-kube-api-access-qtttb\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:31:12.519992 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:12.519948 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05e709f1-04d6-4de7-b38d-80398e1f8255-util\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:31:13.193665 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:13.193629 2571 generic.go:358] "Generic (PLEG): container finished" podID="0f9c4347-3ae4-48ba-9142-9aa5ce8df783" containerID="a0245da3cefc6d28d4b2f2ab0c8a7aec4a720e9476f595111d0c0f0d02203b57" exitCode=0 Apr 20 16:31:13.194107 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:13.193713 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" event={"ID":"0f9c4347-3ae4-48ba-9142-9aa5ce8df783","Type":"ContainerDied","Data":"a0245da3cefc6d28d4b2f2ab0c8a7aec4a720e9476f595111d0c0f0d02203b57"} Apr 20 16:31:13.195417 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:13.195394 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" Apr 20 16:31:13.195547 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:13.195391 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl" event={"ID":"05e709f1-04d6-4de7-b38d-80398e1f8255","Type":"ContainerDied","Data":"cf8d177e666845f24445329a0fedb84969ab27fb796f95793e08ad5add35c19d"} Apr 20 16:31:13.195547 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:13.195496 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf8d177e666845f24445329a0fedb84969ab27fb796f95793e08ad5add35c19d" Apr 20 16:31:13.197232 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:13.197209 2571 generic.go:358] "Generic (PLEG): container finished" podID="1be823da-8d73-4baf-8a47-9c0093d9e2bf" containerID="dc39065422ba03332fdd5d41759c61d1796bf330734a42b878605d50ac0bbfa2" exitCode=0 Apr 20 16:31:13.197358 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:13.197281 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" event={"ID":"1be823da-8d73-4baf-8a47-9c0093d9e2bf","Type":"ContainerDied","Data":"dc39065422ba03332fdd5d41759c61d1796bf330734a42b878605d50ac0bbfa2"} Apr 20 16:31:13.330481 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:13.330457 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" Apr 20 16:31:13.427067 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:13.427036 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/860a5367-7536-4db2-969c-d079797aa3e3-util\") pod \"860a5367-7536-4db2-969c-d079797aa3e3\" (UID: \"860a5367-7536-4db2-969c-d079797aa3e3\") " Apr 20 16:31:13.427269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:13.427078 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/860a5367-7536-4db2-969c-d079797aa3e3-bundle\") pod \"860a5367-7536-4db2-969c-d079797aa3e3\" (UID: \"860a5367-7536-4db2-969c-d079797aa3e3\") " Apr 20 16:31:13.427269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:13.427152 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkzhd\" (UniqueName: \"kubernetes.io/projected/860a5367-7536-4db2-969c-d079797aa3e3-kube-api-access-rkzhd\") pod \"860a5367-7536-4db2-969c-d079797aa3e3\" (UID: \"860a5367-7536-4db2-969c-d079797aa3e3\") " Apr 20 16:31:13.427705 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:13.427676 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/860a5367-7536-4db2-969c-d079797aa3e3-bundle" (OuterVolumeSpecName: "bundle") pod "860a5367-7536-4db2-969c-d079797aa3e3" (UID: "860a5367-7536-4db2-969c-d079797aa3e3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:31:13.429440 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:13.429414 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860a5367-7536-4db2-969c-d079797aa3e3-kube-api-access-rkzhd" (OuterVolumeSpecName: "kube-api-access-rkzhd") pod "860a5367-7536-4db2-969c-d079797aa3e3" (UID: "860a5367-7536-4db2-969c-d079797aa3e3"). 
InnerVolumeSpecName "kube-api-access-rkzhd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:31:13.432032 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:13.432006 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/860a5367-7536-4db2-969c-d079797aa3e3-util" (OuterVolumeSpecName: "util") pod "860a5367-7536-4db2-969c-d079797aa3e3" (UID: "860a5367-7536-4db2-969c-d079797aa3e3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:31:13.528040 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:13.527943 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rkzhd\" (UniqueName: \"kubernetes.io/projected/860a5367-7536-4db2-969c-d079797aa3e3-kube-api-access-rkzhd\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:31:13.528040 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:13.527978 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/860a5367-7536-4db2-969c-d079797aa3e3-util\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:31:13.528040 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:13.527988 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/860a5367-7536-4db2-969c-d079797aa3e3-bundle\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:31:14.202898 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.202866 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" Apr 20 16:31:14.202898 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.202867 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr" event={"ID":"860a5367-7536-4db2-969c-d079797aa3e3","Type":"ContainerDied","Data":"24bbe4277b5ba60da47dd5f1495e943956934443b5a95661c951f8af90cdb7f6"} Apr 20 16:31:14.203311 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.202907 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24bbe4277b5ba60da47dd5f1495e943956934443b5a95661c951f8af90cdb7f6" Apr 20 16:31:14.367636 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.367615 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" Apr 20 16:31:14.371769 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.371748 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" Apr 20 16:31:14.442506 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.442476 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-bundle\") pod \"0f9c4347-3ae4-48ba-9142-9aa5ce8df783\" (UID: \"0f9c4347-3ae4-48ba-9142-9aa5ce8df783\") " Apr 20 16:31:14.442683 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.442513 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-util\") pod \"0f9c4347-3ae4-48ba-9142-9aa5ce8df783\" (UID: \"0f9c4347-3ae4-48ba-9142-9aa5ce8df783\") " Apr 20 16:31:14.442683 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.442540 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1be823da-8d73-4baf-8a47-9c0093d9e2bf-bundle\") pod \"1be823da-8d73-4baf-8a47-9c0093d9e2bf\" (UID: \"1be823da-8d73-4baf-8a47-9c0093d9e2bf\") " Apr 20 16:31:14.442683 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.442557 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbbsj\" (UniqueName: \"kubernetes.io/projected/1be823da-8d73-4baf-8a47-9c0093d9e2bf-kube-api-access-jbbsj\") pod \"1be823da-8d73-4baf-8a47-9c0093d9e2bf\" (UID: \"1be823da-8d73-4baf-8a47-9c0093d9e2bf\") " Apr 20 16:31:14.442683 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.442612 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j29hn\" (UniqueName: \"kubernetes.io/projected/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-kube-api-access-j29hn\") pod \"0f9c4347-3ae4-48ba-9142-9aa5ce8df783\" (UID: \"0f9c4347-3ae4-48ba-9142-9aa5ce8df783\") " Apr 20 16:31:14.442683 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.442627 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1be823da-8d73-4baf-8a47-9c0093d9e2bf-util\") pod \"1be823da-8d73-4baf-8a47-9c0093d9e2bf\" (UID: \"1be823da-8d73-4baf-8a47-9c0093d9e2bf\") " Apr 20 16:31:14.443109 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.443081 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-bundle" (OuterVolumeSpecName: "bundle") pod "0f9c4347-3ae4-48ba-9142-9aa5ce8df783" (UID: "0f9c4347-3ae4-48ba-9142-9aa5ce8df783"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:31:14.443262 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.443198 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1be823da-8d73-4baf-8a47-9c0093d9e2bf-bundle" (OuterVolumeSpecName: "bundle") pod "1be823da-8d73-4baf-8a47-9c0093d9e2bf" (UID: "1be823da-8d73-4baf-8a47-9c0093d9e2bf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:31:14.445436 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.445414 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-kube-api-access-j29hn" (OuterVolumeSpecName: "kube-api-access-j29hn") pod "0f9c4347-3ae4-48ba-9142-9aa5ce8df783" (UID: "0f9c4347-3ae4-48ba-9142-9aa5ce8df783"). 
InnerVolumeSpecName "kube-api-access-j29hn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:31:14.445436 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.445427 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1be823da-8d73-4baf-8a47-9c0093d9e2bf-kube-api-access-jbbsj" (OuterVolumeSpecName: "kube-api-access-jbbsj") pod "1be823da-8d73-4baf-8a47-9c0093d9e2bf" (UID: "1be823da-8d73-4baf-8a47-9c0093d9e2bf"). InnerVolumeSpecName "kube-api-access-jbbsj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:31:14.451153 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.451125 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1be823da-8d73-4baf-8a47-9c0093d9e2bf-util" (OuterVolumeSpecName: "util") pod "1be823da-8d73-4baf-8a47-9c0093d9e2bf" (UID: "1be823da-8d73-4baf-8a47-9c0093d9e2bf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:31:14.451541 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.451521 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-util" (OuterVolumeSpecName: "util") pod "0f9c4347-3ae4-48ba-9142-9aa5ce8df783" (UID: "0f9c4347-3ae4-48ba-9142-9aa5ce8df783"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:31:14.543613 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.543528 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-util\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:31:14.543613 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.543559 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1be823da-8d73-4baf-8a47-9c0093d9e2bf-bundle\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:31:14.543613 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.543569 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jbbsj\" (UniqueName: \"kubernetes.io/projected/1be823da-8d73-4baf-8a47-9c0093d9e2bf-kube-api-access-jbbsj\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:31:14.543613 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.543580 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j29hn\" (UniqueName: \"kubernetes.io/projected/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-kube-api-access-j29hn\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:31:14.543613 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.543590 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1be823da-8d73-4baf-8a47-9c0093d9e2bf-util\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:31:14.543613 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:14.543599 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f9c4347-3ae4-48ba-9142-9aa5ce8df783-bundle\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:31:15.208351 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:15.208323 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" Apr 20 16:31:15.208761 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:15.208327 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd" event={"ID":"1be823da-8d73-4baf-8a47-9c0093d9e2bf","Type":"ContainerDied","Data":"1373902a9896efdbd439285cc4b3747c0a9e386ccd35e719ebba7c600fc9a0b6"} Apr 20 16:31:15.208761 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:15.208438 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1373902a9896efdbd439285cc4b3747c0a9e386ccd35e719ebba7c600fc9a0b6" Apr 20 16:31:15.210131 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:15.210112 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" Apr 20 16:31:15.210131 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:15.210127 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2" event={"ID":"0f9c4347-3ae4-48ba-9142-9aa5ce8df783","Type":"ContainerDied","Data":"b7a079033cd2c9e400872a297c06034fb3094ac71c3d171796460301cb19b114"} Apr 20 16:31:15.210305 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:15.210149 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7a079033cd2c9e400872a297c06034fb3094ac71c3d171796460301cb19b114" Apr 20 16:31:39.160799 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.160764 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n"] Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161122 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1be823da-8d73-4baf-8a47-9c0093d9e2bf" containerName="pull" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161132 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be823da-8d73-4baf-8a47-9c0093d9e2bf" containerName="pull" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161147 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05e709f1-04d6-4de7-b38d-80398e1f8255" containerName="util" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161153 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e709f1-04d6-4de7-b38d-80398e1f8255" containerName="util" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161176 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05e709f1-04d6-4de7-b38d-80398e1f8255" containerName="pull" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161181 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e709f1-04d6-4de7-b38d-80398e1f8255" containerName="pull" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161187 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f9c4347-3ae4-48ba-9142-9aa5ce8df783" containerName="extract" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161192 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f9c4347-3ae4-48ba-9142-9aa5ce8df783" containerName="extract" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 
16:31:39.161198 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1be823da-8d73-4baf-8a47-9c0093d9e2bf" containerName="extract" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161203 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be823da-8d73-4baf-8a47-9c0093d9e2bf" containerName="extract" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161212 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05e709f1-04d6-4de7-b38d-80398e1f8255" containerName="extract" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161217 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e709f1-04d6-4de7-b38d-80398e1f8255" containerName="extract" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161223 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f9c4347-3ae4-48ba-9142-9aa5ce8df783" containerName="pull" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161228 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f9c4347-3ae4-48ba-9142-9aa5ce8df783" containerName="pull" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161235 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="860a5367-7536-4db2-969c-d079797aa3e3" containerName="util" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161242 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="860a5367-7536-4db2-969c-d079797aa3e3" containerName="util" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161247 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1be823da-8d73-4baf-8a47-9c0093d9e2bf" containerName="util" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161252 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be823da-8d73-4baf-8a47-9c0093d9e2bf" containerName="util" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161258 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="860a5367-7536-4db2-969c-d079797aa3e3" containerName="pull" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161264 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="860a5367-7536-4db2-969c-d079797aa3e3" containerName="pull" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161272 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f9c4347-3ae4-48ba-9142-9aa5ce8df783" containerName="util" Apr 20 16:31:39.161269 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161276 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f9c4347-3ae4-48ba-9142-9aa5ce8df783" containerName="util" Apr 20 16:31:39.161952 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161285 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="860a5367-7536-4db2-969c-d079797aa3e3" containerName="extract" Apr 20 16:31:39.161952 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161290 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="860a5367-7536-4db2-969c-d079797aa3e3" containerName="extract" Apr 20 16:31:39.161952 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161345 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="05e709f1-04d6-4de7-b38d-80398e1f8255" 
containerName="extract" Apr 20 16:31:39.161952 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161353 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f9c4347-3ae4-48ba-9142-9aa5ce8df783" containerName="extract" Apr 20 16:31:39.161952 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161359 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="1be823da-8d73-4baf-8a47-9c0093d9e2bf" containerName="extract" Apr 20 16:31:39.161952 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.161366 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="860a5367-7536-4db2-969c-d079797aa3e3" containerName="extract" Apr 20 16:31:39.164347 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.164331 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n" Apr 20 16:31:39.168587 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.168484 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-fc88d\"" Apr 20 16:31:39.168710 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.168591 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 20 16:31:39.168710 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.168591 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 16:31:39.169815 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.169798 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 16:31:39.169885 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.169866 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 20 16:31:39.177617 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.177595 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n"] Apr 20 16:31:39.272357 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.272319 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhkcp\" (UniqueName: \"kubernetes.io/projected/f1716bcd-859c-4a38-b028-6564913bbe3d-kube-api-access-mhkcp\") pod \"kuadrant-console-plugin-6cb54b5c86-5x88n\" (UID: \"f1716bcd-859c-4a38-b028-6564913bbe3d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n" Apr 20 16:31:39.272558 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.272372 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f1716bcd-859c-4a38-b028-6564913bbe3d-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-5x88n\" (UID: \"f1716bcd-859c-4a38-b028-6564913bbe3d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n" Apr 20 16:31:39.272558 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.272421 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1716bcd-859c-4a38-b028-6564913bbe3d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-5x88n\" (UID: \"f1716bcd-859c-4a38-b028-6564913bbe3d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n" Apr 20 16:31:39.372920 ip-10-0-130-72 
kubenswrapper[2571]: I0420 16:31:39.372872 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1716bcd-859c-4a38-b028-6564913bbe3d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-5x88n\" (UID: \"f1716bcd-859c-4a38-b028-6564913bbe3d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n" Apr 20 16:31:39.373128 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.372962 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhkcp\" (UniqueName: \"kubernetes.io/projected/f1716bcd-859c-4a38-b028-6564913bbe3d-kube-api-access-mhkcp\") pod \"kuadrant-console-plugin-6cb54b5c86-5x88n\" (UID: \"f1716bcd-859c-4a38-b028-6564913bbe3d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n" Apr 20 16:31:39.373128 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.372996 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f1716bcd-859c-4a38-b028-6564913bbe3d-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-5x88n\" (UID: \"f1716bcd-859c-4a38-b028-6564913bbe3d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n" Apr 20 16:31:39.373128 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:31:39.372993 2571 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 20 16:31:39.373128 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:31:39.373094 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1716bcd-859c-4a38-b028-6564913bbe3d-plugin-serving-cert podName:f1716bcd-859c-4a38-b028-6564913bbe3d nodeName:}" failed. No retries permitted until 2026-04-20 16:31:39.873073116 +0000 UTC m=+502.993249617 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/f1716bcd-859c-4a38-b028-6564913bbe3d-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-5x88n" (UID: "f1716bcd-859c-4a38-b028-6564913bbe3d") : secret "plugin-serving-cert" not found Apr 20 16:31:39.373770 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.373747 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f1716bcd-859c-4a38-b028-6564913bbe3d-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-5x88n\" (UID: \"f1716bcd-859c-4a38-b028-6564913bbe3d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n" Apr 20 16:31:39.391154 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.391119 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhkcp\" (UniqueName: \"kubernetes.io/projected/f1716bcd-859c-4a38-b028-6564913bbe3d-kube-api-access-mhkcp\") pod \"kuadrant-console-plugin-6cb54b5c86-5x88n\" (UID: \"f1716bcd-859c-4a38-b028-6564913bbe3d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n" Apr 20 16:31:39.877563 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.877522 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1716bcd-859c-4a38-b028-6564913bbe3d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-5x88n\" (UID: \"f1716bcd-859c-4a38-b028-6564913bbe3d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n" Apr 20 16:31:39.880180 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:39.880142 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1716bcd-859c-4a38-b028-6564913bbe3d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-5x88n\" (UID: \"f1716bcd-859c-4a38-b028-6564913bbe3d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n" Apr 20 16:31:40.074374 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:40.074336 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n" Apr 20 16:31:40.203355 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:40.203319 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n"] Apr 20 16:31:40.205998 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:31:40.205967 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1716bcd_859c_4a38_b028_6564913bbe3d.slice/crio-21f12db34f79a6361eb2c2d9067f7c6eebb7cd270f5373f840b4419b2cd80873 WatchSource:0}: Error finding container 21f12db34f79a6361eb2c2d9067f7c6eebb7cd270f5373f840b4419b2cd80873: Status 404 returned error can't find the container with id 21f12db34f79a6361eb2c2d9067f7c6eebb7cd270f5373f840b4419b2cd80873 Apr 20 16:31:40.304587 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:40.304552 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n" event={"ID":"f1716bcd-859c-4a38-b028-6564913bbe3d","Type":"ContainerStarted","Data":"21f12db34f79a6361eb2c2d9067f7c6eebb7cd270f5373f840b4419b2cd80873"} Apr 20 16:31:44.645784 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:31:44.645743 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dfd8478dd-8ng6t"] Apr 20 16:32:06.429279 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:06.429244 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n" event={"ID":"f1716bcd-859c-4a38-b028-6564913bbe3d","Type":"ContainerStarted","Data":"e11ed1b50895d2ffeae07b3392ad8483fb04d5ca9a3f1a08f1ed9bda59c27f64"} Apr 20 16:32:06.449856 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:06.449799 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-5x88n" podStartSLOduration=1.971306236 podStartE2EDuration="27.44978271s" podCreationTimestamp="2026-04-20 16:31:39 +0000 UTC" firstStartedPulling="2026-04-20 16:31:40.207297262 +0000 UTC m=+503.327473760" lastFinishedPulling="2026-04-20 16:32:05.685773724 +0000 UTC m=+528.805950234" observedRunningTime="2026-04-20 16:32:06.449269199 +0000 UTC m=+529.569445719" watchObservedRunningTime="2026-04-20 16:32:06.44978271 +0000 UTC m=+529.569959233" Apr 20 16:32:09.676229 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:09.676151 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-dfd8478dd-8ng6t" podUID="f8980c74-431a-4289-b5e9-24435034c04a" containerName="console" containerID="cri-o://e5c40353fe80bbe85d23f44b2a6d5c072012aff25796704f7a898184edd4b127" gracePeriod=15 Apr 20 16:32:09.949055 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:09.949033 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dfd8478dd-8ng6t_f8980c74-431a-4289-b5e9-24435034c04a/console/0.log" Apr 20 16:32:09.949190 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:09.949092 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:32:10.066874 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.066836 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-trusted-ca-bundle\") pod \"f8980c74-431a-4289-b5e9-24435034c04a\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " Apr 20 16:32:10.067020 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.066918 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sz4b\" (UniqueName: \"kubernetes.io/projected/f8980c74-431a-4289-b5e9-24435034c04a-kube-api-access-6sz4b\") pod \"f8980c74-431a-4289-b5e9-24435034c04a\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " Apr 20 16:32:10.067020 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.066949 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8980c74-431a-4289-b5e9-24435034c04a-console-oauth-config\") pod \"f8980c74-431a-4289-b5e9-24435034c04a\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " Apr 20 16:32:10.067020 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.066988 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-oauth-serving-cert\") pod \"f8980c74-431a-4289-b5e9-24435034c04a\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " Apr 20 16:32:10.067020 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.067010 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-console-config\") pod \"f8980c74-431a-4289-b5e9-24435034c04a\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " Apr 20 16:32:10.067275 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.067026 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8980c74-431a-4289-b5e9-24435034c04a-console-serving-cert\") pod \"f8980c74-431a-4289-b5e9-24435034c04a\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " Apr 20 16:32:10.067275 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.067060 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-service-ca\") pod \"f8980c74-431a-4289-b5e9-24435034c04a\" (UID: \"f8980c74-431a-4289-b5e9-24435034c04a\") " Apr 20 16:32:10.067503 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.067474 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-console-config" (OuterVolumeSpecName: "console-config") pod "f8980c74-431a-4289-b5e9-24435034c04a" (UID: "f8980c74-431a-4289-b5e9-24435034c04a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:32:10.067503 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.067486 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f8980c74-431a-4289-b5e9-24435034c04a" (UID: "f8980c74-431a-4289-b5e9-24435034c04a"). 
InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:32:10.067503 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.067493 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-service-ca" (OuterVolumeSpecName: "service-ca") pod "f8980c74-431a-4289-b5e9-24435034c04a" (UID: "f8980c74-431a-4289-b5e9-24435034c04a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:32:10.067683 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.067513 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f8980c74-431a-4289-b5e9-24435034c04a" (UID: "f8980c74-431a-4289-b5e9-24435034c04a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:32:10.069537 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.069506 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8980c74-431a-4289-b5e9-24435034c04a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f8980c74-431a-4289-b5e9-24435034c04a" (UID: "f8980c74-431a-4289-b5e9-24435034c04a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 16:32:10.069645 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.069533 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8980c74-431a-4289-b5e9-24435034c04a-kube-api-access-6sz4b" (OuterVolumeSpecName: "kube-api-access-6sz4b") pod "f8980c74-431a-4289-b5e9-24435034c04a" (UID: "f8980c74-431a-4289-b5e9-24435034c04a"). InnerVolumeSpecName "kube-api-access-6sz4b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:32:10.069645 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.069593 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8980c74-431a-4289-b5e9-24435034c04a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f8980c74-431a-4289-b5e9-24435034c04a" (UID: "f8980c74-431a-4289-b5e9-24435034c04a"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 16:32:10.168091 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.168052 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6sz4b\" (UniqueName: \"kubernetes.io/projected/f8980c74-431a-4289-b5e9-24435034c04a-kube-api-access-6sz4b\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:32:10.168091 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.168084 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8980c74-431a-4289-b5e9-24435034c04a-console-oauth-config\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:32:10.168091 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.168095 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-oauth-serving-cert\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:32:10.168337 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.168107 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-console-config\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:32:10.168337 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.168116 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8980c74-431a-4289-b5e9-24435034c04a-console-serving-cert\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:32:10.168337 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.168127 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-service-ca\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:32:10.168337 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.168136 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8980c74-431a-4289-b5e9-24435034c04a-trusted-ca-bundle\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:32:10.445961 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.445928 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dfd8478dd-8ng6t_f8980c74-431a-4289-b5e9-24435034c04a/console/0.log" Apr 20 16:32:10.446140 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.445975 2571 generic.go:358] "Generic (PLEG): container finished" podID="f8980c74-431a-4289-b5e9-24435034c04a" containerID="e5c40353fe80bbe85d23f44b2a6d5c072012aff25796704f7a898184edd4b127" exitCode=2 Apr 20 16:32:10.446140 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.446113 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-dfd8478dd-8ng6t" Apr 20 16:32:10.446140 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.446113 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dfd8478dd-8ng6t" event={"ID":"f8980c74-431a-4289-b5e9-24435034c04a","Type":"ContainerDied","Data":"e5c40353fe80bbe85d23f44b2a6d5c072012aff25796704f7a898184edd4b127"} Apr 20 16:32:10.446301 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.446151 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dfd8478dd-8ng6t" event={"ID":"f8980c74-431a-4289-b5e9-24435034c04a","Type":"ContainerDied","Data":"df76a8dcf474da98b376fabb7931e0bb0faccdff6389a28d908ae862e9ddfa8f"} Apr 20 16:32:10.446301 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.446190 2571 scope.go:117] "RemoveContainer" containerID="e5c40353fe80bbe85d23f44b2a6d5c072012aff25796704f7a898184edd4b127" Apr 20 16:32:10.455689 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.455661 2571 scope.go:117] "RemoveContainer" containerID="e5c40353fe80bbe85d23f44b2a6d5c072012aff25796704f7a898184edd4b127" Apr 20 16:32:10.456015 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:32:10.455994 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5c40353fe80bbe85d23f44b2a6d5c072012aff25796704f7a898184edd4b127\": container with ID starting with e5c40353fe80bbe85d23f44b2a6d5c072012aff25796704f7a898184edd4b127 not found: ID does not exist" containerID="e5c40353fe80bbe85d23f44b2a6d5c072012aff25796704f7a898184edd4b127" Apr 20 16:32:10.456105 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.456022 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c40353fe80bbe85d23f44b2a6d5c072012aff25796704f7a898184edd4b127"} err="failed to get container status \"e5c40353fe80bbe85d23f44b2a6d5c072012aff25796704f7a898184edd4b127\": rpc error: code = NotFound desc = could not find container \"e5c40353fe80bbe85d23f44b2a6d5c072012aff25796704f7a898184edd4b127\": container with ID starting with e5c40353fe80bbe85d23f44b2a6d5c072012aff25796704f7a898184edd4b127 not found: ID does not exist" Apr 20 16:32:10.478074 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.478032 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dfd8478dd-8ng6t"] Apr 20 16:32:10.480970 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:10.480943 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-dfd8478dd-8ng6t"] Apr 20 16:32:11.422980 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:11.422936 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8980c74-431a-4289-b5e9-24435034c04a" path="/var/lib/kubelet/pods/f8980c74-431a-4289-b5e9-24435034c04a/volumes" Apr 20 16:32:37.014808 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:37.014774 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-xqrl5"] Apr 20 16:32:37.015270 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:37.015136 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8980c74-431a-4289-b5e9-24435034c04a" containerName="console" Apr 20 16:32:37.015270 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:37.015147 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8980c74-431a-4289-b5e9-24435034c04a" containerName="console" Apr 20 16:32:37.015270 ip-10-0-130-72 kubenswrapper[2571]: I0420 
16:32:37.015221 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8980c74-431a-4289-b5e9-24435034c04a" containerName="console" Apr 20 16:32:37.057388 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:37.057349 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-xqrl5"] Apr 20 16:32:37.057388 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:37.057389 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-xqrl5"] Apr 20 16:32:37.057598 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:37.057461 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-xqrl5" Apr 20 16:32:37.060153 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:37.060128 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 16:32:37.107733 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:37.107699 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e9b54d27-da6a-4b7c-be5e-1ccc0a85d3b0-config-file\") pod \"limitador-limitador-78c99df468-xqrl5\" (UID: \"e9b54d27-da6a-4b7c-be5e-1ccc0a85d3b0\") " pod="kuadrant-system/limitador-limitador-78c99df468-xqrl5" Apr 20 16:32:37.107931 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:37.107810 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56qkd\" (UniqueName: \"kubernetes.io/projected/e9b54d27-da6a-4b7c-be5e-1ccc0a85d3b0-kube-api-access-56qkd\") pod \"limitador-limitador-78c99df468-xqrl5\" (UID: \"e9b54d27-da6a-4b7c-be5e-1ccc0a85d3b0\") " pod="kuadrant-system/limitador-limitador-78c99df468-xqrl5" Apr 20 16:32:37.208999 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:37.208964 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56qkd\" (UniqueName: \"kubernetes.io/projected/e9b54d27-da6a-4b7c-be5e-1ccc0a85d3b0-kube-api-access-56qkd\") pod \"limitador-limitador-78c99df468-xqrl5\" (UID: \"e9b54d27-da6a-4b7c-be5e-1ccc0a85d3b0\") " pod="kuadrant-system/limitador-limitador-78c99df468-xqrl5" Apr 20 16:32:37.209220 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:37.209057 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e9b54d27-da6a-4b7c-be5e-1ccc0a85d3b0-config-file\") pod \"limitador-limitador-78c99df468-xqrl5\" (UID: \"e9b54d27-da6a-4b7c-be5e-1ccc0a85d3b0\") " pod="kuadrant-system/limitador-limitador-78c99df468-xqrl5" Apr 20 16:32:37.209659 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:37.209640 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e9b54d27-da6a-4b7c-be5e-1ccc0a85d3b0-config-file\") pod \"limitador-limitador-78c99df468-xqrl5\" (UID: \"e9b54d27-da6a-4b7c-be5e-1ccc0a85d3b0\") " pod="kuadrant-system/limitador-limitador-78c99df468-xqrl5" Apr 20 16:32:37.217927 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:37.217895 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56qkd\" (UniqueName: \"kubernetes.io/projected/e9b54d27-da6a-4b7c-be5e-1ccc0a85d3b0-kube-api-access-56qkd\") pod \"limitador-limitador-78c99df468-xqrl5\" (UID: \"e9b54d27-da6a-4b7c-be5e-1ccc0a85d3b0\") " 
pod="kuadrant-system/limitador-limitador-78c99df468-xqrl5" Apr 20 16:32:37.368067 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:37.368030 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-xqrl5" Apr 20 16:32:37.505436 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:37.505403 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-xqrl5"] Apr 20 16:32:37.508971 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:32:37.508942 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9b54d27_da6a_4b7c_be5e_1ccc0a85d3b0.slice/crio-c95984550a4a1b027e3a6f7893624de4e775230d9b09e05c3b11b8d445573bf7 WatchSource:0}: Error finding container c95984550a4a1b027e3a6f7893624de4e775230d9b09e05c3b11b8d445573bf7: Status 404 returned error can't find the container with id c95984550a4a1b027e3a6f7893624de4e775230d9b09e05c3b11b8d445573bf7 Apr 20 16:32:37.551639 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:37.551608 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-xqrl5" event={"ID":"e9b54d27-da6a-4b7c-be5e-1ccc0a85d3b0","Type":"ContainerStarted","Data":"c95984550a4a1b027e3a6f7893624de4e775230d9b09e05c3b11b8d445573bf7"} Apr 20 16:32:40.567215 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:40.567088 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-xqrl5" event={"ID":"e9b54d27-da6a-4b7c-be5e-1ccc0a85d3b0","Type":"ContainerStarted","Data":"5b76b3f100eed7c0e20978345726d24646cc16d9545103d0d3e72962a3b9fff7"} Apr 20 16:32:40.567643 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:40.567224 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-xqrl5" Apr 20 16:32:40.586840 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:40.586781 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-xqrl5" podStartSLOduration=1.8501163919999999 podStartE2EDuration="4.586767579s" podCreationTimestamp="2026-04-20 16:32:36 +0000 UTC" firstStartedPulling="2026-04-20 16:32:37.510776772 +0000 UTC m=+560.630953273" lastFinishedPulling="2026-04-20 16:32:40.247427961 +0000 UTC m=+563.367604460" observedRunningTime="2026-04-20 16:32:40.584474477 +0000 UTC m=+563.704650998" watchObservedRunningTime="2026-04-20 16:32:40.586767579 +0000 UTC m=+563.706944136" Apr 20 16:32:51.573490 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:32:51.573454 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-xqrl5" Apr 20 16:33:10.620716 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:10.620678 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8"] Apr 20 16:33:10.629217 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:10.629188 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" Apr 20 16:33:10.631406 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:10.631382 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8"] Apr 20 16:33:10.632089 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:10.632059 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 16:33:10.632238 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:10.632216 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 16:33:10.633312 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:10.633293 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7rn5j\"" Apr 20 16:33:10.691082 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:10.691054 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5620178-1b26-4499-b152-cda62e6136bb-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8\" (UID: \"d5620178-1b26-4499-b152-cda62e6136bb\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" Apr 20 16:33:10.691251 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:10.691088 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5pkf\" (UniqueName: \"kubernetes.io/projected/d5620178-1b26-4499-b152-cda62e6136bb-kube-api-access-b5pkf\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8\" (UID: \"d5620178-1b26-4499-b152-cda62e6136bb\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" Apr 20 16:33:10.691251 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:10.691156 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5620178-1b26-4499-b152-cda62e6136bb-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8\" (UID: \"d5620178-1b26-4499-b152-cda62e6136bb\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" Apr 20 16:33:10.792413 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:10.792384 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5620178-1b26-4499-b152-cda62e6136bb-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8\" (UID: \"d5620178-1b26-4499-b152-cda62e6136bb\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" Apr 20 16:33:10.792592 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:10.792452 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5620178-1b26-4499-b152-cda62e6136bb-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8\" (UID: \"d5620178-1b26-4499-b152-cda62e6136bb\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" Apr 20 16:33:10.792592 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:10.792481 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b5pkf\" (UniqueName: \"kubernetes.io/projected/d5620178-1b26-4499-b152-cda62e6136bb-kube-api-access-b5pkf\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8\" (UID: \"d5620178-1b26-4499-b152-cda62e6136bb\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" Apr 20 16:33:10.792791 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:10.792769 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5620178-1b26-4499-b152-cda62e6136bb-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8\" (UID: \"d5620178-1b26-4499-b152-cda62e6136bb\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" Apr 20 16:33:10.792855 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:10.792835 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5620178-1b26-4499-b152-cda62e6136bb-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8\" (UID: \"d5620178-1b26-4499-b152-cda62e6136bb\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" Apr 20 16:33:10.801848 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:10.801825 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5pkf\" (UniqueName: \"kubernetes.io/projected/d5620178-1b26-4499-b152-cda62e6136bb-kube-api-access-b5pkf\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8\" (UID: \"d5620178-1b26-4499-b152-cda62e6136bb\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" Apr 20 16:33:10.939363 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:10.939275 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" Apr 20 16:33:11.072724 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:11.072695 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8"] Apr 20 16:33:11.074778 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:33:11.074750 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5620178_1b26_4499_b152_cda62e6136bb.slice/crio-119adee21be624d7afb51c0008e6a4e2c86ac169cf389a88f71d9a96c7d6c28f WatchSource:0}: Error finding container 119adee21be624d7afb51c0008e6a4e2c86ac169cf389a88f71d9a96c7d6c28f: Status 404 returned error can't find the container with id 119adee21be624d7afb51c0008e6a4e2c86ac169cf389a88f71d9a96c7d6c28f Apr 20 16:33:11.685957 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:11.685920 2571 generic.go:358] "Generic (PLEG): container finished" podID="d5620178-1b26-4499-b152-cda62e6136bb" containerID="c004cdac51d80d8a8f047b8c10c092b4cdd7bc98059ce1423bd8d441bafd08fc" exitCode=0 Apr 20 16:33:11.686381 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:11.686008 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" event={"ID":"d5620178-1b26-4499-b152-cda62e6136bb","Type":"ContainerDied","Data":"c004cdac51d80d8a8f047b8c10c092b4cdd7bc98059ce1423bd8d441bafd08fc"} Apr 20 16:33:11.686381 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:11.686043 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" event={"ID":"d5620178-1b26-4499-b152-cda62e6136bb","Type":"ContainerStarted","Data":"119adee21be624d7afb51c0008e6a4e2c86ac169cf389a88f71d9a96c7d6c28f"} Apr 20 16:33:12.691389 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:12.691346 2571 generic.go:358] "Generic (PLEG): container finished" podID="d5620178-1b26-4499-b152-cda62e6136bb" containerID="d48e425eaa9d008978e818712a46217c6b454b8b72ae2ed4a9b6e2e402152513" exitCode=0 Apr 20 16:33:12.691859 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:12.691405 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" event={"ID":"d5620178-1b26-4499-b152-cda62e6136bb","Type":"ContainerDied","Data":"d48e425eaa9d008978e818712a46217c6b454b8b72ae2ed4a9b6e2e402152513"} Apr 20 16:33:13.696741 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:13.696702 2571 generic.go:358] "Generic (PLEG): container finished" podID="d5620178-1b26-4499-b152-cda62e6136bb" containerID="3c875313df7d314833babb6018bc496ca4c47d780d1b149f2119030db2ada143" exitCode=0 Apr 20 16:33:13.697136 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:13.696779 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" event={"ID":"d5620178-1b26-4499-b152-cda62e6136bb","Type":"ContainerDied","Data":"3c875313df7d314833babb6018bc496ca4c47d780d1b149f2119030db2ada143"} Apr 20 16:33:14.835499 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:14.835479 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" Apr 20 16:33:14.927343 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:14.927305 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5pkf\" (UniqueName: \"kubernetes.io/projected/d5620178-1b26-4499-b152-cda62e6136bb-kube-api-access-b5pkf\") pod \"d5620178-1b26-4499-b152-cda62e6136bb\" (UID: \"d5620178-1b26-4499-b152-cda62e6136bb\") " Apr 20 16:33:14.927515 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:14.927371 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5620178-1b26-4499-b152-cda62e6136bb-util\") pod \"d5620178-1b26-4499-b152-cda62e6136bb\" (UID: \"d5620178-1b26-4499-b152-cda62e6136bb\") " Apr 20 16:33:14.927515 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:14.927400 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5620178-1b26-4499-b152-cda62e6136bb-bundle\") pod \"d5620178-1b26-4499-b152-cda62e6136bb\" (UID: \"d5620178-1b26-4499-b152-cda62e6136bb\") " Apr 20 16:33:14.927919 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:14.927882 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5620178-1b26-4499-b152-cda62e6136bb-bundle" (OuterVolumeSpecName: "bundle") pod "d5620178-1b26-4499-b152-cda62e6136bb" (UID: "d5620178-1b26-4499-b152-cda62e6136bb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:33:14.929586 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:14.929561 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5620178-1b26-4499-b152-cda62e6136bb-kube-api-access-b5pkf" (OuterVolumeSpecName: "kube-api-access-b5pkf") pod "d5620178-1b26-4499-b152-cda62e6136bb" (UID: "d5620178-1b26-4499-b152-cda62e6136bb"). InnerVolumeSpecName "kube-api-access-b5pkf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:33:14.932804 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:14.932765 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5620178-1b26-4499-b152-cda62e6136bb-util" (OuterVolumeSpecName: "util") pod "d5620178-1b26-4499-b152-cda62e6136bb" (UID: "d5620178-1b26-4499-b152-cda62e6136bb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:33:15.029066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:15.028988 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b5pkf\" (UniqueName: \"kubernetes.io/projected/d5620178-1b26-4499-b152-cda62e6136bb-kube-api-access-b5pkf\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:33:15.029066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:15.029018 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5620178-1b26-4499-b152-cda62e6136bb-util\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:33:15.029066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:15.029029 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5620178-1b26-4499-b152-cda62e6136bb-bundle\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:33:15.706728 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:15.706637 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" event={"ID":"d5620178-1b26-4499-b152-cda62e6136bb","Type":"ContainerDied","Data":"119adee21be624d7afb51c0008e6a4e2c86ac169cf389a88f71d9a96c7d6c28f"} Apr 20 16:33:15.706728 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:15.706677 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="119adee21be624d7afb51c0008e6a4e2c86ac169cf389a88f71d9a96c7d6c28f" Apr 20 16:33:15.706728 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:15.706653 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13509zkt8" Apr 20 16:33:17.350085 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:17.350057 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-45msc_a0a53203-f6d4-43f0-a422-5ae876b369f1/ovn-acl-logging/0.log" Apr 20 16:33:17.350857 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:33:17.350829 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-45msc_a0a53203-f6d4-43f0-a422-5ae876b369f1/ovn-acl-logging/0.log" Apr 20 16:34:13.915414 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:13.915371 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-xqrl5"] Apr 20 16:34:43.293389 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.293358 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-6fb96bbb9-9hscp"] Apr 20 16:34:43.293793 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.293734 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5620178-1b26-4499-b152-cda62e6136bb" containerName="extract" Apr 20 16:34:43.293793 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.293747 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5620178-1b26-4499-b152-cda62e6136bb" containerName="extract" Apr 20 16:34:43.293793 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.293769 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5620178-1b26-4499-b152-cda62e6136bb" containerName="util" Apr 20 16:34:43.293793 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.293775 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5620178-1b26-4499-b152-cda62e6136bb" 
containerName="util" Apr 20 16:34:43.293793 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.293784 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5620178-1b26-4499-b152-cda62e6136bb" containerName="pull" Apr 20 16:34:43.293793 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.293789 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5620178-1b26-4499-b152-cda62e6136bb" containerName="pull" Apr 20 16:34:43.293977 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.293849 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5620178-1b26-4499-b152-cda62e6136bb" containerName="extract" Apr 20 16:34:43.295859 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.295841 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6fb96bbb9-9hscp" Apr 20 16:34:43.299376 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.299347 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-sxs7n\"" Apr 20 16:34:43.299376 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.299372 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 16:34:43.299560 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.299356 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 16:34:43.307340 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.307319 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6fb96bbb9-9hscp"] Apr 20 16:34:43.396755 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.396724 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/9bc22a92-af8a-4a58-b0f1-f20b74e375ce-maas-api-tls\") pod \"maas-api-6fb96bbb9-9hscp\" (UID: \"9bc22a92-af8a-4a58-b0f1-f20b74e375ce\") " pod="opendatahub/maas-api-6fb96bbb9-9hscp" Apr 20 16:34:43.396927 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.396792 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sh9j\" (UniqueName: \"kubernetes.io/projected/9bc22a92-af8a-4a58-b0f1-f20b74e375ce-kube-api-access-5sh9j\") pod \"maas-api-6fb96bbb9-9hscp\" (UID: \"9bc22a92-af8a-4a58-b0f1-f20b74e375ce\") " pod="opendatahub/maas-api-6fb96bbb9-9hscp" Apr 20 16:34:43.498076 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.498040 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sh9j\" (UniqueName: \"kubernetes.io/projected/9bc22a92-af8a-4a58-b0f1-f20b74e375ce-kube-api-access-5sh9j\") pod \"maas-api-6fb96bbb9-9hscp\" (UID: \"9bc22a92-af8a-4a58-b0f1-f20b74e375ce\") " pod="opendatahub/maas-api-6fb96bbb9-9hscp" Apr 20 16:34:43.498306 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.498186 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/9bc22a92-af8a-4a58-b0f1-f20b74e375ce-maas-api-tls\") pod \"maas-api-6fb96bbb9-9hscp\" (UID: \"9bc22a92-af8a-4a58-b0f1-f20b74e375ce\") " pod="opendatahub/maas-api-6fb96bbb9-9hscp" Apr 20 16:34:43.500877 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.500854 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: 
\"kubernetes.io/secret/9bc22a92-af8a-4a58-b0f1-f20b74e375ce-maas-api-tls\") pod \"maas-api-6fb96bbb9-9hscp\" (UID: \"9bc22a92-af8a-4a58-b0f1-f20b74e375ce\") " pod="opendatahub/maas-api-6fb96bbb9-9hscp" Apr 20 16:34:43.506154 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.506132 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sh9j\" (UniqueName: \"kubernetes.io/projected/9bc22a92-af8a-4a58-b0f1-f20b74e375ce-kube-api-access-5sh9j\") pod \"maas-api-6fb96bbb9-9hscp\" (UID: \"9bc22a92-af8a-4a58-b0f1-f20b74e375ce\") " pod="opendatahub/maas-api-6fb96bbb9-9hscp" Apr 20 16:34:43.606533 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.606437 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6fb96bbb9-9hscp" Apr 20 16:34:43.733187 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.733144 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6fb96bbb9-9hscp"] Apr 20 16:34:43.735547 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:34:43.735518 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bc22a92_af8a_4a58_b0f1_f20b74e375ce.slice/crio-a4a6abd06ba5e68be720d80bde95c84d4ee15270b27147a40995afac52166e18 WatchSource:0}: Error finding container a4a6abd06ba5e68be720d80bde95c84d4ee15270b27147a40995afac52166e18: Status 404 returned error can't find the container with id a4a6abd06ba5e68be720d80bde95c84d4ee15270b27147a40995afac52166e18 Apr 20 16:34:43.736756 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:43.736739 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 16:34:44.044847 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:44.044804 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6fb96bbb9-9hscp" event={"ID":"9bc22a92-af8a-4a58-b0f1-f20b74e375ce","Type":"ContainerStarted","Data":"a4a6abd06ba5e68be720d80bde95c84d4ee15270b27147a40995afac52166e18"} Apr 20 16:34:46.055469 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:46.055432 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6fb96bbb9-9hscp" event={"ID":"9bc22a92-af8a-4a58-b0f1-f20b74e375ce","Type":"ContainerStarted","Data":"3ad87af0218a183a8ef2072d0d3aa4a54eaadc17e1541e6f0bb99c9b73dbe1b1"} Apr 20 16:34:46.055899 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:46.055548 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-6fb96bbb9-9hscp" Apr 20 16:34:46.074530 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:46.074487 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-6fb96bbb9-9hscp" podStartSLOduration=0.936476996 podStartE2EDuration="3.074473161s" podCreationTimestamp="2026-04-20 16:34:43 +0000 UTC" firstStartedPulling="2026-04-20 16:34:43.736855668 +0000 UTC m=+686.857032166" lastFinishedPulling="2026-04-20 16:34:45.874851832 +0000 UTC m=+688.995028331" observedRunningTime="2026-04-20 16:34:46.071738109 +0000 UTC m=+689.191914629" watchObservedRunningTime="2026-04-20 16:34:46.074473161 +0000 UTC m=+689.194649735" Apr 20 16:34:52.066199 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:34:52.066153 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-6fb96bbb9-9hscp" Apr 20 16:35:29.001451 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:35:29.001415 2571 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-xqrl5"] Apr 20 16:35:41.001301 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:35:41.001258 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-xqrl5"] Apr 20 16:35:54.805630 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:35:54.805595 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-xqrl5"] Apr 20 16:35:57.996488 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:35:57.996452 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-xqrl5"] Apr 20 16:36:05.384812 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:05.384776 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-xqrl5"] Apr 20 16:36:52.008834 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.008753 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-56b86c4857-c4g4f"] Apr 20 16:36:52.011123 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.011107 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56b86c4857-c4g4f" Apr 20 16:36:52.014736 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.014714 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 20 16:36:52.014736 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.014743 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"authorino-oidc-ca\"" Apr 20 16:36:52.014887 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.014726 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-cbfmp\"" Apr 20 16:36:52.021488 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.021465 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-56b86c4857-c4g4f"] Apr 20 16:36:52.125860 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.125821 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/e5d61ad4-023c-4886-84b0-9c6f100535de-oidc-ca\") pod \"authorino-56b86c4857-c4g4f\" (UID: \"e5d61ad4-023c-4886-84b0-9c6f100535de\") " pod="kuadrant-system/authorino-56b86c4857-c4g4f" Apr 20 16:36:52.126042 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.125871 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lprz\" (UniqueName: \"kubernetes.io/projected/e5d61ad4-023c-4886-84b0-9c6f100535de-kube-api-access-6lprz\") pod \"authorino-56b86c4857-c4g4f\" (UID: \"e5d61ad4-023c-4886-84b0-9c6f100535de\") " pod="kuadrant-system/authorino-56b86c4857-c4g4f" Apr 20 16:36:52.126042 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.125947 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e5d61ad4-023c-4886-84b0-9c6f100535de-tls-cert\") pod \"authorino-56b86c4857-c4g4f\" (UID: \"e5d61ad4-023c-4886-84b0-9c6f100535de\") " pod="kuadrant-system/authorino-56b86c4857-c4g4f" Apr 20 16:36:52.227134 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.227093 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: 
\"kubernetes.io/configmap/e5d61ad4-023c-4886-84b0-9c6f100535de-oidc-ca\") pod \"authorino-56b86c4857-c4g4f\" (UID: \"e5d61ad4-023c-4886-84b0-9c6f100535de\") " pod="kuadrant-system/authorino-56b86c4857-c4g4f" Apr 20 16:36:52.227333 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.227144 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lprz\" (UniqueName: \"kubernetes.io/projected/e5d61ad4-023c-4886-84b0-9c6f100535de-kube-api-access-6lprz\") pod \"authorino-56b86c4857-c4g4f\" (UID: \"e5d61ad4-023c-4886-84b0-9c6f100535de\") " pod="kuadrant-system/authorino-56b86c4857-c4g4f" Apr 20 16:36:52.227333 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.227218 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e5d61ad4-023c-4886-84b0-9c6f100535de-tls-cert\") pod \"authorino-56b86c4857-c4g4f\" (UID: \"e5d61ad4-023c-4886-84b0-9c6f100535de\") " pod="kuadrant-system/authorino-56b86c4857-c4g4f" Apr 20 16:36:52.227786 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.227766 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/e5d61ad4-023c-4886-84b0-9c6f100535de-oidc-ca\") pod \"authorino-56b86c4857-c4g4f\" (UID: \"e5d61ad4-023c-4886-84b0-9c6f100535de\") " pod="kuadrant-system/authorino-56b86c4857-c4g4f" Apr 20 16:36:52.229701 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.229684 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e5d61ad4-023c-4886-84b0-9c6f100535de-tls-cert\") pod \"authorino-56b86c4857-c4g4f\" (UID: \"e5d61ad4-023c-4886-84b0-9c6f100535de\") " pod="kuadrant-system/authorino-56b86c4857-c4g4f" Apr 20 16:36:52.236435 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.236417 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lprz\" (UniqueName: \"kubernetes.io/projected/e5d61ad4-023c-4886-84b0-9c6f100535de-kube-api-access-6lprz\") pod \"authorino-56b86c4857-c4g4f\" (UID: \"e5d61ad4-023c-4886-84b0-9c6f100535de\") " pod="kuadrant-system/authorino-56b86c4857-c4g4f" Apr 20 16:36:52.321390 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.321290 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-56b86c4857-c4g4f" Apr 20 16:36:52.449698 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.449675 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-56b86c4857-c4g4f"] Apr 20 16:36:52.451666 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:36:52.451637 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5d61ad4_023c_4886_84b0_9c6f100535de.slice/crio-d3b2f84ec8978f6f073362a77d73db0ee0f47cac8e8fb9ce328545cbee2a5fb1 WatchSource:0}: Error finding container d3b2f84ec8978f6f073362a77d73db0ee0f47cac8e8fb9ce328545cbee2a5fb1: Status 404 returned error can't find the container with id d3b2f84ec8978f6f073362a77d73db0ee0f47cac8e8fb9ce328545cbee2a5fb1 Apr 20 16:36:52.545192 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:52.545133 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-56b86c4857-c4g4f" event={"ID":"e5d61ad4-023c-4886-84b0-9c6f100535de","Type":"ContainerStarted","Data":"d3b2f84ec8978f6f073362a77d73db0ee0f47cac8e8fb9ce328545cbee2a5fb1"} Apr 20 16:36:55.562745 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:55.562640 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-56b86c4857-c4g4f" event={"ID":"e5d61ad4-023c-4886-84b0-9c6f100535de","Type":"ContainerStarted","Data":"e7316ba36333a82a06c35ce4a70619e50e85ed8bfb9c11df0c9042c1ec812109"} Apr 20 16:36:55.582795 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:36:55.582736 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-56b86c4857-c4g4f" podStartSLOduration=1.76215685 podStartE2EDuration="4.582721142s" podCreationTimestamp="2026-04-20 16:36:51 +0000 UTC" firstStartedPulling="2026-04-20 16:36:52.453310016 +0000 UTC m=+815.573486518" lastFinishedPulling="2026-04-20 16:36:55.273874305 +0000 UTC m=+818.394050810" observedRunningTime="2026-04-20 16:36:55.582020546 +0000 UTC m=+818.702197067" watchObservedRunningTime="2026-04-20 16:36:55.582721142 +0000 UTC m=+818.702897653" Apr 20 16:37:01.734111 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:01.734074 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg"] Apr 20 16:37:01.736941 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:01.736925 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" Apr 20 16:37:01.739544 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:01.739521 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 16:37:01.739682 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:01.739521 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 16:37:01.739682 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:01.739524 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7rn5j\"" Apr 20 16:37:01.744400 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:01.744375 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg"] Apr 20 16:37:01.803536 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:01.803497 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gd2k\" (UniqueName: \"kubernetes.io/projected/6cf3eb02-c43c-4736-bb8f-233e8567cba9-kube-api-access-4gd2k\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg\" (UID: \"6cf3eb02-c43c-4736-bb8f-233e8567cba9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" Apr 20 16:37:01.803536 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:01.803543 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cf3eb02-c43c-4736-bb8f-233e8567cba9-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg\" (UID: \"6cf3eb02-c43c-4736-bb8f-233e8567cba9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" Apr 20 16:37:01.803762 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:01.803652 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cf3eb02-c43c-4736-bb8f-233e8567cba9-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg\" (UID: \"6cf3eb02-c43c-4736-bb8f-233e8567cba9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" Apr 20 16:37:01.905117 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:01.905077 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gd2k\" (UniqueName: \"kubernetes.io/projected/6cf3eb02-c43c-4736-bb8f-233e8567cba9-kube-api-access-4gd2k\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg\" (UID: \"6cf3eb02-c43c-4736-bb8f-233e8567cba9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" Apr 20 16:37:01.905333 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:01.905130 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cf3eb02-c43c-4736-bb8f-233e8567cba9-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg\" (UID: \"6cf3eb02-c43c-4736-bb8f-233e8567cba9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" Apr 20 16:37:01.905333 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:01.905240 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cf3eb02-c43c-4736-bb8f-233e8567cba9-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg\" (UID: \"6cf3eb02-c43c-4736-bb8f-233e8567cba9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" Apr 20 16:37:01.905633 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:01.905612 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cf3eb02-c43c-4736-bb8f-233e8567cba9-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg\" (UID: \"6cf3eb02-c43c-4736-bb8f-233e8567cba9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" Apr 20 16:37:01.905706 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:01.905644 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cf3eb02-c43c-4736-bb8f-233e8567cba9-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg\" (UID: \"6cf3eb02-c43c-4736-bb8f-233e8567cba9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" Apr 20 16:37:01.914084 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:01.914058 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gd2k\" (UniqueName: \"kubernetes.io/projected/6cf3eb02-c43c-4736-bb8f-233e8567cba9-kube-api-access-4gd2k\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg\" (UID: \"6cf3eb02-c43c-4736-bb8f-233e8567cba9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" Apr 20 16:37:02.048546 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:02.048443 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" Apr 20 16:37:02.184560 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:02.184533 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg"] Apr 20 16:37:02.186520 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:37:02.186491 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cf3eb02_c43c_4736_bb8f_233e8567cba9.slice/crio-d5a321877c5756ddc5545eac449fa87605d3f93d63444bfa6f35bd72e3b54a93 WatchSource:0}: Error finding container d5a321877c5756ddc5545eac449fa87605d3f93d63444bfa6f35bd72e3b54a93: Status 404 returned error can't find the container with id d5a321877c5756ddc5545eac449fa87605d3f93d63444bfa6f35bd72e3b54a93 Apr 20 16:37:02.588827 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:02.588788 2571 generic.go:358] "Generic (PLEG): container finished" podID="6cf3eb02-c43c-4736-bb8f-233e8567cba9" containerID="6d08447603f3619092b13e1c89669dbd1cc428405e19cc3a66fd8f94e64b64c9" exitCode=0 Apr 20 16:37:02.589002 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:02.588868 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" event={"ID":"6cf3eb02-c43c-4736-bb8f-233e8567cba9","Type":"ContainerDied","Data":"6d08447603f3619092b13e1c89669dbd1cc428405e19cc3a66fd8f94e64b64c9"} Apr 20 16:37:02.589002 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:02.588904 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" event={"ID":"6cf3eb02-c43c-4736-bb8f-233e8567cba9","Type":"ContainerStarted","Data":"d5a321877c5756ddc5545eac449fa87605d3f93d63444bfa6f35bd72e3b54a93"} Apr 20 16:37:04.599131 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:04.599088 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" event={"ID":"6cf3eb02-c43c-4736-bb8f-233e8567cba9","Type":"ContainerStarted","Data":"f6b51ea11a6ed29c98aef547f37e796ca6928d8d5b3ff57376b5e19e99efa346"} Apr 20 16:37:05.604874 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:05.604838 2571 generic.go:358] "Generic (PLEG): container finished" podID="6cf3eb02-c43c-4736-bb8f-233e8567cba9" containerID="f6b51ea11a6ed29c98aef547f37e796ca6928d8d5b3ff57376b5e19e99efa346" exitCode=0 Apr 20 16:37:05.605367 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:05.604930 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" event={"ID":"6cf3eb02-c43c-4736-bb8f-233e8567cba9","Type":"ContainerDied","Data":"f6b51ea11a6ed29c98aef547f37e796ca6928d8d5b3ff57376b5e19e99efa346"} Apr 20 16:37:06.610890 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:06.610858 2571 generic.go:358] "Generic (PLEG): container finished" podID="6cf3eb02-c43c-4736-bb8f-233e8567cba9" containerID="93fa6715e95c80300a4b8ab40b0c7668ad7187e069c577c076b8400783fd3059" exitCode=0 Apr 20 16:37:06.611297 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:06.610945 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" 
event={"ID":"6cf3eb02-c43c-4736-bb8f-233e8567cba9","Type":"ContainerDied","Data":"93fa6715e95c80300a4b8ab40b0c7668ad7187e069c577c076b8400783fd3059"} Apr 20 16:37:07.742124 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:07.742102 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" Apr 20 16:37:07.756766 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:07.756744 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gd2k\" (UniqueName: \"kubernetes.io/projected/6cf3eb02-c43c-4736-bb8f-233e8567cba9-kube-api-access-4gd2k\") pod \"6cf3eb02-c43c-4736-bb8f-233e8567cba9\" (UID: \"6cf3eb02-c43c-4736-bb8f-233e8567cba9\") " Apr 20 16:37:07.756871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:07.756825 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cf3eb02-c43c-4736-bb8f-233e8567cba9-bundle\") pod \"6cf3eb02-c43c-4736-bb8f-233e8567cba9\" (UID: \"6cf3eb02-c43c-4736-bb8f-233e8567cba9\") " Apr 20 16:37:07.756871 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:07.756862 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cf3eb02-c43c-4736-bb8f-233e8567cba9-util\") pod \"6cf3eb02-c43c-4736-bb8f-233e8567cba9\" (UID: \"6cf3eb02-c43c-4736-bb8f-233e8567cba9\") " Apr 20 16:37:07.757572 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:07.757537 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cf3eb02-c43c-4736-bb8f-233e8567cba9-bundle" (OuterVolumeSpecName: "bundle") pod "6cf3eb02-c43c-4736-bb8f-233e8567cba9" (UID: "6cf3eb02-c43c-4736-bb8f-233e8567cba9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:37:07.759439 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:07.759383 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf3eb02-c43c-4736-bb8f-233e8567cba9-kube-api-access-4gd2k" (OuterVolumeSpecName: "kube-api-access-4gd2k") pod "6cf3eb02-c43c-4736-bb8f-233e8567cba9" (UID: "6cf3eb02-c43c-4736-bb8f-233e8567cba9"). InnerVolumeSpecName "kube-api-access-4gd2k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:37:07.762099 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:07.762073 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cf3eb02-c43c-4736-bb8f-233e8567cba9-util" (OuterVolumeSpecName: "util") pod "6cf3eb02-c43c-4736-bb8f-233e8567cba9" (UID: "6cf3eb02-c43c-4736-bb8f-233e8567cba9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:37:07.858227 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:07.858191 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4gd2k\" (UniqueName: \"kubernetes.io/projected/6cf3eb02-c43c-4736-bb8f-233e8567cba9-kube-api-access-4gd2k\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:37:07.858227 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:07.858225 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cf3eb02-c43c-4736-bb8f-233e8567cba9-bundle\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:37:07.858227 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:07.858235 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cf3eb02-c43c-4736-bb8f-233e8567cba9-util\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:37:08.620655 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:08.620620 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" event={"ID":"6cf3eb02-c43c-4736-bb8f-233e8567cba9","Type":"ContainerDied","Data":"d5a321877c5756ddc5545eac449fa87605d3f93d63444bfa6f35bd72e3b54a93"} Apr 20 16:37:08.620655 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:08.620660 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5a321877c5756ddc5545eac449fa87605d3f93d63444bfa6f35bd72e3b54a93" Apr 20 16:37:08.620860 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:08.620698 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dwvjkg" Apr 20 16:37:17.395175 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.395138 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mnqtt"] Apr 20 16:37:17.395624 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.395509 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cf3eb02-c43c-4736-bb8f-233e8567cba9" containerName="pull" Apr 20 16:37:17.395624 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.395519 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf3eb02-c43c-4736-bb8f-233e8567cba9" containerName="pull" Apr 20 16:37:17.395624 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.395529 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cf3eb02-c43c-4736-bb8f-233e8567cba9" containerName="util" Apr 20 16:37:17.395624 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.395535 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf3eb02-c43c-4736-bb8f-233e8567cba9" containerName="util" Apr 20 16:37:17.395624 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.395544 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cf3eb02-c43c-4736-bb8f-233e8567cba9" containerName="extract" Apr 20 16:37:17.395624 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.395549 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf3eb02-c43c-4736-bb8f-233e8567cba9" containerName="extract" Apr 20 16:37:17.395624 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.395609 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="6cf3eb02-c43c-4736-bb8f-233e8567cba9" containerName="extract" Apr 20 
16:37:17.402689 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.402664 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mnqtt" Apr 20 16:37:17.410021 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.409989 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mnqtt"] Apr 20 16:37:17.445839 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.445812 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df4e0365-d0bd-47e3-b3ca-8516ae1f73f0-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-mnqtt\" (UID: \"df4e0365-d0bd-47e3-b3ca-8516ae1f73f0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mnqtt" Apr 20 16:37:17.445997 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.445873 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zsj5\" (UniqueName: \"kubernetes.io/projected/df4e0365-d0bd-47e3-b3ca-8516ae1f73f0-kube-api-access-2zsj5\") pod \"cert-manager-operator-controller-manager-54b9655956-mnqtt\" (UID: \"df4e0365-d0bd-47e3-b3ca-8516ae1f73f0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mnqtt" Apr 20 16:37:17.547100 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.547060 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zsj5\" (UniqueName: \"kubernetes.io/projected/df4e0365-d0bd-47e3-b3ca-8516ae1f73f0-kube-api-access-2zsj5\") pod \"cert-manager-operator-controller-manager-54b9655956-mnqtt\" (UID: \"df4e0365-d0bd-47e3-b3ca-8516ae1f73f0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mnqtt" Apr 20 16:37:17.547328 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.547203 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df4e0365-d0bd-47e3-b3ca-8516ae1f73f0-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-mnqtt\" (UID: \"df4e0365-d0bd-47e3-b3ca-8516ae1f73f0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mnqtt" Apr 20 16:37:17.547628 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.547611 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df4e0365-d0bd-47e3-b3ca-8516ae1f73f0-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-mnqtt\" (UID: \"df4e0365-d0bd-47e3-b3ca-8516ae1f73f0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mnqtt" Apr 20 16:37:17.556740 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.556707 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zsj5\" (UniqueName: \"kubernetes.io/projected/df4e0365-d0bd-47e3-b3ca-8516ae1f73f0-kube-api-access-2zsj5\") pod \"cert-manager-operator-controller-manager-54b9655956-mnqtt\" (UID: \"df4e0365-d0bd-47e3-b3ca-8516ae1f73f0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mnqtt" Apr 20 16:37:17.713640 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.713536 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mnqtt" Apr 20 16:37:17.847933 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:17.847906 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mnqtt"] Apr 20 16:37:17.849839 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:37:17.849812 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf4e0365_d0bd_47e3_b3ca_8516ae1f73f0.slice/crio-be2fa4902287cf868e50a4d9c81c5dbbfae5b32b8ae16cabb383256d86f012f7 WatchSource:0}: Error finding container be2fa4902287cf868e50a4d9c81c5dbbfae5b32b8ae16cabb383256d86f012f7: Status 404 returned error can't find the container with id be2fa4902287cf868e50a4d9c81c5dbbfae5b32b8ae16cabb383256d86f012f7 Apr 20 16:37:18.668105 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:18.668057 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mnqtt" event={"ID":"df4e0365-d0bd-47e3-b3ca-8516ae1f73f0","Type":"ContainerStarted","Data":"be2fa4902287cf868e50a4d9c81c5dbbfae5b32b8ae16cabb383256d86f012f7"} Apr 20 16:37:19.673476 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:19.673425 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mnqtt" event={"ID":"df4e0365-d0bd-47e3-b3ca-8516ae1f73f0","Type":"ContainerStarted","Data":"ff227c20ee7f5dc2b58bb09c69b9bcbcced3340c75798c40b88a322145ac0e6d"} Apr 20 16:37:19.701076 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:19.701015 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-mnqtt" podStartSLOduration=1.272566279 podStartE2EDuration="2.700997554s" podCreationTimestamp="2026-04-20 16:37:17 +0000 UTC" firstStartedPulling="2026-04-20 16:37:17.852727944 +0000 UTC m=+840.972904445" lastFinishedPulling="2026-04-20 16:37:19.281159219 +0000 UTC m=+842.401335720" observedRunningTime="2026-04-20 16:37:19.699532223 +0000 UTC m=+842.819708744" watchObservedRunningTime="2026-04-20 16:37:19.700997554 +0000 UTC m=+842.821174074" Apr 20 16:37:19.741798 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:19.741762 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf"] Apr 20 16:37:19.742032 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:19.742003 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf" podUID="dbd35da9-b962-49b2-87f7-6d80e50b90ea" containerName="cert-manager-operator" containerID="cri-o://4d1f2d1b687237d47900c128937abced0d19eb0030f50fb95a0450ff0321be59" gracePeriod=10 Apr 20 16:37:20.024555 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.024525 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-xq9vc"] Apr 20 16:37:20.027467 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.027441 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf" Apr 20 16:37:20.028650 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.028625 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-mjg7v"] Apr 20 16:37:20.028826 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.028770 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-xq9vc" Apr 20 16:37:20.029004 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.028983 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dbd35da9-b962-49b2-87f7-6d80e50b90ea" containerName="cert-manager-operator" Apr 20 16:37:20.029004 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.029003 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd35da9-b962-49b2-87f7-6d80e50b90ea" containerName="cert-manager-operator" Apr 20 16:37:20.029214 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.029127 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="dbd35da9-b962-49b2-87f7-6d80e50b90ea" containerName="cert-manager-operator" Apr 20 16:37:20.032449 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.032427 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-mjg7v" Apr 20 16:37:20.033305 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.033283 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-x4lgr"] Apr 20 16:37:20.036834 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.036816 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-x4lgr" Apr 20 16:37:20.040665 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.040640 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-4m49s\"" Apr 20 16:37:20.050659 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.050325 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-xq9vc"] Apr 20 16:37:20.053127 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.053031 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-mjg7v"] Apr 20 16:37:20.056744 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.056698 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-x4lgr"] Apr 20 16:37:20.071332 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.071299 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvvxs\" (UniqueName: \"kubernetes.io/projected/dbd35da9-b962-49b2-87f7-6d80e50b90ea-kube-api-access-zvvxs\") pod \"dbd35da9-b962-49b2-87f7-6d80e50b90ea\" (UID: \"dbd35da9-b962-49b2-87f7-6d80e50b90ea\") " Apr 20 16:37:20.071465 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.071355 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dbd35da9-b962-49b2-87f7-6d80e50b90ea-tmp\") pod \"dbd35da9-b962-49b2-87f7-6d80e50b90ea\" (UID: \"dbd35da9-b962-49b2-87f7-6d80e50b90ea\") " Apr 20 16:37:20.071750 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.071520 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/07d1ad7b-eb82-4c2c-a46e-dbb46ad2342d-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-xq9vc\" (UID: \"07d1ad7b-eb82-4c2c-a46e-dbb46ad2342d\") " pod="cert-manager/cert-manager-webhook-587ccfb98-xq9vc" Apr 20 16:37:20.071750 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.071555 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxrqb\" (UniqueName: \"kubernetes.io/projected/07d1ad7b-eb82-4c2c-a46e-dbb46ad2342d-kube-api-access-sxrqb\") pod \"cert-manager-webhook-587ccfb98-xq9vc\" (UID: \"07d1ad7b-eb82-4c2c-a46e-dbb46ad2342d\") " pod="cert-manager/cert-manager-webhook-587ccfb98-xq9vc" Apr 20 16:37:20.071959 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.071933 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbd35da9-b962-49b2-87f7-6d80e50b90ea-tmp" (OuterVolumeSpecName: "tmp") pod "dbd35da9-b962-49b2-87f7-6d80e50b90ea" (UID: "dbd35da9-b962-49b2-87f7-6d80e50b90ea"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 16:37:20.074022 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.073993 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd35da9-b962-49b2-87f7-6d80e50b90ea-kube-api-access-zvvxs" (OuterVolumeSpecName: "kube-api-access-zvvxs") pod "dbd35da9-b962-49b2-87f7-6d80e50b90ea" (UID: "dbd35da9-b962-49b2-87f7-6d80e50b90ea"). InnerVolumeSpecName "kube-api-access-zvvxs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:37:20.172552 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.172521 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdt7j\" (UniqueName: \"kubernetes.io/projected/a5a01b71-33c8-4b2b-acea-4fb5f8b7e03e-kube-api-access-mdt7j\") pod \"cert-manager-79c8d999ff-x4lgr\" (UID: \"a5a01b71-33c8-4b2b-acea-4fb5f8b7e03e\") " pod="cert-manager/cert-manager-79c8d999ff-x4lgr" Apr 20 16:37:20.172714 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.172559 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86b74fbe-ce20-4a3f-ace7-2315e151284e-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-mjg7v\" (UID: \"86b74fbe-ce20-4a3f-ace7-2315e151284e\") " pod="cert-manager/cert-manager-cainjector-68b757865b-mjg7v" Apr 20 16:37:20.172714 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.172616 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5a01b71-33c8-4b2b-acea-4fb5f8b7e03e-bound-sa-token\") pod \"cert-manager-79c8d999ff-x4lgr\" (UID: \"a5a01b71-33c8-4b2b-acea-4fb5f8b7e03e\") " pod="cert-manager/cert-manager-79c8d999ff-x4lgr" Apr 20 16:37:20.172714 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.172661 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b57n\" (UniqueName: \"kubernetes.io/projected/86b74fbe-ce20-4a3f-ace7-2315e151284e-kube-api-access-7b57n\") pod \"cert-manager-cainjector-68b757865b-mjg7v\" (UID: \"86b74fbe-ce20-4a3f-ace7-2315e151284e\") " pod="cert-manager/cert-manager-cainjector-68b757865b-mjg7v" Apr 20 16:37:20.172714 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.172706 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07d1ad7b-eb82-4c2c-a46e-dbb46ad2342d-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-xq9vc\" (UID: \"07d1ad7b-eb82-4c2c-a46e-dbb46ad2342d\") " pod="cert-manager/cert-manager-webhook-587ccfb98-xq9vc" Apr 20 16:37:20.172865 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.172726 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxrqb\" (UniqueName: \"kubernetes.io/projected/07d1ad7b-eb82-4c2c-a46e-dbb46ad2342d-kube-api-access-sxrqb\") pod \"cert-manager-webhook-587ccfb98-xq9vc\" (UID: \"07d1ad7b-eb82-4c2c-a46e-dbb46ad2342d\") " pod="cert-manager/cert-manager-webhook-587ccfb98-xq9vc" Apr 20 16:37:20.172899 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.172877 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zvvxs\" (UniqueName: \"kubernetes.io/projected/dbd35da9-b962-49b2-87f7-6d80e50b90ea-kube-api-access-zvvxs\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:37:20.172933 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.172903 2571 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dbd35da9-b962-49b2-87f7-6d80e50b90ea-tmp\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:37:20.199447 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.199383 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxrqb\" (UniqueName: \"kubernetes.io/projected/07d1ad7b-eb82-4c2c-a46e-dbb46ad2342d-kube-api-access-sxrqb\") pod \"cert-manager-webhook-587ccfb98-xq9vc\" (UID: \"07d1ad7b-eb82-4c2c-a46e-dbb46ad2342d\") " pod="cert-manager/cert-manager-webhook-587ccfb98-xq9vc" Apr 20 16:37:20.202262 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.202244 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07d1ad7b-eb82-4c2c-a46e-dbb46ad2342d-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-xq9vc\" (UID: \"07d1ad7b-eb82-4c2c-a46e-dbb46ad2342d\") " pod="cert-manager/cert-manager-webhook-587ccfb98-xq9vc" Apr 20 16:37:20.274142 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.274102 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7b57n\" (UniqueName: \"kubernetes.io/projected/86b74fbe-ce20-4a3f-ace7-2315e151284e-kube-api-access-7b57n\") pod \"cert-manager-cainjector-68b757865b-mjg7v\" (UID: \"86b74fbe-ce20-4a3f-ace7-2315e151284e\") " pod="cert-manager/cert-manager-cainjector-68b757865b-mjg7v" Apr 20 16:37:20.274317 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.274206 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdt7j\" (UniqueName: \"kubernetes.io/projected/a5a01b71-33c8-4b2b-acea-4fb5f8b7e03e-kube-api-access-mdt7j\") pod \"cert-manager-79c8d999ff-x4lgr\" (UID: \"a5a01b71-33c8-4b2b-acea-4fb5f8b7e03e\") " pod="cert-manager/cert-manager-79c8d999ff-x4lgr" Apr 20 16:37:20.274317 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.274237 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86b74fbe-ce20-4a3f-ace7-2315e151284e-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-mjg7v\" (UID: \"86b74fbe-ce20-4a3f-ace7-2315e151284e\") " pod="cert-manager/cert-manager-cainjector-68b757865b-mjg7v" Apr 20 16:37:20.274317 ip-10-0-130-72 kubenswrapper[2571]: I0420 
16:37:20.274290 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5a01b71-33c8-4b2b-acea-4fb5f8b7e03e-bound-sa-token\") pod \"cert-manager-79c8d999ff-x4lgr\" (UID: \"a5a01b71-33c8-4b2b-acea-4fb5f8b7e03e\") " pod="cert-manager/cert-manager-79c8d999ff-x4lgr" Apr 20 16:37:20.292103 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.292064 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86b74fbe-ce20-4a3f-ace7-2315e151284e-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-mjg7v\" (UID: \"86b74fbe-ce20-4a3f-ace7-2315e151284e\") " pod="cert-manager/cert-manager-cainjector-68b757865b-mjg7v" Apr 20 16:37:20.292251 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.292144 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5a01b71-33c8-4b2b-acea-4fb5f8b7e03e-bound-sa-token\") pod \"cert-manager-79c8d999ff-x4lgr\" (UID: \"a5a01b71-33c8-4b2b-acea-4fb5f8b7e03e\") " pod="cert-manager/cert-manager-79c8d999ff-x4lgr" Apr 20 16:37:20.292251 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.292209 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdt7j\" (UniqueName: \"kubernetes.io/projected/a5a01b71-33c8-4b2b-acea-4fb5f8b7e03e-kube-api-access-mdt7j\") pod \"cert-manager-79c8d999ff-x4lgr\" (UID: \"a5a01b71-33c8-4b2b-acea-4fb5f8b7e03e\") " pod="cert-manager/cert-manager-79c8d999ff-x4lgr" Apr 20 16:37:20.292338 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.292293 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b57n\" (UniqueName: \"kubernetes.io/projected/86b74fbe-ce20-4a3f-ace7-2315e151284e-kube-api-access-7b57n\") pod \"cert-manager-cainjector-68b757865b-mjg7v\" (UID: \"86b74fbe-ce20-4a3f-ace7-2315e151284e\") " pod="cert-manager/cert-manager-cainjector-68b757865b-mjg7v" Apr 20 16:37:20.343921 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.343888 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-xq9vc" Apr 20 16:37:20.351691 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.351668 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-mjg7v" Apr 20 16:37:20.359508 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.359478 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-x4lgr" Apr 20 16:37:20.535991 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.535963 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-xq9vc"] Apr 20 16:37:20.537060 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:37:20.537032 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07d1ad7b_eb82_4c2c_a46e_dbb46ad2342d.slice/crio-7c82722d8644a88869be6166afaee081387cebe38af54fcff57f6b8450a7685a WatchSource:0}: Error finding container 7c82722d8644a88869be6166afaee081387cebe38af54fcff57f6b8450a7685a: Status 404 returned error can't find the container with id 7c82722d8644a88869be6166afaee081387cebe38af54fcff57f6b8450a7685a Apr 20 16:37:20.678215 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.678155 2571 generic.go:358] "Generic (PLEG): container finished" podID="dbd35da9-b962-49b2-87f7-6d80e50b90ea" containerID="4d1f2d1b687237d47900c128937abced0d19eb0030f50fb95a0450ff0321be59" exitCode=0 Apr 20 16:37:20.678652 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.678235 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf" Apr 20 16:37:20.678652 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.678253 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf" event={"ID":"dbd35da9-b962-49b2-87f7-6d80e50b90ea","Type":"ContainerDied","Data":"4d1f2d1b687237d47900c128937abced0d19eb0030f50fb95a0450ff0321be59"} Apr 20 16:37:20.678652 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.678288 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf" event={"ID":"dbd35da9-b962-49b2-87f7-6d80e50b90ea","Type":"ContainerDied","Data":"da783a7a5e0ad8bb41b457027c0c6b0b0d853b5ea4351e9ef1f6fe73d75fcdbb"} Apr 20 16:37:20.678652 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.678307 2571 scope.go:117] "RemoveContainer" containerID="4d1f2d1b687237d47900c128937abced0d19eb0030f50fb95a0450ff0321be59" Apr 20 16:37:20.679457 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.679436 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-xq9vc" event={"ID":"07d1ad7b-eb82-4c2c-a46e-dbb46ad2342d","Type":"ContainerStarted","Data":"7c82722d8644a88869be6166afaee081387cebe38af54fcff57f6b8450a7685a"} Apr 20 16:37:20.687366 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.687349 2571 scope.go:117] "RemoveContainer" containerID="4d1f2d1b687237d47900c128937abced0d19eb0030f50fb95a0450ff0321be59" Apr 20 16:37:20.687611 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:37:20.687592 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1f2d1b687237d47900c128937abced0d19eb0030f50fb95a0450ff0321be59\": container with ID starting with 4d1f2d1b687237d47900c128937abced0d19eb0030f50fb95a0450ff0321be59 not found: ID does not exist" containerID="4d1f2d1b687237d47900c128937abced0d19eb0030f50fb95a0450ff0321be59" Apr 20 16:37:20.687667 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.687619 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4d1f2d1b687237d47900c128937abced0d19eb0030f50fb95a0450ff0321be59"} err="failed to get container status \"4d1f2d1b687237d47900c128937abced0d19eb0030f50fb95a0450ff0321be59\": rpc error: code = NotFound desc = could not find container \"4d1f2d1b687237d47900c128937abced0d19eb0030f50fb95a0450ff0321be59\": container with ID starting with 4d1f2d1b687237d47900c128937abced0d19eb0030f50fb95a0450ff0321be59 not found: ID does not exist" Apr 20 16:37:20.716645 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.716576 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf"] Apr 20 16:37:20.725470 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.725439 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-54qdf"] Apr 20 16:37:20.772208 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.772012 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-mjg7v"] Apr 20 16:37:20.772208 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:20.772062 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-x4lgr"] Apr 20 16:37:20.776313 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:37:20.776198 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5a01b71_33c8_4b2b_acea_4fb5f8b7e03e.slice/crio-1556c767af97e6f5b38135ccdba0100991353750e3ea6283b2e9d76ff482bf39 WatchSource:0}: Error finding container 1556c767af97e6f5b38135ccdba0100991353750e3ea6283b2e9d76ff482bf39: Status 404 returned error can't find the container with id 1556c767af97e6f5b38135ccdba0100991353750e3ea6283b2e9d76ff482bf39 Apr 20 16:37:21.426365 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:21.426330 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd35da9-b962-49b2-87f7-6d80e50b90ea" path="/var/lib/kubelet/pods/dbd35da9-b962-49b2-87f7-6d80e50b90ea/volumes" Apr 20 16:37:21.687717 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:21.687597 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-x4lgr" event={"ID":"a5a01b71-33c8-4b2b-acea-4fb5f8b7e03e","Type":"ContainerStarted","Data":"1556c767af97e6f5b38135ccdba0100991353750e3ea6283b2e9d76ff482bf39"} Apr 20 16:37:21.692547 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:21.692239 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-mjg7v" event={"ID":"86b74fbe-ce20-4a3f-ace7-2315e151284e","Type":"ContainerStarted","Data":"036837b78b161bd976841ffa4da69fd3fa0fbbd6d2ada325166a9737cb010a88"} Apr 20 16:37:23.702126 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:23.702089 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-x4lgr" event={"ID":"a5a01b71-33c8-4b2b-acea-4fb5f8b7e03e","Type":"ContainerStarted","Data":"6da1066881876ce7b2bdb5844466a2e1e9b6cd27fc18dd594030933bf588a1cf"} Apr 20 16:37:23.703658 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:23.703627 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-xq9vc" event={"ID":"07d1ad7b-eb82-4c2c-a46e-dbb46ad2342d","Type":"ContainerStarted","Data":"58e912c695cb612902a009492b603ace8ae4067a044ac9f845030009d7b63023"} Apr 20 16:37:23.703780 ip-10-0-130-72 kubenswrapper[2571]: I0420 
16:37:23.703682 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-xq9vc" Apr 20 16:37:23.705105 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:23.705084 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-mjg7v" event={"ID":"86b74fbe-ce20-4a3f-ace7-2315e151284e","Type":"ContainerStarted","Data":"37df19c3c6e3a946318c5d6beb8b06cd62df6b3f08ec0ceb143ab2e99de6046c"} Apr 20 16:37:23.719209 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:23.719142 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-x4lgr" podStartSLOduration=1.325380456 podStartE2EDuration="3.719127318s" podCreationTimestamp="2026-04-20 16:37:20 +0000 UTC" firstStartedPulling="2026-04-20 16:37:20.77811825 +0000 UTC m=+843.898294747" lastFinishedPulling="2026-04-20 16:37:23.171865096 +0000 UTC m=+846.292041609" observedRunningTime="2026-04-20 16:37:23.717612949 +0000 UTC m=+846.837789470" watchObservedRunningTime="2026-04-20 16:37:23.719127318 +0000 UTC m=+846.839303838" Apr 20 16:37:23.735090 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:23.735036 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-xq9vc" podStartSLOduration=2.117187245 podStartE2EDuration="4.735026034s" podCreationTimestamp="2026-04-20 16:37:19 +0000 UTC" firstStartedPulling="2026-04-20 16:37:20.539515518 +0000 UTC m=+843.659692016" lastFinishedPulling="2026-04-20 16:37:23.157354305 +0000 UTC m=+846.277530805" observedRunningTime="2026-04-20 16:37:23.732001399 +0000 UTC m=+846.852177920" watchObservedRunningTime="2026-04-20 16:37:23.735026034 +0000 UTC m=+846.855202554" Apr 20 16:37:23.755503 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:23.755282 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-mjg7v" podStartSLOduration=2.367725019 podStartE2EDuration="4.755263449s" podCreationTimestamp="2026-04-20 16:37:19 +0000 UTC" firstStartedPulling="2026-04-20 16:37:20.774517669 +0000 UTC m=+843.894694167" lastFinishedPulling="2026-04-20 16:37:23.162056099 +0000 UTC m=+846.282232597" observedRunningTime="2026-04-20 16:37:23.752239244 +0000 UTC m=+846.872415767" watchObservedRunningTime="2026-04-20 16:37:23.755263449 +0000 UTC m=+846.875439970" Apr 20 16:37:23.784851 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:23.784815 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-tx5ll"] Apr 20 16:37:23.785095 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:23.785069 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="cert-manager/cert-manager-cainjector-8966b78d4-tx5ll" podUID="00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1" containerName="cert-manager-cainjector" containerID="cri-o://835186f8996e95fbcddb4bae083c93bd55c1a69109b3c64eca71b31c06491bf6" gracePeriod=30 Apr 20 16:37:24.025001 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:24.024978 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-tx5ll" Apr 20 16:37:24.108622 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:24.108585 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1-bound-sa-token\") pod \"00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1\" (UID: \"00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1\") " Apr 20 16:37:24.108786 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:24.108648 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6zds\" (UniqueName: \"kubernetes.io/projected/00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1-kube-api-access-z6zds\") pod \"00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1\" (UID: \"00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1\") " Apr 20 16:37:24.110852 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:24.110824 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1" (UID: "00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:37:24.110852 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:24.110842 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1-kube-api-access-z6zds" (OuterVolumeSpecName: "kube-api-access-z6zds") pod "00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1" (UID: "00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1"). InnerVolumeSpecName "kube-api-access-z6zds". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:37:24.210275 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:24.210193 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1-bound-sa-token\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:37:24.210275 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:24.210225 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z6zds\" (UniqueName: \"kubernetes.io/projected/00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1-kube-api-access-z6zds\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:37:24.709644 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:24.709600 2571 generic.go:358] "Generic (PLEG): container finished" podID="00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1" containerID="835186f8996e95fbcddb4bae083c93bd55c1a69109b3c64eca71b31c06491bf6" exitCode=0 Apr 20 16:37:24.710086 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:24.709650 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-tx5ll" Apr 20 16:37:24.710086 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:24.709692 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-tx5ll" event={"ID":"00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1","Type":"ContainerDied","Data":"835186f8996e95fbcddb4bae083c93bd55c1a69109b3c64eca71b31c06491bf6"} Apr 20 16:37:24.710086 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:24.709744 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-tx5ll" event={"ID":"00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1","Type":"ContainerDied","Data":"19ac8885be9663861f6d174a9fb98de6c788a50185d5b53edefc5569efe3c95b"} Apr 20 16:37:24.710086 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:24.709762 2571 scope.go:117] "RemoveContainer" containerID="835186f8996e95fbcddb4bae083c93bd55c1a69109b3c64eca71b31c06491bf6" Apr 20 16:37:24.724072 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:24.724050 2571 scope.go:117] "RemoveContainer" containerID="835186f8996e95fbcddb4bae083c93bd55c1a69109b3c64eca71b31c06491bf6" Apr 20 16:37:24.724344 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:37:24.724323 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"835186f8996e95fbcddb4bae083c93bd55c1a69109b3c64eca71b31c06491bf6\": container with ID starting with 835186f8996e95fbcddb4bae083c93bd55c1a69109b3c64eca71b31c06491bf6 not found: ID does not exist" containerID="835186f8996e95fbcddb4bae083c93bd55c1a69109b3c64eca71b31c06491bf6" Apr 20 16:37:24.724393 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:24.724352 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835186f8996e95fbcddb4bae083c93bd55c1a69109b3c64eca71b31c06491bf6"} err="failed to get container status \"835186f8996e95fbcddb4bae083c93bd55c1a69109b3c64eca71b31c06491bf6\": rpc error: code = NotFound desc = could not find container \"835186f8996e95fbcddb4bae083c93bd55c1a69109b3c64eca71b31c06491bf6\": container with ID starting with 835186f8996e95fbcddb4bae083c93bd55c1a69109b3c64eca71b31c06491bf6 not found: ID does not exist" Apr 20 16:37:24.735687 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:24.735663 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-tx5ll"] Apr 20 16:37:24.740048 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:24.740017 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-tx5ll"] Apr 20 16:37:25.422425 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:25.422393 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1" path="/var/lib/kubelet/pods/00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1/volumes" Apr 20 16:37:29.712285 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:29.712252 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-xq9vc" Apr 20 16:37:29.763564 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:29.763531 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-qgzqq"] Apr 20 16:37:29.763791 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:29.763752 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" 
podUID="395c99df-a1dd-44da-a0aa-fa7e2b9c411e" containerName="cert-manager-webhook" containerID="cri-o://41d680e43e056ce7dc6abcfc0b3a474110bd6b977821f554e87915691cbf110c" gracePeriod=30 Apr 20 16:37:30.018566 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:30.018539 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" Apr 20 16:37:30.165553 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:30.165516 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbfj7\" (UniqueName: \"kubernetes.io/projected/395c99df-a1dd-44da-a0aa-fa7e2b9c411e-kube-api-access-cbfj7\") pod \"395c99df-a1dd-44da-a0aa-fa7e2b9c411e\" (UID: \"395c99df-a1dd-44da-a0aa-fa7e2b9c411e\") " Apr 20 16:37:30.165735 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:30.165603 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/395c99df-a1dd-44da-a0aa-fa7e2b9c411e-bound-sa-token\") pod \"395c99df-a1dd-44da-a0aa-fa7e2b9c411e\" (UID: \"395c99df-a1dd-44da-a0aa-fa7e2b9c411e\") " Apr 20 16:37:30.167786 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:30.167751 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/395c99df-a1dd-44da-a0aa-fa7e2b9c411e-kube-api-access-cbfj7" (OuterVolumeSpecName: "kube-api-access-cbfj7") pod "395c99df-a1dd-44da-a0aa-fa7e2b9c411e" (UID: "395c99df-a1dd-44da-a0aa-fa7e2b9c411e"). InnerVolumeSpecName "kube-api-access-cbfj7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:37:30.167786 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:30.167758 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/395c99df-a1dd-44da-a0aa-fa7e2b9c411e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "395c99df-a1dd-44da-a0aa-fa7e2b9c411e" (UID: "395c99df-a1dd-44da-a0aa-fa7e2b9c411e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:37:30.266607 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:30.266517 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cbfj7\" (UniqueName: \"kubernetes.io/projected/395c99df-a1dd-44da-a0aa-fa7e2b9c411e-kube-api-access-cbfj7\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:37:30.266607 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:30.266549 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/395c99df-a1dd-44da-a0aa-fa7e2b9c411e-bound-sa-token\") on node \"ip-10-0-130-72.ec2.internal\" DevicePath \"\"" Apr 20 16:37:30.738886 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:30.738849 2571 generic.go:358] "Generic (PLEG): container finished" podID="395c99df-a1dd-44da-a0aa-fa7e2b9c411e" containerID="41d680e43e056ce7dc6abcfc0b3a474110bd6b977821f554e87915691cbf110c" exitCode=0 Apr 20 16:37:30.739334 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:30.738893 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" Apr 20 16:37:30.739334 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:30.738936 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" event={"ID":"395c99df-a1dd-44da-a0aa-fa7e2b9c411e","Type":"ContainerDied","Data":"41d680e43e056ce7dc6abcfc0b3a474110bd6b977821f554e87915691cbf110c"} Apr 20 16:37:30.739334 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:30.738968 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-qgzqq" event={"ID":"395c99df-a1dd-44da-a0aa-fa7e2b9c411e","Type":"ContainerDied","Data":"01255f051356196dba9236b303fcc7bfe0472114ddd920c37765daecd95a17f2"} Apr 20 16:37:30.739334 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:30.738984 2571 scope.go:117] "RemoveContainer" containerID="41d680e43e056ce7dc6abcfc0b3a474110bd6b977821f554e87915691cbf110c" Apr 20 16:37:30.748611 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:30.748594 2571 scope.go:117] "RemoveContainer" containerID="41d680e43e056ce7dc6abcfc0b3a474110bd6b977821f554e87915691cbf110c" Apr 20 16:37:30.748856 ip-10-0-130-72 kubenswrapper[2571]: E0420 16:37:30.748838 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41d680e43e056ce7dc6abcfc0b3a474110bd6b977821f554e87915691cbf110c\": container with ID starting with 41d680e43e056ce7dc6abcfc0b3a474110bd6b977821f554e87915691cbf110c not found: ID does not exist" containerID="41d680e43e056ce7dc6abcfc0b3a474110bd6b977821f554e87915691cbf110c" Apr 20 16:37:30.748910 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:30.748865 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41d680e43e056ce7dc6abcfc0b3a474110bd6b977821f554e87915691cbf110c"} err="failed to get container status \"41d680e43e056ce7dc6abcfc0b3a474110bd6b977821f554e87915691cbf110c\": rpc error: code = NotFound desc = could not find container \"41d680e43e056ce7dc6abcfc0b3a474110bd6b977821f554e87915691cbf110c\": container with ID starting with 41d680e43e056ce7dc6abcfc0b3a474110bd6b977821f554e87915691cbf110c not found: ID does not exist" Apr 20 16:37:30.761775 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:30.761748 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-qgzqq"] Apr 20 16:37:30.765378 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:30.765358 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-qgzqq"] Apr 20 16:37:31.423933 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:31.423899 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="395c99df-a1dd-44da-a0aa-fa7e2b9c411e" path="/var/lib/kubelet/pods/395c99df-a1dd-44da-a0aa-fa7e2b9c411e/volumes" Apr 20 16:37:34.688120 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:34.688090 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-56b86c4857-c4g4f_e5d61ad4-023c-4886-84b0-9c6f100535de/authorino/0.log" Apr 20 16:37:38.876073 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:38.876047 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-6fb96bbb9-9hscp_9bc22a92-af8a-4a58-b0f1-f20b74e375ce/maas-api/0.log" Apr 20 16:37:39.247709 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:39.247631 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-59c64b9875-vdbl8_6011a6ba-7e4c-4e6b-be9c-31101383f90d/manager/0.log" Apr 20 16:37:40.378173 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:40.378141 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2_0f9c4347-3ae4-48ba-9142-9aa5ce8df783/util/0.log" Apr 20 16:37:40.384231 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:40.384205 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2_0f9c4347-3ae4-48ba-9142-9aa5ce8df783/pull/0.log" Apr 20 16:37:40.390246 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:40.390226 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2_0f9c4347-3ae4-48ba-9142-9aa5ce8df783/extract/0.log" Apr 20 16:37:40.510562 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:40.510533 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl_05e709f1-04d6-4de7-b38d-80398e1f8255/util/0.log" Apr 20 16:37:40.517548 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:40.517527 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl_05e709f1-04d6-4de7-b38d-80398e1f8255/pull/0.log" Apr 20 16:37:40.526473 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:40.526455 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl_05e709f1-04d6-4de7-b38d-80398e1f8255/extract/0.log" Apr 20 16:37:40.644156 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:40.644075 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr_860a5367-7536-4db2-969c-d079797aa3e3/util/0.log" Apr 20 16:37:40.651206 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:40.651185 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr_860a5367-7536-4db2-969c-d079797aa3e3/pull/0.log" Apr 20 16:37:40.658145 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:40.658127 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr_860a5367-7536-4db2-969c-d079797aa3e3/extract/0.log" Apr 20 16:37:40.773507 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:40.773478 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd_1be823da-8d73-4baf-8a47-9c0093d9e2bf/util/0.log" Apr 20 16:37:40.780333 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:40.780308 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd_1be823da-8d73-4baf-8a47-9c0093d9e2bf/pull/0.log" Apr 20 16:37:40.786866 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:40.786846 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd_1be823da-8d73-4baf-8a47-9c0093d9e2bf/extract/0.log" Apr 20 16:37:40.900233 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:40.900142 2571 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kuadrant-system_authorino-56b86c4857-c4g4f_e5d61ad4-023c-4886-84b0-9c6f100535de/authorino/0.log" Apr 20 16:37:41.260038 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:41.259961 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-5x88n_f1716bcd-859c-4a38-b028-6564913bbe3d/kuadrant-console-plugin/0.log" Apr 20 16:37:41.633826 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:41.633798 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-xqrl5_e9b54d27-da6a-4b7c-be5e-1ccc0a85d3b0/limitador/0.log" Apr 20 16:37:42.460604 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:42.460571 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7b9c9c888c-nlwxb_19c4538a-4617-43a8-ac59-14dda186c360/kube-auth-proxy/0.log" Apr 20 16:37:42.699100 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:42.699069 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-8dbb8fdd6-27rh2_542e12c0-876b-401f-b987-efaf65039572/router/0.log" Apr 20 16:37:47.536459 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.536426 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wg92x/must-gather-kzhpr"] Apr 20 16:37:47.536911 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.536894 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="395c99df-a1dd-44da-a0aa-fa7e2b9c411e" containerName="cert-manager-webhook" Apr 20 16:37:47.536911 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.536912 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="395c99df-a1dd-44da-a0aa-fa7e2b9c411e" containerName="cert-manager-webhook" Apr 20 16:37:47.537024 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.536947 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1" containerName="cert-manager-cainjector" Apr 20 16:37:47.537024 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.536955 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1" containerName="cert-manager-cainjector" Apr 20 16:37:47.537105 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.537027 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="00eb42b0-e5c0-4e3e-a2d2-2658145ac2a1" containerName="cert-manager-cainjector" Apr 20 16:37:47.537105 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.537043 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="395c99df-a1dd-44da-a0aa-fa7e2b9c411e" containerName="cert-manager-webhook" Apr 20 16:37:47.540542 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.540518 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wg92x/must-gather-kzhpr" Apr 20 16:37:47.543305 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.543281 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wg92x\"/\"openshift-service-ca.crt\"" Apr 20 16:37:47.544734 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.544712 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wg92x\"/\"kube-root-ca.crt\"" Apr 20 16:37:47.544884 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.544864 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wg92x\"/\"default-dockercfg-9rm4f\"" Apr 20 16:37:47.548629 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.548599 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wg92x/must-gather-kzhpr"] Apr 20 16:37:47.619839 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.619802 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g779m\" (UniqueName: \"kubernetes.io/projected/6874baad-7713-4d6e-a1df-46c4eb55504f-kube-api-access-g779m\") pod \"must-gather-kzhpr\" (UID: \"6874baad-7713-4d6e-a1df-46c4eb55504f\") " pod="openshift-must-gather-wg92x/must-gather-kzhpr" Apr 20 16:37:47.620022 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.619851 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6874baad-7713-4d6e-a1df-46c4eb55504f-must-gather-output\") pod \"must-gather-kzhpr\" (UID: \"6874baad-7713-4d6e-a1df-46c4eb55504f\") " pod="openshift-must-gather-wg92x/must-gather-kzhpr" Apr 20 16:37:47.720468 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.720436 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6874baad-7713-4d6e-a1df-46c4eb55504f-must-gather-output\") pod \"must-gather-kzhpr\" (UID: \"6874baad-7713-4d6e-a1df-46c4eb55504f\") " pod="openshift-must-gather-wg92x/must-gather-kzhpr" Apr 20 16:37:47.720681 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.720539 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g779m\" (UniqueName: \"kubernetes.io/projected/6874baad-7713-4d6e-a1df-46c4eb55504f-kube-api-access-g779m\") pod \"must-gather-kzhpr\" (UID: \"6874baad-7713-4d6e-a1df-46c4eb55504f\") " pod="openshift-must-gather-wg92x/must-gather-kzhpr" Apr 20 16:37:47.720789 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.720767 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6874baad-7713-4d6e-a1df-46c4eb55504f-must-gather-output\") pod \"must-gather-kzhpr\" (UID: \"6874baad-7713-4d6e-a1df-46c4eb55504f\") " pod="openshift-must-gather-wg92x/must-gather-kzhpr" Apr 20 16:37:47.730310 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.730282 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g779m\" (UniqueName: \"kubernetes.io/projected/6874baad-7713-4d6e-a1df-46c4eb55504f-kube-api-access-g779m\") pod \"must-gather-kzhpr\" (UID: \"6874baad-7713-4d6e-a1df-46c4eb55504f\") " pod="openshift-must-gather-wg92x/must-gather-kzhpr" Apr 20 16:37:47.852698 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.852616 2571 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-wg92x/must-gather-kzhpr" Apr 20 16:37:47.983764 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:47.983737 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wg92x/must-gather-kzhpr"] Apr 20 16:37:47.985003 ip-10-0-130-72 kubenswrapper[2571]: W0420 16:37:47.984975 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6874baad_7713_4d6e_a1df_46c4eb55504f.slice/crio-ee658cfd6f2ba22b39995eb0bb62207fb4adc0e5ea510682068edc6b3b9ce97d WatchSource:0}: Error finding container ee658cfd6f2ba22b39995eb0bb62207fb4adc0e5ea510682068edc6b3b9ce97d: Status 404 returned error can't find the container with id ee658cfd6f2ba22b39995eb0bb62207fb4adc0e5ea510682068edc6b3b9ce97d Apr 20 16:37:48.813077 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:48.813044 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wg92x/must-gather-kzhpr" event={"ID":"6874baad-7713-4d6e-a1df-46c4eb55504f","Type":"ContainerStarted","Data":"ee658cfd6f2ba22b39995eb0bb62207fb4adc0e5ea510682068edc6b3b9ce97d"} Apr 20 16:37:49.819611 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:49.819567 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wg92x/must-gather-kzhpr" event={"ID":"6874baad-7713-4d6e-a1df-46c4eb55504f","Type":"ContainerStarted","Data":"2d31f69bf34da2be97adb95c7f7e8bb839de75e4b263e691fd6c5d8318523b97"} Apr 20 16:37:49.819611 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:49.819620 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wg92x/must-gather-kzhpr" event={"ID":"6874baad-7713-4d6e-a1df-46c4eb55504f","Type":"ContainerStarted","Data":"6530ceeeac0cb6767a440971d83d4e35a80a59977910ddfcd8c9f0b8541b7009"} Apr 20 16:37:49.836711 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:49.836644 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wg92x/must-gather-kzhpr" podStartSLOduration=2.043253729 podStartE2EDuration="2.836624088s" podCreationTimestamp="2026-04-20 16:37:47 +0000 UTC" firstStartedPulling="2026-04-20 16:37:47.986708928 +0000 UTC m=+871.106885426" lastFinishedPulling="2026-04-20 16:37:48.780079284 +0000 UTC m=+871.900255785" observedRunningTime="2026-04-20 16:37:49.835380535 +0000 UTC m=+872.955557056" watchObservedRunningTime="2026-04-20 16:37:49.836624088 +0000 UTC m=+872.956800611" Apr 20 16:37:50.431563 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:50.431531 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-x9ldf_d1a2e82d-6f85-471f-a08a-55a114f41ec6/global-pull-secret-syncer/0.log" Apr 20 16:37:50.538570 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:50.538530 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nhdcv_de954181-20e6-42cb-ac40-d96f0331e7a1/konnectivity-agent/0.log" Apr 20 16:37:50.557057 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:50.557029 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-72.ec2.internal_34da2f3dcf91c4f795ad835e4ed72c8c/haproxy/0.log" Apr 20 16:37:54.665979 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:54.665941 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2_0f9c4347-3ae4-48ba-9142-9aa5ce8df783/extract/0.log" Apr 20 
16:37:54.691606 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:54.691545 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2_0f9c4347-3ae4-48ba-9142-9aa5ce8df783/util/0.log" Apr 20 16:37:54.721472 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:54.721431 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7597r4n2_0f9c4347-3ae4-48ba-9142-9aa5ce8df783/pull/0.log" Apr 20 16:37:54.752520 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:54.752423 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl_05e709f1-04d6-4de7-b38d-80398e1f8255/extract/0.log" Apr 20 16:37:54.777199 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:54.777137 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl_05e709f1-04d6-4de7-b38d-80398e1f8255/util/0.log" Apr 20 16:37:54.799570 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:54.799472 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0k9znl_05e709f1-04d6-4de7-b38d-80398e1f8255/pull/0.log" Apr 20 16:37:54.827874 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:54.827835 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr_860a5367-7536-4db2-969c-d079797aa3e3/extract/0.log" Apr 20 16:37:54.853219 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:54.853189 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr_860a5367-7536-4db2-969c-d079797aa3e3/util/0.log" Apr 20 16:37:54.878694 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:54.878665 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g5vfr_860a5367-7536-4db2-969c-d079797aa3e3/pull/0.log" Apr 20 16:37:54.908545 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:54.908496 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd_1be823da-8d73-4baf-8a47-9c0093d9e2bf/extract/0.log" Apr 20 16:37:54.932577 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:54.932484 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd_1be823da-8d73-4baf-8a47-9c0093d9e2bf/util/0.log" Apr 20 16:37:54.954771 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:54.954744 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fm6zd_1be823da-8d73-4baf-8a47-9c0093d9e2bf/pull/0.log" Apr 20 16:37:54.986075 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:54.986028 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-56b86c4857-c4g4f_e5d61ad4-023c-4886-84b0-9c6f100535de/authorino/0.log" Apr 20 16:37:55.066955 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:55.066926 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-5x88n_f1716bcd-859c-4a38-b028-6564913bbe3d/kuadrant-console-plugin/0.log" Apr 20 16:37:55.170181 
ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:55.170140 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-xqrl5_e9b54d27-da6a-4b7c-be5e-1ccc0a85d3b0/limitador/0.log" Apr 20 16:37:56.621048 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:56.621011 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_55422fe6-0d03-4e20-95b2-44fa62edfdba/alertmanager/0.log" Apr 20 16:37:56.654474 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:56.654437 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_55422fe6-0d03-4e20-95b2-44fa62edfdba/config-reloader/0.log" Apr 20 16:37:56.682807 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:56.682771 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_55422fe6-0d03-4e20-95b2-44fa62edfdba/kube-rbac-proxy-web/0.log" Apr 20 16:37:56.708411 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:56.708377 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_55422fe6-0d03-4e20-95b2-44fa62edfdba/kube-rbac-proxy/0.log" Apr 20 16:37:56.730848 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:56.730810 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_55422fe6-0d03-4e20-95b2-44fa62edfdba/kube-rbac-proxy-metric/0.log" Apr 20 16:37:56.757074 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:56.756966 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_55422fe6-0d03-4e20-95b2-44fa62edfdba/prom-label-proxy/0.log" Apr 20 16:37:56.780961 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:56.780927 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_55422fe6-0d03-4e20-95b2-44fa62edfdba/init-config-reloader/0.log" Apr 20 16:37:56.816555 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:56.816521 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-rqblb_b2a0a168-de3a-4df1-89e1-c8d831ef5ada/cluster-monitoring-operator/0.log" Apr 20 16:37:56.844760 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:56.844738 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-wgtbk_455b95ec-d6c4-4986-99be-0bf8c2e95935/kube-state-metrics/0.log" Apr 20 16:37:56.865777 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:56.865752 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-wgtbk_455b95ec-d6c4-4986-99be-0bf8c2e95935/kube-rbac-proxy-main/0.log" Apr 20 16:37:56.888077 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:56.887998 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-wgtbk_455b95ec-d6c4-4986-99be-0bf8c2e95935/kube-rbac-proxy-self/0.log" Apr 20 16:37:56.921610 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:56.921580 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-78d6dfc49-4vn24_259deb39-6cd0-4be8-bc17-e4bd29b87f5d/metrics-server/0.log" Apr 20 16:37:57.066857 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:57.066823 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kh54r_ecdbf87a-e49b-4914-9b96-abd064658c90/node-exporter/0.log" Apr 20 
16:37:57.094831 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:57.094800 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kh54r_ecdbf87a-e49b-4914-9b96-abd064658c90/kube-rbac-proxy/0.log" Apr 20 16:37:57.119667 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:57.119631 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kh54r_ecdbf87a-e49b-4914-9b96-abd064658c90/init-textfile/0.log" Apr 20 16:37:57.495393 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:57.495341 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-czmjr_598a6714-5291-4598-8125-ab116843849d/prometheus-operator/0.log" Apr 20 16:37:57.518035 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:57.517966 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-czmjr_598a6714-5291-4598-8125-ab116843849d/kube-rbac-proxy/0.log" Apr 20 16:37:57.547294 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:57.547264 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-6k8cm_9ef807a0-8b32-44f3-97d4-07bfc892741a/prometheus-operator-admission-webhook/0.log" Apr 20 16:37:57.579131 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:57.579104 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-55569768f8-sx8nf_2d13ef38-5530-4514-98a7-486963962908/telemeter-client/0.log" Apr 20 16:37:57.602258 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:57.602225 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-55569768f8-sx8nf_2d13ef38-5530-4514-98a7-486963962908/reload/0.log" Apr 20 16:37:57.624427 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:57.624399 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-55569768f8-sx8nf_2d13ef38-5530-4514-98a7-486963962908/kube-rbac-proxy/0.log" Apr 20 16:37:59.097597 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.097559 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b"] Apr 20 16:37:59.102733 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.102703 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.108843 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.108804 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b"] Apr 20 16:37:59.133772 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.133731 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzbk5\" (UniqueName: \"kubernetes.io/projected/ba6c0906-9b25-4b81-8bc5-4269facdfd2c-kube-api-access-wzbk5\") pod \"perf-node-gather-daemonset-7dh2b\" (UID: \"ba6c0906-9b25-4b81-8bc5-4269facdfd2c\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.133936 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.133782 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ba6c0906-9b25-4b81-8bc5-4269facdfd2c-proc\") pod \"perf-node-gather-daemonset-7dh2b\" (UID: \"ba6c0906-9b25-4b81-8bc5-4269facdfd2c\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.133936 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.133871 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba6c0906-9b25-4b81-8bc5-4269facdfd2c-lib-modules\") pod \"perf-node-gather-daemonset-7dh2b\" (UID: \"ba6c0906-9b25-4b81-8bc5-4269facdfd2c\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.133936 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.133924 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba6c0906-9b25-4b81-8bc5-4269facdfd2c-sys\") pod \"perf-node-gather-daemonset-7dh2b\" (UID: \"ba6c0906-9b25-4b81-8bc5-4269facdfd2c\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.134122 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.133944 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ba6c0906-9b25-4b81-8bc5-4269facdfd2c-podres\") pod \"perf-node-gather-daemonset-7dh2b\" (UID: \"ba6c0906-9b25-4b81-8bc5-4269facdfd2c\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.235690 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.235396 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba6c0906-9b25-4b81-8bc5-4269facdfd2c-lib-modules\") pod \"perf-node-gather-daemonset-7dh2b\" (UID: \"ba6c0906-9b25-4b81-8bc5-4269facdfd2c\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.235690 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.235447 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba6c0906-9b25-4b81-8bc5-4269facdfd2c-sys\") pod \"perf-node-gather-daemonset-7dh2b\" (UID: \"ba6c0906-9b25-4b81-8bc5-4269facdfd2c\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.235690 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.235465 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/ba6c0906-9b25-4b81-8bc5-4269facdfd2c-podres\") pod \"perf-node-gather-daemonset-7dh2b\" (UID: \"ba6c0906-9b25-4b81-8bc5-4269facdfd2c\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.235690 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.235510 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzbk5\" (UniqueName: \"kubernetes.io/projected/ba6c0906-9b25-4b81-8bc5-4269facdfd2c-kube-api-access-wzbk5\") pod \"perf-node-gather-daemonset-7dh2b\" (UID: \"ba6c0906-9b25-4b81-8bc5-4269facdfd2c\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.235690 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.235526 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ba6c0906-9b25-4b81-8bc5-4269facdfd2c-proc\") pod \"perf-node-gather-daemonset-7dh2b\" (UID: \"ba6c0906-9b25-4b81-8bc5-4269facdfd2c\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.235690 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.235557 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba6c0906-9b25-4b81-8bc5-4269facdfd2c-lib-modules\") pod \"perf-node-gather-daemonset-7dh2b\" (UID: \"ba6c0906-9b25-4b81-8bc5-4269facdfd2c\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.235690 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.235569 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba6c0906-9b25-4b81-8bc5-4269facdfd2c-sys\") pod \"perf-node-gather-daemonset-7dh2b\" (UID: \"ba6c0906-9b25-4b81-8bc5-4269facdfd2c\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.235690 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.235592 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ba6c0906-9b25-4b81-8bc5-4269facdfd2c-proc\") pod \"perf-node-gather-daemonset-7dh2b\" (UID: \"ba6c0906-9b25-4b81-8bc5-4269facdfd2c\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.235690 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.235662 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ba6c0906-9b25-4b81-8bc5-4269facdfd2c-podres\") pod \"perf-node-gather-daemonset-7dh2b\" (UID: \"ba6c0906-9b25-4b81-8bc5-4269facdfd2c\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.245203 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.245152 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzbk5\" (UniqueName: \"kubernetes.io/projected/ba6c0906-9b25-4b81-8bc5-4269facdfd2c-kube-api-access-wzbk5\") pod \"perf-node-gather-daemonset-7dh2b\" (UID: \"ba6c0906-9b25-4b81-8bc5-4269facdfd2c\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.442467 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.442437 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.626247 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.626202 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b"] Apr 20 16:37:59.851522 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.851491 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-c886g_671503df-d3e2-439c-8805-f38f49057176/download-server/0.log" Apr 20 16:37:59.877097 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.877064 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" event={"ID":"ba6c0906-9b25-4b81-8bc5-4269facdfd2c","Type":"ContainerStarted","Data":"89027bf8dda1001bf2ed2d3bb255b373cc062a42500b069343c5c1665ea7e1b5"} Apr 20 16:37:59.877354 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.877340 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:37:59.877485 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:37:59.877470 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" event={"ID":"ba6c0906-9b25-4b81-8bc5-4269facdfd2c","Type":"ContainerStarted","Data":"83ff255773ad5f1a4535211892c083ba0f6ae88503e618e297fec299dd7b8f61"} Apr 20 16:38:00.395457 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:00.395423 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-q5q8v_79391e4d-48bf-4541-a5e5-ad615c67502e/volume-data-source-validator/0.log" Apr 20 16:38:01.227416 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:01.227386 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mmqjp_1d3f1231-a687-4b5c-b5b5-d078c34b831b/dns/0.log" Apr 20 16:38:01.252272 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:01.252242 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mmqjp_1d3f1231-a687-4b5c-b5b5-d078c34b831b/kube-rbac-proxy/0.log" Apr 20 16:38:01.322422 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:01.322395 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hkvft_2e64cb9b-1c5d-4de7-9ce4-b673e8576d87/dns-node-resolver/0.log" Apr 20 16:38:01.844868 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:01.844838 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cklt7_842542e9-94b5-494f-8110-018afb1c0a5f/node-ca/0.log" Apr 20 16:38:02.841204 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:02.841129 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7b9c9c888c-nlwxb_19c4538a-4617-43a8-ac59-14dda186c360/kube-auth-proxy/0.log" Apr 20 16:38:02.893026 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:02.892932 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-8dbb8fdd6-27rh2_542e12c0-876b-401f-b987-efaf65039572/router/0.log" Apr 20 16:38:03.413268 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:03.413220 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-np9bb_6da97721-f2ed-4061-b7a4-2577d2b33d11/serve-healthcheck-canary/0.log" Apr 20 16:38:04.184273 ip-10-0-130-72 kubenswrapper[2571]: I0420 
16:38:04.184234 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d4rs6_cf89efb4-eb7e-40aa-b0ad-4e4a47685ece/kube-rbac-proxy/0.log" Apr 20 16:38:04.229035 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:04.229001 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d4rs6_cf89efb4-eb7e-40aa-b0ad-4e4a47685ece/exporter/0.log" Apr 20 16:38:04.272830 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:04.272799 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d4rs6_cf89efb4-eb7e-40aa-b0ad-4e4a47685ece/extractor/0.log" Apr 20 16:38:05.893271 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:05.893246 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" Apr 20 16:38:05.911136 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:05.911086 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-7dh2b" podStartSLOduration=6.911071951 podStartE2EDuration="6.911071951s" podCreationTimestamp="2026-04-20 16:37:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:37:59.89729081 +0000 UTC m=+883.017467355" watchObservedRunningTime="2026-04-20 16:38:05.911071951 +0000 UTC m=+889.031248471" Apr 20 16:38:06.161504 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:06.161389 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-6fb96bbb9-9hscp_9bc22a92-af8a-4a58-b0f1-f20b74e375ce/maas-api/0.log" Apr 20 16:38:06.259308 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:06.259283 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-59c64b9875-vdbl8_6011a6ba-7e4c-4e6b-be9c-31101383f90d/manager/0.log" Apr 20 16:38:07.703472 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:07.703432 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-d6fdb785c-jbqsb_300ed2fb-cfbe-42fd-907a-bc5cfd2dff10/manager/0.log" Apr 20 16:38:12.574139 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:12.574104 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-26lqh_164b3066-3171-4ff5-b023-f49f644b1d28/kube-storage-version-migrator-operator/1.log" Apr 20 16:38:12.575198 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:12.575175 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-26lqh_164b3066-3171-4ff5-b023-f49f644b1d28/kube-storage-version-migrator-operator/0.log" Apr 20 16:38:13.627095 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:13.627062 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9cgxf_b1f64d16-8a19-4426-9f62-eaf3e9325026/kube-multus/0.log" Apr 20 16:38:13.825222 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:13.825111 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jl5fs_975fe906-de1d-4b78-9555-abc5fd12991c/kube-multus-additional-cni-plugins/0.log" Apr 20 16:38:13.847447 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:13.847414 2571 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jl5fs_975fe906-de1d-4b78-9555-abc5fd12991c/egress-router-binary-copy/0.log" Apr 20 16:38:13.869442 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:13.869419 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jl5fs_975fe906-de1d-4b78-9555-abc5fd12991c/cni-plugins/0.log" Apr 20 16:38:13.890948 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:13.890881 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jl5fs_975fe906-de1d-4b78-9555-abc5fd12991c/bond-cni-plugin/0.log" Apr 20 16:38:13.912704 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:13.912680 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jl5fs_975fe906-de1d-4b78-9555-abc5fd12991c/routeoverride-cni/0.log" Apr 20 16:38:13.935781 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:13.935757 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jl5fs_975fe906-de1d-4b78-9555-abc5fd12991c/whereabouts-cni-bincopy/0.log" Apr 20 16:38:13.957338 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:13.957315 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jl5fs_975fe906-de1d-4b78-9555-abc5fd12991c/whereabouts-cni/0.log" Apr 20 16:38:14.245282 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:14.245211 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rxwd9_ff512ace-f73c-4265-890e-b43c9ecc782d/network-metrics-daemon/0.log" Apr 20 16:38:14.267247 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:14.267203 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rxwd9_ff512ace-f73c-4265-890e-b43c9ecc782d/kube-rbac-proxy/0.log" Apr 20 16:38:15.144652 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:15.144623 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-45msc_a0a53203-f6d4-43f0-a422-5ae876b369f1/ovn-controller/0.log" Apr 20 16:38:15.165066 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:15.165037 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-45msc_a0a53203-f6d4-43f0-a422-5ae876b369f1/ovn-acl-logging/0.log" Apr 20 16:38:15.169317 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:15.169290 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-45msc_a0a53203-f6d4-43f0-a422-5ae876b369f1/ovn-acl-logging/1.log" Apr 20 16:38:15.188502 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:15.188473 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-45msc_a0a53203-f6d4-43f0-a422-5ae876b369f1/kube-rbac-proxy-node/0.log" Apr 20 16:38:15.210422 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:15.210388 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-45msc_a0a53203-f6d4-43f0-a422-5ae876b369f1/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 16:38:15.235102 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:15.235076 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-45msc_a0a53203-f6d4-43f0-a422-5ae876b369f1/northd/0.log" Apr 20 16:38:15.255862 ip-10-0-130-72 kubenswrapper[2571]: I0420 
16:38:15.255821 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-45msc_a0a53203-f6d4-43f0-a422-5ae876b369f1/nbdb/0.log" Apr 20 16:38:15.276739 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:15.276718 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-45msc_a0a53203-f6d4-43f0-a422-5ae876b369f1/sbdb/0.log" Apr 20 16:38:15.407690 ip-10-0-130-72 kubenswrapper[2571]: I0420 16:38:15.407611 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-45msc_a0a53203-f6d4-43f0-a422-5ae876b369f1/ovnkube-controller/0.log"