Apr 16 23:50:19.095843 ip-10-0-134-103 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 23:50:19.590116 ip-10-0-134-103 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 23:50:19.590116 ip-10-0-134-103 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 23:50:19.590116 ip-10-0-134-103 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 23:50:19.590116 ip-10-0-134-103 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 23:50:19.590116 ip-10-0-134-103 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
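The deprecation warnings above all point at the same migration: move these command-line flags into the file passed via --config. A minimal sketch of the equivalent KubeletConfiguration fields (field names are from the upstream kubelet.config.k8s.io/v1beta1 API; the values shown are illustrative, not taken from this node):

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (illustrative values)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration has no config-file equivalent;
# the warning says to use eviction thresholds instead (illustrative value)
evictionHard:
  memory.available: 100Mi
```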
Apr 16 23:50:19.593652 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.593524    2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 23:50:19.595757 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595743    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:50:19.595757 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595757    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595761    2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595764    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595767    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595772    2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595775    2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595779    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595782    2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595784    2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595788    2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595790    2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595793    2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595797    2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595801    2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595804    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595807    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595810    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595812    2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595815    2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:50:19.595815 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595818    2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595821    2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595824    2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595827    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595830    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595832    2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595835    2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595837    2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595840    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595843    2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595846    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595848    2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595851    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595853    2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595856    2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595858    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595861    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595863    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595866    2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595868    2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:50:19.596317 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595871    2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595873    2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595876    2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595879    2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595882    2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595884    2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595887    2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595889    2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595892    2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595894    2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595897    2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595900    2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595902    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595905    2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595908    2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595913    2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595916    2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595918    2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595921    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595924    2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:50:19.597132 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595927    2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595929    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595932    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595934    2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595937    2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595939    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595942    2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595944    2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595947    2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595949    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595952    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595955    2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595957    2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595959    2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595962    2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595965    2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595968    2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595972    2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595974    2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595977    2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:50:19.597644 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595980    2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:50:19.598163 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595982    2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:50:19.598163 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595985    2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:50:19.598163 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595987    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:50:19.598163 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595990    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:50:19.598163 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.595992    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:50:19.599655 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599638    2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:50:19.599655 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599652    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:50:19.599655 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599655    2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:50:19.599655 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599659    2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599662    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599665    2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599668    2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599671    2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599674    2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599677    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599679    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599682    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599685    2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599688    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599691    2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599693    2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599696    2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599698    2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599701    2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599704    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599706    2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599709    2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599712    2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:50:19.599843 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599714    2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599717    2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599720    2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599722    2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599725    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599728    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599734    2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599737    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599740    2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599742    2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599746    2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599748    2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599751    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599754    2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599757    2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599760    2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599763    2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599766    2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599768    2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599771    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:50:19.600373 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599774    2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599776    2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599779    2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599781    2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599784    2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599787    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599789    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599793    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599796    2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599799    2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599801    2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599804    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599806    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599810    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599812    2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599815    2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599818    2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599820    2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599823    2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599825    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:50:19.600930 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599828    2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599830    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599833    2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599835    2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599838    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599841    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599846    2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599850    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599853    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599856    2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599859    2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599862    2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599865    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599868    2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599871    2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599875    2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599878    2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599882    2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599885    2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:50:19.601423 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599887    2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599890    2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599893    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.599897    2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
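The "unrecognized feature gate" warnings above come from OpenShift-specific gates (ManagedBootImages, GatewayAPIController, and so on) being rendered into the kubelet's configuration; the upstream kubelet only knows its own Kubernetes gates, so it warns about and ignores the rest. Gates reach the kubelet through the featureGates map of its config file, along the lines of this minimal sketch (kubelet.config.k8s.io/v1beta1 field name; the two entries are picked from the log above for illustration):

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
featureGates:
  KMSv1: true             # deprecated upstream gate -> the feature_gate.go:349 warning
  ManagedBootImages: true # OpenShift-only gate -> an "unrecognized feature gate" warning
```

These warnings are therefore noise from the kubelet's perspective; the OpenShift operators that consume the same gate list still honor them.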
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.599961    2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.599967    2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.599973    2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.599978    2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.599983    2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.599988    2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.599995    2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600002    2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600007    2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600010    2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600014    2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600017    2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600020    2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600023    2578 flags.go:64] FLAG: --cgroup-root=""
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600027    2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600030    2578 flags.go:64] FLAG: --client-ca-file=""
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600033    2578 flags.go:64] FLAG: --cloud-config=""
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600036    2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600039    2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600044    2578 flags.go:64] FLAG: --cluster-domain=""
Apr 16 23:50:19.601906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600047    2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600050    2578 flags.go:64] FLAG: --config-dir=""
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600053    2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600056    2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600060    2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600065    2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600068    2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600071    2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600074    2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600077    2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600081    2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600084    2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600087    2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600091    2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600094    2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600097    2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600100    2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600103    2578 flags.go:64] FLAG: --enable-server="true"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600106    2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600111    2578 flags.go:64] FLAG: --event-burst="100"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600114    2578 flags.go:64] FLAG: --event-qps="50"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600117    2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600120    2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600124    2578 flags.go:64] FLAG: --eviction-hard=""
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600128    2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 23:50:19.602477 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600130    2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600134    2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600138    2578 flags.go:64] FLAG: --eviction-soft=""
Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600141    2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600144    2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600147    2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600150    2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600153    2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600157    2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600160    2578 flags.go:64] FLAG: --feature-gates=""
Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600163    2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600166    2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600170    2578 
flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600174 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600177 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600180 2578 flags.go:64] FLAG: --help="false" Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600183 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-134-103.ec2.internal" Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600186 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600189 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600192 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600196 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600199 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600202 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 23:50:19.603174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600205 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600208 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600211 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 23:50:19.603795 ip-10-0-134-103 
kubenswrapper[2578]: I0416 23:50:19.600213 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600217 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600219 2578 flags.go:64] FLAG: --kube-reserved="" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600223 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600226 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600229 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600231 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600234 2578 flags.go:64] FLAG: --lock-file="" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600240 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600243 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600246 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600251 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600255 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600258 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600261 2578 flags.go:64] FLAG: 
--logging-format="text" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600264 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600267 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600270 2578 flags.go:64] FLAG: --manifest-url="" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600273 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600277 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600281 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600285 2578 flags.go:64] FLAG: --max-pods="110" Apr 16 23:50:19.603795 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600289 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600291 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600294 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600297 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600300 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600303 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600306 2578 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600313 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600316 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600319 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600322 2578 flags.go:64] FLAG: --pod-cidr="" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600325 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600330 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600335 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600338 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600341 2578 flags.go:64] FLAG: --port="10250" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600344 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600347 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c4c0d3d764335107" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600350 2578 flags.go:64] FLAG: --qos-reserved="" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600354 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600357 
2578 flags.go:64] FLAG: --register-node="true" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600360 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600364 2578 flags.go:64] FLAG: --register-with-taints="" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600367 2578 flags.go:64] FLAG: --registry-burst="10" Apr 16 23:50:19.604402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600370 2578 flags.go:64] FLAG: --registry-qps="5" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600373 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600376 2578 flags.go:64] FLAG: --reserved-memory="" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600383 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600386 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600389 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600393 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600396 2578 flags.go:64] FLAG: --runonce="false" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600399 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600403 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600405 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 
23:50:19.600408 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600411 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600415 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600418 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600421 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600424 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600428 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600430 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600433 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600436 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600440 2578 flags.go:64] FLAG: --system-cgroups="" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600443 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600449 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600452 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 16 23:50:19.605003 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600454 2578 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600458 2578 flags.go:64] FLAG: --tls-min-version="" Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600462 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600465 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600469 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600471 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600474 2578 flags.go:64] FLAG: --v="2" Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600478 2578 flags.go:64] FLAG: --version="false" Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600483 2578 flags.go:64] FLAG: --vmodule="" Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600487 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600490 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600589 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600593 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600597 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600600 2578 feature_gate.go:328] unrecognized 
feature gate: NoRegistryClusterOperations Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600603 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600606 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600608 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600611 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600614 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600616 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600619 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 23:50:19.605634 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600621 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600624 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600627 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600629 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600632 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 23:50:19.606166 
ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600635 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600638 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600641 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600644 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600646 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600649 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600651 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600656 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600659 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600661 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600664 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600667 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600669 2578 feature_gate.go:328] unrecognized 
feature gate: NetworkDiagnosticsConfig Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600672 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600674 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 23:50:19.606166 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600677 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600680 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600683 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600685 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600688 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600690 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600693 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600696 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600698 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600701 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 
23:50:19.600703 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600706 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600709 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600711 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600714 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600716 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600719 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600721 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600726 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 23:50:19.606733 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600730 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600733 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600736 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600740 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600743 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600747 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600750 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600753 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600756 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600759 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600762 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600765 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600767 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600769 2578 feature_gate.go:328] unrecognized 
feature gate: SignatureStores Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600772 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600775 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600777 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600780 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600782 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600785 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 23:50:19.607199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600788 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 23:50:19.607729 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600790 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 23:50:19.607729 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600793 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 23:50:19.607729 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600795 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 23:50:19.607729 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600798 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 23:50:19.607729 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600801 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 23:50:19.607729 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600803 2578 
feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:50:19.607729 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600806 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:50:19.607729 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600808 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:50:19.607729 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600811 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:50:19.607729 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600813 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:50:19.607729 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600817 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:50:19.607729 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600821 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:50:19.607729 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600824 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:50:19.607729 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600827 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:50:19.607729 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.600829 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:50:19.608095 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.600838 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 23:50:19.608095 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.606884 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 23:50:19.608095 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.606899 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 23:50:19.608095 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.606946 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:50:19.608095 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.606951 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:50:19.608095 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.606955 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:50:19.608095 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.606959 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:50:19.608095 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.606962 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:50:19.608095 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.606965 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:50:19.608095 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.606968 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:50:19.608095 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.606971 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:50:19.608095 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.606973 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:50:19.608095 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.606977 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:50:19.608095 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.606981 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:50:19.608095 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.606984 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:50:19.608095 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.606986 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.606990 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.606993 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.606995 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.606998 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607000 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607003 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607006 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607008 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607011 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607014 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607016 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607019 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607022 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607024 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607027 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607030 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607032 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607035 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607038 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:50:19.608486 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607041 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607043 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607046 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607048 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607051 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607054 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607056 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607059 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607061 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607064 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607066 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607069 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607071 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607074 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607077 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607081 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607085 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607088 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607091 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:50:19.608990 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607093 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607096 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607099 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607101 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607105 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607107 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607110 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607113 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607115 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607118 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607120 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607123 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607126 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607129 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607132 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607135 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607137 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607140 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607143 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607145 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:50:19.609442 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607148 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:50:19.609979 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607150 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:50:19.609979 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607153 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:50:19.609979 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607155 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:50:19.609979 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607158 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:50:19.609979 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607161 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:50:19.609979 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607163 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:50:19.609979 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607165 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:50:19.609979 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607168 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:50:19.609979 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607171 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:50:19.609979 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607173 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:50:19.609979 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607176 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:50:19.609979 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607178 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:50:19.609979 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607181 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:50:19.609979 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607183 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:50:19.609979 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.607189 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607291 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607296 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607299 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607302 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607305 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607308 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607311 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607314 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607317 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607321 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607323 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607326 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607329 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607331 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607333 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607336 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607338 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607341 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607344 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:50:19.610344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607346 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607349 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607352 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607354 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607357 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607359 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607362 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607365 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607367 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607370 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607372 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607375 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607377 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607380 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607382 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607385 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607388 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607392 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607395 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:50:19.610847 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607398 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607401 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607404 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607406 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607409 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607411 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607414 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607416 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607419 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607421 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607424 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607426 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607429 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607432 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607434 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607437 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607439 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607442 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607444 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607447 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:50:19.611315 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607449 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607452 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607454 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607457 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607459 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607462 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607464 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607467 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607470 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607472 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607474 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607477 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607479 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607482 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607484 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607487 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607490 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607492 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607495 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:50:19.611872 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607497 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:50:19.612324 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607500 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:50:19.612324 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607502 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:50:19.612324 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607505 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:50:19.612324 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607508 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:50:19.612324 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607510 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:50:19.612324 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607513 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:50:19.612324 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607516 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:50:19.612324 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:19.607519 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:50:19.612324 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.607524 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 23:50:19.612324 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.608072 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 23:50:19.612324 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.611589 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 23:50:19.612699 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.612687 2578 server.go:1019] "Starting client certificate rotation"
Apr 16 23:50:19.612796 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.612782 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 23:50:19.612831 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.612814 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 23:50:19.643246 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.643230 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 23:50:19.645902 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.645884 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 23:50:19.662685 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.662663 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 16 23:50:19.669185 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.669170 2578 log.go:25] "Validated CRI v1 image API"
Apr 16 23:50:19.670312 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.670280 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 23:50:19.674691 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.674674 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 23:50:19.674825 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.674802 2578 fs.go:135] Filesystem UUIDs: map[1360b88e-3b10-40d3-9f06-6a08879cbe5a:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a0c61b6b-fc74-425f-b25e-aa858ea8150b:/dev/nvme0n1p4]
Apr 16 23:50:19.674877 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.674827 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 23:50:19.680663 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.680561 2578 manager.go:217] Machine: {Timestamp:2026-04-16 23:50:19.678433899 +0000 UTC m=+0.453458775 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3114991 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec285899523f17c5b95daa6f1257c3b8 SystemUUID:ec285899-523f-17c5-b95d-aa6f1257c3b8 BootID:33840d59-c0b6-4bfb-befb-c9dc82a11b31 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:65:dc:fa:67:1d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:65:dc:fa:67:1d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2e:d1:65:60:88:7f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 23:50:19.680663 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.680659 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 23:50:19.680776 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.680729 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 23:50:19.681892 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.681871 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 23:50:19.682027 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.681894 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-103.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 23:50:19.682075 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.682035 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 23:50:19.682075 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.682044 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 23:50:19.682075 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.682061
2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 23:50:19.683128 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.683116 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 23:50:19.684724 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.684713 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 16 23:50:19.684825 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.684816 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 23:50:19.687174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.687164 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 16 23:50:19.687753 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.687742 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 23:50:19.687790 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.687764 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 23:50:19.687790 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.687774 2578 kubelet.go:397] "Adding apiserver pod source" Apr 16 23:50:19.687790 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.687782 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 23:50:19.688520 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.688501 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sxg4f" Apr 16 23:50:19.688998 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.688978 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 23:50:19.689054 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.689041 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 23:50:19.693714 ip-10-0-134-103 
kubenswrapper[2578]: I0416 23:50:19.693694 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 23:50:19.695335 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.695322 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 23:50:19.695979 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.695959 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sxg4f" Apr 16 23:50:19.697165 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.697153 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 23:50:19.697207 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.697171 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 23:50:19.697207 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.697178 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 23:50:19.697207 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.697183 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 23:50:19.697207 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.697189 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 23:50:19.697207 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.697195 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 23:50:19.697207 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.697201 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 23:50:19.697207 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.697206 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 23:50:19.697412 ip-10-0-134-103 
kubenswrapper[2578]: I0416 23:50:19.697213 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 23:50:19.697412 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.697220 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 23:50:19.697412 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.697228 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 23:50:19.697412 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.697236 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 23:50:19.697823 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:19.697802 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 23:50:19.697861 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:19.697846 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-103.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 23:50:19.698156 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.698147 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 23:50:19.698189 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.698158 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 23:50:19.702031 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.702016 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 23:50:19.702075 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.702057 2578 server.go:1295] "Started kubelet" Apr 16 23:50:19.702171 
ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.702145 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 23:50:19.702222 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.702159 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 23:50:19.702222 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.702221 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 23:50:19.702967 ip-10-0-134-103 systemd[1]: Started Kubernetes Kubelet. Apr 16 23:50:19.707303 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.707282 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 23:50:19.709446 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.709426 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 16 23:50:19.710593 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.710574 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-103.ec2.internal" not found Apr 16 23:50:19.713235 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.713089 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 23:50:19.714490 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.714186 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 23:50:19.716353 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.716232 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 23:50:19.716353 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.716264 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 23:50:19.716459 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.716384 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 23:50:19.716459 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.716395 2578 factory.go:55] Registering 
systemd factory Apr 16 23:50:19.716459 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.716412 2578 factory.go:223] Registration of the systemd container factory successfully Apr 16 23:50:19.716710 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.716450 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 16 23:50:19.716710 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.716654 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 16 23:50:19.716833 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.716728 2578 factory.go:153] Registering CRI-O factory Apr 16 23:50:19.716833 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.716739 2578 factory.go:223] Registration of the crio container factory successfully Apr 16 23:50:19.716833 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.716787 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 23:50:19.716833 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.716809 2578 factory.go:103] Registering Raw factory Apr 16 23:50:19.717026 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.717007 2578 manager.go:1196] Started watching for new ooms in manager Apr 16 23:50:19.717124 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:19.717044 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-103.ec2.internal\" not found" Apr 16 23:50:19.717124 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:19.717106 2578 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 23:50:19.717854 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.717836 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:50:19.718078 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.718064 2578 manager.go:319] Starting recovery of all containers Apr 16 23:50:19.720878 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:19.720855 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-103.ec2.internal\" not found" node="ip-10-0-134-103.ec2.internal" Apr 16 23:50:19.725742 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.725725 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-103.ec2.internal" not found Apr 16 23:50:19.727712 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.727697 2578 manager.go:324] Recovery completed Apr 16 23:50:19.731436 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.731424 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:19.734876 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.734855 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:19.734941 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.734892 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:19.734941 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.734908 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:19.735396 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.735379 2578 cpu_manager.go:222] "Starting CPU manager" 
policy="none" Apr 16 23:50:19.735471 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.735397 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 23:50:19.735471 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.735415 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 16 23:50:19.738455 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.738443 2578 policy_none.go:49] "None policy: Start" Apr 16 23:50:19.738493 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.738459 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 23:50:19.738493 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.738469 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 16 23:50:19.794605 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.778117 2578 manager.go:341] "Starting Device Plugin manager" Apr 16 23:50:19.794605 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:19.778143 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 23:50:19.794605 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.778155 2578 server.go:85] "Starting device plugin registration server" Apr 16 23:50:19.794605 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.778352 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 23:50:19.794605 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.778361 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 23:50:19.794605 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.778634 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 23:50:19.794605 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.778700 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 23:50:19.794605 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.778709 2578 
plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 23:50:19.794605 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:19.779190 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 23:50:19.794605 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:19.779226 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-103.ec2.internal\" not found" Apr 16 23:50:19.794605 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.786467 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-103.ec2.internal" not found Apr 16 23:50:19.811384 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.811362 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 23:50:19.812532 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.812509 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 23:50:19.812532 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.812533 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 23:50:19.812685 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.812561 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 23:50:19.812685 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.812570 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 23:50:19.812685 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:19.812628 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 23:50:19.815980 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.815958 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:50:19.878990 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.878904 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:19.879859 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.879843 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:19.879956 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.879876 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:19.879956 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.879890 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:19.879956 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.879919 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-103.ec2.internal" Apr 16 23:50:19.888998 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.888975 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-103.ec2.internal" Apr 16 23:50:19.889103 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:19.889001 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-103.ec2.internal\": node \"ip-10-0-134-103.ec2.internal\" not found" Apr 16 
23:50:19.901614 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:19.901592 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-103.ec2.internal\" not found" Apr 16 23:50:19.912830 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.912809 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-103.ec2.internal"] Apr 16 23:50:19.912877 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.912869 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:19.914407 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.914392 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:19.914476 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.914420 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:19.914476 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.914430 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:19.916388 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.916373 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:19.916528 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.916514 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal" Apr 16 23:50:19.916585 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.916557 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:19.917059 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.917045 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:19.917126 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.917061 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:19.917126 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.917070 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:19.917126 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.917083 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:19.917126 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.917087 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:19.917126 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.917095 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:19.919029 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.919013 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-103.ec2.internal" Apr 16 23:50:19.919076 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.919043 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:19.919692 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.919674 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:19.919766 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.919707 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:19.919766 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:19.919722 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:19.940034 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:19.940017 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-103.ec2.internal\" not found" node="ip-10-0-134-103.ec2.internal" Apr 16 23:50:19.944150 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:19.944135 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-103.ec2.internal\" not found" node="ip-10-0-134-103.ec2.internal" Apr 16 23:50:20.001931 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:20.001911 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-103.ec2.internal\" not found" Apr 16 23:50:20.017169 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.017147 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8a7393f3c62b6bb14622634c9db6c5c7-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal\" (UID: \"8a7393f3c62b6bb14622634c9db6c5c7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal" Apr 16 23:50:20.017257 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.017196 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a7393f3c62b6bb14622634c9db6c5c7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal\" (UID: \"8a7393f3c62b6bb14622634c9db6c5c7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal" Apr 16 23:50:20.017257 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.017235 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/136452127e3fd9ba2b831a29f8633a79-config\") pod \"kube-apiserver-proxy-ip-10-0-134-103.ec2.internal\" (UID: \"136452127e3fd9ba2b831a29f8633a79\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-103.ec2.internal" Apr 16 23:50:20.102749 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:20.102731 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-103.ec2.internal\" not found" Apr 16 23:50:20.118012 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.117988 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8a7393f3c62b6bb14622634c9db6c5c7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal\" (UID: \"8a7393f3c62b6bb14622634c9db6c5c7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal" Apr 16 23:50:20.118067 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.118036 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/8a7393f3c62b6bb14622634c9db6c5c7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal\" (UID: \"8a7393f3c62b6bb14622634c9db6c5c7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal" Apr 16 23:50:20.118067 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.118054 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a7393f3c62b6bb14622634c9db6c5c7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal\" (UID: \"8a7393f3c62b6bb14622634c9db6c5c7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal" Apr 16 23:50:20.118127 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.118072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/136452127e3fd9ba2b831a29f8633a79-config\") pod \"kube-apiserver-proxy-ip-10-0-134-103.ec2.internal\" (UID: \"136452127e3fd9ba2b831a29f8633a79\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-103.ec2.internal" Apr 16 23:50:20.118127 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.118108 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a7393f3c62b6bb14622634c9db6c5c7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal\" (UID: \"8a7393f3c62b6bb14622634c9db6c5c7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal" Apr 16 23:50:20.118127 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.118110 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/136452127e3fd9ba2b831a29f8633a79-config\") pod \"kube-apiserver-proxy-ip-10-0-134-103.ec2.internal\" (UID: \"136452127e3fd9ba2b831a29f8633a79\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-134-103.ec2.internal" Apr 16 23:50:20.203422 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:20.203367 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-103.ec2.internal\" not found" Apr 16 23:50:20.241800 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.241783 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal" Apr 16 23:50:20.247558 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.247527 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-103.ec2.internal" Apr 16 23:50:20.303675 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:20.303653 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-103.ec2.internal\" not found" Apr 16 23:50:20.404120 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:20.404098 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-103.ec2.internal\" not found" Apr 16 23:50:20.504665 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:20.504603 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-103.ec2.internal\" not found" Apr 16 23:50:20.585133 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.585115 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:50:20.605367 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:20.605343 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-103.ec2.internal\" not found" Apr 16 23:50:20.612623 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.612495 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new 
credentials" Apr 16 23:50:20.612727 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.612711 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 23:50:20.612799 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.612755 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 23:50:20.612799 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.612767 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 23:50:20.697409 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.697361 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 23:45:19 +0000 UTC" deadline="2028-01-20 06:05:06.992383218 +0000 UTC" Apr 16 23:50:20.697409 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.697401 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15438h14m46.294986002s" Apr 16 23:50:20.705441 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:20.705416 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-103.ec2.internal\" not found" Apr 16 23:50:20.713593 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.713574 2578 certificate_manager.go:566] "Rotating certificates" 
logger="kubernetes.io/kubelet-serving" Apr 16 23:50:20.732108 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.732082 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 23:50:20.757613 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.757558 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-ltp52" Apr 16 23:50:20.762083 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:20.762060 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod136452127e3fd9ba2b831a29f8633a79.slice/crio-ccf30089ac5bf9c0f94998806104c91aa088e5060c76b46aca4f4c451744b6bb WatchSource:0}: Error finding container ccf30089ac5bf9c0f94998806104c91aa088e5060c76b46aca4f4c451744b6bb: Status 404 returned error can't find the container with id ccf30089ac5bf9c0f94998806104c91aa088e5060c76b46aca4f4c451744b6bb Apr 16 23:50:20.762811 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:20.762792 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a7393f3c62b6bb14622634c9db6c5c7.slice/crio-c221c08b050faeca05f5a3fb14d1850af2bb82aa244fe6957387069c1e973782 WatchSource:0}: Error finding container c221c08b050faeca05f5a3fb14d1850af2bb82aa244fe6957387069c1e973782: Status 404 returned error can't find the container with id c221c08b050faeca05f5a3fb14d1850af2bb82aa244fe6957387069c1e973782 Apr 16 23:50:20.764759 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.764683 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-ltp52" Apr 16 23:50:20.767007 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.766990 2578 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 16 23:50:20.806209 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:20.806181 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-103.ec2.internal\" not found" Apr 16 23:50:20.814999 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.814964 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal" event={"ID":"8a7393f3c62b6bb14622634c9db6c5c7","Type":"ContainerStarted","Data":"c221c08b050faeca05f5a3fb14d1850af2bb82aa244fe6957387069c1e973782"} Apr 16 23:50:20.815864 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:20.815846 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-103.ec2.internal" event={"ID":"136452127e3fd9ba2b831a29f8633a79","Type":"ContainerStarted","Data":"ccf30089ac5bf9c0f94998806104c91aa088e5060c76b46aca4f4c451744b6bb"} Apr 16 23:50:20.907268 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:20.907245 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-103.ec2.internal\" not found" Apr 16 23:50:21.007009 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.006984 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:50:21.016967 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.016906 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal" Apr 16 23:50:21.028240 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.028222 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 23:50:21.029969 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.029957 2578 kubelet.go:3340] 
"Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-103.ec2.internal" Apr 16 23:50:21.036301 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.036289 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 23:50:21.688946 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.688902 2578 apiserver.go:52] "Watching apiserver" Apr 16 23:50:21.699370 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.699344 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 23:50:21.700341 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.700299 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-fgzm6","kube-system/kube-apiserver-proxy-ip-10-0-134-103.ec2.internal","openshift-dns/node-resolver-xjgf6","openshift-image-registry/node-ca-gxjsh","openshift-multus/multus-2frtz","openshift-multus/network-metrics-daemon-4sczf","openshift-network-diagnostics/network-check-target-sn7xw","openshift-network-operator/iptables-alerter-vwghf","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm","openshift-cluster-node-tuning-operator/tuned-lxql5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal","openshift-multus/multus-additional-cni-plugins-5sk6r","openshift-ovn-kubernetes/ovnkube-node-r4p9f"] Apr 16 23:50:21.702903 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.702882 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw" Apr 16 23:50:21.703012 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:21.702953 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sn7xw" podUID="c0ff83a8-1253-44cb-b3ea-b43cca82f094" Apr 16 23:50:21.707087 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.707063 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xjgf6" Apr 16 23:50:21.709277 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.709255 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gxjsh" Apr 16 23:50:21.709277 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.709269 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.709870 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.709830 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-gj2zl\"" Apr 16 23:50:21.709983 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.709894 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 23:50:21.709983 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.709911 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 23:50:21.711473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.711456 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf" Apr 16 23:50:21.711585 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:21.711517 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0" Apr 16 23:50:21.711650 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.711622 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 23:50:21.711812 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.711780 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 23:50:21.711884 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.711814 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 23:50:21.711884 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.711830 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 23:50:21.712487 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.712306 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 23:50:21.712487 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.712349 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-9zcd5\"" Apr 16 23:50:21.712487 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.712367 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 23:50:21.712487 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.712408 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-46v7t\"" Apr 16 23:50:21.713176 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.713159 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 23:50:21.713721 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.713628 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fgzm6" Apr 16 23:50:21.715845 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.715826 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vwghf" Apr 16 23:50:21.716498 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.716431 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wtjp4\"" Apr 16 23:50:21.716498 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.716458 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 23:50:21.716705 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.716569 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 23:50:21.717835 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.717800 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 23:50:21.718197 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.718172 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lnb8h\"" Apr 16 23:50:21.718954 
ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.718937 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 23:50:21.719078 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.718937 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 23:50:21.720220 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.720198 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" Apr 16 23:50:21.722710 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.722574 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-zf2rz\"" Apr 16 23:50:21.722710 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.722602 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 23:50:21.722710 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.722613 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 23:50:21.722935 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.722904 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 23:50:21.723086 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.723067 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.723253 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.723234 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5sk6r" Apr 16 23:50:21.725262 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.725245 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-pz5vg\"" Apr 16 23:50:21.725262 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.725252 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 23:50:21.725460 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.725429 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 23:50:21.725560 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.725496 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.726090 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.726072 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 23:50:21.726163 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.726099 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 23:50:21.726163 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.726149 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-fj65d\"" Apr 16 23:50:21.726982 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.726949 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-run-k8s-cni-cncf-io\") pod \"multus-2frtz\" (UID: 
\"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.727100 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.726985 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/175aebe0-4cfe-44a6-b935-51dd306c4982-iptables-alerter-script\") pod \"iptables-alerter-vwghf\" (UID: \"175aebe0-4cfe-44a6-b935-51dd306c4982\") " pod="openshift-network-operator/iptables-alerter-vwghf" Apr 16 23:50:21.727100 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727012 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dv98\" (UniqueName: \"kubernetes.io/projected/175aebe0-4cfe-44a6-b935-51dd306c4982-kube-api-access-7dv98\") pod \"iptables-alerter-vwghf\" (UID: \"175aebe0-4cfe-44a6-b935-51dd306c4982\") " pod="openshift-network-operator/iptables-alerter-vwghf" Apr 16 23:50:21.727100 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727037 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-socket-dir\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" Apr 16 23:50:21.727100 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727061 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ssfw\" (UniqueName: \"kubernetes.io/projected/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-kube-api-access-6ssfw\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.727100 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727087 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-hostroot\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.727423 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727109 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4ad19b4d-6e37-4bb2-adb6-743cb3d95223-serviceca\") pod \"node-ca-gxjsh\" (UID: \"4ad19b4d-6e37-4bb2-adb6-743cb3d95223\") " pod="openshift-image-registry/node-ca-gxjsh" Apr 16 23:50:21.727423 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727133 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs\") pod \"network-metrics-daemon-4sczf\" (UID: \"bac2109e-d2f6-42aa-94c6-73a79a2012f0\") " pod="openshift-multus/network-metrics-daemon-4sczf" Apr 16 23:50:21.727423 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727159 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5ecdc5db-c3e8-477a-8c96-bfa2a4fba192-konnectivity-ca\") pod \"konnectivity-agent-fgzm6\" (UID: \"5ecdc5db-c3e8-477a-8c96-bfa2a4fba192\") " pod="kube-system/konnectivity-agent-fgzm6" Apr 16 23:50:21.727423 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727189 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/175aebe0-4cfe-44a6-b935-51dd306c4982-host-slash\") pod \"iptables-alerter-vwghf\" (UID: \"175aebe0-4cfe-44a6-b935-51dd306c4982\") " pod="openshift-network-operator/iptables-alerter-vwghf" Apr 16 23:50:21.727423 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727232 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gr8g\" (UniqueName: \"kubernetes.io/projected/b3dc2fa1-d372-4847-9980-2930ef815461-kube-api-access-9gr8g\") pod \"node-resolver-xjgf6\" (UID: \"b3dc2fa1-d372-4847-9980-2930ef815461\") " pod="openshift-dns/node-resolver-xjgf6" Apr 16 23:50:21.727423 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727258 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-system-cni-dir\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.727423 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727283 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-multus-socket-dir-parent\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.727423 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727311 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-var-lib-cni-bin\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.727423 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727335 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tj95\" (UniqueName: \"kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95\") pod \"network-check-target-sn7xw\" (UID: \"c0ff83a8-1253-44cb-b3ea-b43cca82f094\") " 
pod="openshift-network-diagnostics/network-check-target-sn7xw" Apr 16 23:50:21.727423 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727359 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l46nd\" (UniqueName: \"kubernetes.io/projected/bac2109e-d2f6-42aa-94c6-73a79a2012f0-kube-api-access-l46nd\") pod \"network-metrics-daemon-4sczf\" (UID: \"bac2109e-d2f6-42aa-94c6-73a79a2012f0\") " pod="openshift-multus/network-metrics-daemon-4sczf" Apr 16 23:50:21.727423 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727385 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5ecdc5db-c3e8-477a-8c96-bfa2a4fba192-agent-certs\") pod \"konnectivity-agent-fgzm6\" (UID: \"5ecdc5db-c3e8-477a-8c96-bfa2a4fba192\") " pod="kube-system/konnectivity-agent-fgzm6" Apr 16 23:50:21.727423 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727423 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" Apr 16 23:50:21.728134 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727450 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-registration-dir\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" Apr 16 23:50:21.728134 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727486 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-sys-fs\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" Apr 16 23:50:21.728134 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727546 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prgtv\" (UniqueName: \"kubernetes.io/projected/850ebbd6-b652-46e6-a7c5-5094395d3a52-kube-api-access-prgtv\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" Apr 16 23:50:21.728134 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727600 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-os-release\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.728134 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727636 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-device-dir\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" Apr 16 23:50:21.728134 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727661 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-var-lib-cni-multus\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.728134 
ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727686 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-var-lib-kubelet\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.728134 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727717 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-multus-conf-dir\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.728134 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727794 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ad19b4d-6e37-4bb2-adb6-743cb3d95223-host\") pod \"node-ca-gxjsh\" (UID: \"4ad19b4d-6e37-4bb2-adb6-743cb3d95223\") " pod="openshift-image-registry/node-ca-gxjsh" Apr 16 23:50:21.728134 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727842 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-cnibin\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.728134 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727864 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-cni-binary-copy\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.728134 
ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727890 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-run-netns\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.728134 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727921 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-multus-daemon-config\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.728134 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfddw\" (UniqueName: \"kubernetes.io/projected/4ad19b4d-6e37-4bb2-adb6-743cb3d95223-kube-api-access-zfddw\") pod \"node-ca-gxjsh\" (UID: \"4ad19b4d-6e37-4bb2-adb6-743cb3d95223\") " pod="openshift-image-registry/node-ca-gxjsh" Apr 16 23:50:21.728134 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.727992 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b3dc2fa1-d372-4847-9980-2930ef815461-hosts-file\") pod \"node-resolver-xjgf6\" (UID: \"b3dc2fa1-d372-4847-9980-2930ef815461\") " pod="openshift-dns/node-resolver-xjgf6" Apr 16 23:50:21.728134 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.728018 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-run-multus-certs\") pod \"multus-2frtz\" (UID: 
\"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.728134 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.728035 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-etc-kubernetes\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.729087 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.728070 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-etc-selinux\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" Apr 16 23:50:21.729087 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.728097 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3dc2fa1-d372-4847-9980-2930ef815461-tmp-dir\") pod \"node-resolver-xjgf6\" (UID: \"b3dc2fa1-d372-4847-9980-2930ef815461\") " pod="openshift-dns/node-resolver-xjgf6" Apr 16 23:50:21.729087 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.728121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-multus-cni-dir\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.729395 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.729222 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 23:50:21.729395 ip-10-0-134-103 kubenswrapper[2578]: 
I0416 23:50:21.729302 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tc9gp\"" Apr 16 23:50:21.729507 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.729438 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 23:50:21.729609 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.729586 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 23:50:21.729676 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.729616 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 23:50:21.729782 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.729762 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 23:50:21.729872 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.729787 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 23:50:21.765465 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.765429 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 23:45:20 +0000 UTC" deadline="2027-11-27 11:51:02.34583119 +0000 UTC" Apr 16 23:50:21.765465 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.765457 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14148h0m40.580377744s" Apr 16 23:50:21.817105 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.817088 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 23:50:21.828687 ip-10-0-134-103 
kubenswrapper[2578]: I0416 23:50:21.828667 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-sysctl-d\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.828687 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.828692 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-etc-openvswitch\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.828834 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.828711 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/66b93172-3f69-4425-8a38-ff386fb3d1dc-env-overrides\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.828834 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.828735 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgw5h\" (UniqueName: \"kubernetes.io/projected/66b93172-3f69-4425-8a38-ff386fb3d1dc-kube-api-access-qgw5h\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.828834 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.828753 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-node-log\") pod \"ovnkube-node-r4p9f\" (UID: 
\"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.828834 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.828767 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-log-socket\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.828834 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.828785 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.828834 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.828821 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b3dc2fa1-d372-4847-9980-2930ef815461-hosts-file\") pod \"node-resolver-xjgf6\" (UID: \"b3dc2fa1-d372-4847-9980-2930ef815461\") " pod="openshift-dns/node-resolver-xjgf6" Apr 16 23:50:21.829024 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.828843 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-run-multus-certs\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.829024 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.828868 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-etc-kubernetes\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.829024 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.828884 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-host\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.829024 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.828898 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6204e842-fd30-4eb6-be92-04b4429887c1-system-cni-dir\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r" Apr 16 23:50:21.829024 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.828913 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6204e842-fd30-4eb6-be92-04b4429887c1-cnibin\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r" Apr 16 23:50:21.829024 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.828927 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-run-k8s-cni-cncf-io\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.829024 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.828975 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6204e842-fd30-4eb6-be92-04b4429887c1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r" Apr 16 23:50:21.829024 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.828990 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-slash\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.829024 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829004 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-run-systemd\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.829024 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829019 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-run-ovn\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.829370 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829040 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-cni-netd\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.829370 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829065 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dv98\" (UniqueName: \"kubernetes.io/projected/175aebe0-4cfe-44a6-b935-51dd306c4982-kube-api-access-7dv98\") pod \"iptables-alerter-vwghf\" (UID: \"175aebe0-4cfe-44a6-b935-51dd306c4982\") " pod="openshift-network-operator/iptables-alerter-vwghf" Apr 16 23:50:21.829370 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829082 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-socket-dir\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" Apr 16 23:50:21.829370 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829119 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6204e842-fd30-4eb6-be92-04b4429887c1-cni-binary-copy\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r" Apr 16 23:50:21.829370 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829136 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-systemd\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.829370 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829151 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-systemd-units\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.829370 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829166 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/66b93172-3f69-4425-8a38-ff386fb3d1dc-ovnkube-script-lib\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.829370 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829180 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/175aebe0-4cfe-44a6-b935-51dd306c4982-host-slash\") pod \"iptables-alerter-vwghf\" (UID: \"175aebe0-4cfe-44a6-b935-51dd306c4982\") " pod="openshift-network-operator/iptables-alerter-vwghf" Apr 16 23:50:21.829370 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829200 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gr8g\" (UniqueName: \"kubernetes.io/projected/b3dc2fa1-d372-4847-9980-2930ef815461-kube-api-access-9gr8g\") pod \"node-resolver-xjgf6\" (UID: \"b3dc2fa1-d372-4847-9980-2930ef815461\") " pod="openshift-dns/node-resolver-xjgf6" Apr 16 23:50:21.829370 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829218 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-system-cni-dir\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.829370 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829238 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-var-lib-cni-bin\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.829370 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829335 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b3dc2fa1-d372-4847-9980-2930ef815461-hosts-file\") pod \"node-resolver-xjgf6\" (UID: \"b3dc2fa1-d372-4847-9980-2930ef815461\") " pod="openshift-dns/node-resolver-xjgf6" Apr 16 23:50:21.829868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829456 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-run-k8s-cni-cncf-io\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.829868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829462 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-var-lib-cni-bin\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.829868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829471 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-socket-dir\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" Apr 16 23:50:21.829868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829501 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-run-multus-certs\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.829868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829523 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-etc-kubernetes\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.829868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829552 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/66b93172-3f69-4425-8a38-ff386fb3d1dc-ovn-node-metrics-cert\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.829868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829576 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-system-cni-dir\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.829868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829597 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tj95\" (UniqueName: \"kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95\") pod \"network-check-target-sn7xw\" (UID: \"c0ff83a8-1253-44cb-b3ea-b43cca82f094\") " pod="openshift-network-diagnostics/network-check-target-sn7xw" Apr 16 23:50:21.829868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829646 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" 
(UniqueName: \"kubernetes.io/secret/5ecdc5db-c3e8-477a-8c96-bfa2a4fba192-agent-certs\") pod \"konnectivity-agent-fgzm6\" (UID: \"5ecdc5db-c3e8-477a-8c96-bfa2a4fba192\") " pod="kube-system/konnectivity-agent-fgzm6" Apr 16 23:50:21.829868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-sys-fs\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" Apr 16 23:50:21.829868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829728 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prgtv\" (UniqueName: \"kubernetes.io/projected/850ebbd6-b652-46e6-a7c5-5094395d3a52-kube-api-access-prgtv\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" Apr 16 23:50:21.829868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829752 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-os-release\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.829868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829778 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-var-lib-kubelet\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.829868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829779 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/175aebe0-4cfe-44a6-b935-51dd306c4982-host-slash\") pod \"iptables-alerter-vwghf\" (UID: \"175aebe0-4cfe-44a6-b935-51dd306c4982\") " pod="openshift-network-operator/iptables-alerter-vwghf" Apr 16 23:50:21.829868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-multus-conf-dir\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.829868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829802 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-sys-fs\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" Apr 16 23:50:21.829868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829829 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93bb4659-7080-4d48-8726-23205456bff5-tmp\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829863 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-device-dir\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829869 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-var-lib-kubelet\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829892 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5lvn\" (UniqueName: \"kubernetes.io/projected/93bb4659-7080-4d48-8726-23205456bff5-kube-api-access-z5lvn\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829899 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-os-release\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.829968 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-device-dir\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830003 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-var-lib-openvswitch\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830015 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-multus-conf-dir\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830055 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ad19b4d-6e37-4bb2-adb6-743cb3d95223-host\") pod \"node-ca-gxjsh\" (UID: \"4ad19b4d-6e37-4bb2-adb6-743cb3d95223\") " pod="openshift-image-registry/node-ca-gxjsh" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-cnibin\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830109 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-cni-binary-copy\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830109 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830129 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-cnibin\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830145 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-run-netns\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830146 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ad19b4d-6e37-4bb2-adb6-743cb3d95223-host\") pod \"node-ca-gxjsh\" (UID: \"4ad19b4d-6e37-4bb2-adb6-743cb3d95223\") " pod="openshift-image-registry/node-ca-gxjsh" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830167 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-sys\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830189 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6204e842-fd30-4eb6-be92-04b4429887c1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: 
\"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830203 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-run-netns\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz" Apr 16 23:50:21.830584 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830210 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-run-netns\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.831304 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830252 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/66b93172-3f69-4425-8a38-ff386fb3d1dc-ovnkube-config\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.831304 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830281 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfddw\" (UniqueName: \"kubernetes.io/projected/4ad19b4d-6e37-4bb2-adb6-743cb3d95223-kube-api-access-zfddw\") pod \"node-ca-gxjsh\" (UID: \"4ad19b4d-6e37-4bb2-adb6-743cb3d95223\") " pod="openshift-image-registry/node-ca-gxjsh" Apr 16 23:50:21.831304 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830327 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-sysconfig\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.831304 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830361 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-sysctl-conf\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.831304 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830388 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/93bb4659-7080-4d48-8726-23205456bff5-etc-tuned\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.831304 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830415 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6204e842-fd30-4eb6-be92-04b4429887c1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r" Apr 16 23:50:21.831304 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830445 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-etc-selinux\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" Apr 16 23:50:21.831304 ip-10-0-134-103 
kubenswrapper[2578]: I0416 23:50:21.830470 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3dc2fa1-d372-4847-9980-2930ef815461-tmp-dir\") pod \"node-resolver-xjgf6\" (UID: \"b3dc2fa1-d372-4847-9980-2930ef815461\") " pod="openshift-dns/node-resolver-xjgf6"
Apr 16 23:50:21.831304 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830495 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-multus-cni-dir\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz"
Apr 16 23:50:21.831304 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830522 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-modprobe-d\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.831304 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830510 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-etc-selinux\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm"
Apr 16 23:50:21.831304 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830606 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-multus-cni-dir\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz"
Apr 16 23:50:21.831304 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830603 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/175aebe0-4cfe-44a6-b935-51dd306c4982-iptables-alerter-script\") pod \"iptables-alerter-vwghf\" (UID: \"175aebe0-4cfe-44a6-b935-51dd306c4982\") " pod="openshift-network-operator/iptables-alerter-vwghf"
Apr 16 23:50:21.831304 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830646 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ssfw\" (UniqueName: \"kubernetes.io/projected/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-kube-api-access-6ssfw\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz"
Apr 16 23:50:21.831304 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830629 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-cni-binary-copy\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz"
Apr 16 23:50:21.831304 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830730 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-lib-modules\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.831304 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830757 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-run-openvswitch\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.832042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830794 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-cni-bin\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.832042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830802 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3dc2fa1-d372-4847-9980-2930ef815461-tmp-dir\") pod \"node-resolver-xjgf6\" (UID: \"b3dc2fa1-d372-4847-9980-2930ef815461\") " pod="openshift-dns/node-resolver-xjgf6"
Apr 16 23:50:21.832042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830852 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-hostroot\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz"
Apr 16 23:50:21.832042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830882 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6204e842-fd30-4eb6-be92-04b4429887c1-os-release\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r"
Apr 16 23:50:21.832042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830910 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-kubelet\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.832042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830917 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-hostroot\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz"
Apr 16 23:50:21.832042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.830972 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.832042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831029 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4ad19b4d-6e37-4bb2-adb6-743cb3d95223-serviceca\") pod \"node-ca-gxjsh\" (UID: \"4ad19b4d-6e37-4bb2-adb6-743cb3d95223\") " pod="openshift-image-registry/node-ca-gxjsh"
Apr 16 23:50:21.832042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831093 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/175aebe0-4cfe-44a6-b935-51dd306c4982-iptables-alerter-script\") pod \"iptables-alerter-vwghf\" (UID: \"175aebe0-4cfe-44a6-b935-51dd306c4982\") " pod="openshift-network-operator/iptables-alerter-vwghf"
Apr 16 23:50:21.832042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831102 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs\") pod \"network-metrics-daemon-4sczf\" (UID: \"bac2109e-d2f6-42aa-94c6-73a79a2012f0\") " pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:21.832042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5ecdc5db-c3e8-477a-8c96-bfa2a4fba192-konnectivity-ca\") pod \"konnectivity-agent-fgzm6\" (UID: \"5ecdc5db-c3e8-477a-8c96-bfa2a4fba192\") " pod="kube-system/konnectivity-agent-fgzm6"
Apr 16 23:50:21.832042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831165 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-multus-socket-dir-parent\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz"
Apr 16 23:50:21.832042 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:21.831187 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:50:21.832042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831200 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-multus-daemon-config\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz"
Apr 16 23:50:21.832042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831229 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-kubernetes\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.832042 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:21.831259 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs podName:bac2109e-d2f6-42aa-94c6-73a79a2012f0 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:22.331226891 +0000 UTC m=+3.106251775 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs") pod "network-metrics-daemon-4sczf" (UID: "bac2109e-d2f6-42aa-94c6-73a79a2012f0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:50:21.832042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831444 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-var-lib-kubelet\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.832565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831461 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4ad19b4d-6e37-4bb2-adb6-743cb3d95223-serviceca\") pod \"node-ca-gxjsh\" (UID: \"4ad19b4d-6e37-4bb2-adb6-743cb3d95223\") " pod="openshift-image-registry/node-ca-gxjsh"
Apr 16 23:50:21.832565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831474 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2sxp\" (UniqueName: \"kubernetes.io/projected/6204e842-fd30-4eb6-be92-04b4429887c1-kube-api-access-k2sxp\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r"
Apr 16 23:50:21.832565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831498 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l46nd\" (UniqueName: \"kubernetes.io/projected/bac2109e-d2f6-42aa-94c6-73a79a2012f0-kube-api-access-l46nd\") pod \"network-metrics-daemon-4sczf\" (UID: \"bac2109e-d2f6-42aa-94c6-73a79a2012f0\") " pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:21.832565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831532 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm"
Apr 16 23:50:21.832565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831570 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-multus-socket-dir-parent\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz"
Apr 16 23:50:21.832565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831586 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-registration-dir\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm"
Apr 16 23:50:21.832565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831602 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-var-lib-cni-multus\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz"
Apr 16 23:50:21.832565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831641 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-host-var-lib-cni-multus\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz"
Apr 16 23:50:21.832565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831817 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm"
Apr 16 23:50:21.832565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831818 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5ecdc5db-c3e8-477a-8c96-bfa2a4fba192-konnectivity-ca\") pod \"konnectivity-agent-fgzm6\" (UID: \"5ecdc5db-c3e8-477a-8c96-bfa2a4fba192\") " pod="kube-system/konnectivity-agent-fgzm6"
Apr 16 23:50:21.832565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831870 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-run\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.832565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.831950 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/850ebbd6-b652-46e6-a7c5-5094395d3a52-registration-dir\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm"
Apr 16 23:50:21.832565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.832511 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-multus-daemon-config\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz"
Apr 16 23:50:21.833711 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.833693 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5ecdc5db-c3e8-477a-8c96-bfa2a4fba192-agent-certs\") pod \"konnectivity-agent-fgzm6\" (UID: \"5ecdc5db-c3e8-477a-8c96-bfa2a4fba192\") " pod="kube-system/konnectivity-agent-fgzm6"
Apr 16 23:50:21.837946 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.837922 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 23:50:21.840013 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.839929 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dv98\" (UniqueName: \"kubernetes.io/projected/175aebe0-4cfe-44a6-b935-51dd306c4982-kube-api-access-7dv98\") pod \"iptables-alerter-vwghf\" (UID: \"175aebe0-4cfe-44a6-b935-51dd306c4982\") " pod="openshift-network-operator/iptables-alerter-vwghf"
Apr 16 23:50:21.841073 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:21.840737 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 23:50:21.841073 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:21.840764 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 23:50:21.841073 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:21.840782 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6tj95 for pod openshift-network-diagnostics/network-check-target-sn7xw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:50:21.841073 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:21.840866 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95 podName:c0ff83a8-1253-44cb-b3ea-b43cca82f094 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:22.340848879 +0000 UTC m=+3.115873758 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6tj95" (UniqueName: "kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95") pod "network-check-target-sn7xw" (UID: "c0ff83a8-1253-44cb-b3ea-b43cca82f094") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:50:21.844533 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.844511 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ssfw\" (UniqueName: \"kubernetes.io/projected/3cda9bf7-a2b5-4873-acdd-b7f1f28e5295-kube-api-access-6ssfw\") pod \"multus-2frtz\" (UID: \"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295\") " pod="openshift-multus/multus-2frtz"
Apr 16 23:50:21.844642 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.844623 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfddw\" (UniqueName: \"kubernetes.io/projected/4ad19b4d-6e37-4bb2-adb6-743cb3d95223-kube-api-access-zfddw\") pod \"node-ca-gxjsh\" (UID: \"4ad19b4d-6e37-4bb2-adb6-743cb3d95223\") " pod="openshift-image-registry/node-ca-gxjsh"
Apr 16 23:50:21.845413 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.845391 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prgtv\" (UniqueName: \"kubernetes.io/projected/850ebbd6-b652-46e6-a7c5-5094395d3a52-kube-api-access-prgtv\") pod \"aws-ebs-csi-driver-node-ttgjm\" (UID: \"850ebbd6-b652-46e6-a7c5-5094395d3a52\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm"
Apr 16 23:50:21.846036 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.845995 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gr8g\" (UniqueName: \"kubernetes.io/projected/b3dc2fa1-d372-4847-9980-2930ef815461-kube-api-access-9gr8g\") pod \"node-resolver-xjgf6\" (UID: \"b3dc2fa1-d372-4847-9980-2930ef815461\") " pod="openshift-dns/node-resolver-xjgf6"
Apr 16 23:50:21.846621 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.846530 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l46nd\" (UniqueName: \"kubernetes.io/projected/bac2109e-d2f6-42aa-94c6-73a79a2012f0-kube-api-access-l46nd\") pod \"network-metrics-daemon-4sczf\" (UID: \"bac2109e-d2f6-42aa-94c6-73a79a2012f0\") " pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:21.896758 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.896733 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 23:50:21.932932 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.932904 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-var-lib-kubelet\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.933075 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.932948 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2sxp\" (UniqueName: \"kubernetes.io/projected/6204e842-fd30-4eb6-be92-04b4429887c1-kube-api-access-k2sxp\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r"
Apr 16 23:50:21.933075 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.932979 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-run\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.933075 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.932986 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-var-lib-kubelet\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.933075 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-sysctl-d\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.933075 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933026 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-etc-openvswitch\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.933075 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/66b93172-3f69-4425-8a38-ff386fb3d1dc-env-overrides\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.933075 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933071 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgw5h\" (UniqueName: \"kubernetes.io/projected/66b93172-3f69-4425-8a38-ff386fb3d1dc-kube-api-access-qgw5h\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.933457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933085 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-run\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.933457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933094 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-node-log\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.933457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933126 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-node-log\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.933457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933147 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-log-socket\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.933457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933177 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.933457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933209 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-host\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.933457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933234 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6204e842-fd30-4eb6-be92-04b4429887c1-system-cni-dir\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r"
Apr 16 23:50:21.933457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933242 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-sysctl-d\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.933457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933257 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6204e842-fd30-4eb6-be92-04b4429887c1-cnibin\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r"
Apr 16 23:50:21.933457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933284 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6204e842-fd30-4eb6-be92-04b4429887c1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r"
Apr 16 23:50:21.933457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933308 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-etc-openvswitch\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.933457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933311 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-slash\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.933457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933344 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-run-systemd\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.933457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933369 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-run-ovn\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.933457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933385 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.933457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933395 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-cni-netd\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.933457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933428 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-host\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.934230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933430 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6204e842-fd30-4eb6-be92-04b4429887c1-cni-binary-copy\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r"
Apr 16 23:50:21.934230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933457 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6204e842-fd30-4eb6-be92-04b4429887c1-cnibin\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r"
Apr 16 23:50:21.934230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933645 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-log-socket\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.934230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933344 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-slash\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.934230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933459 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-systemd\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.934230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933698 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-cni-netd\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.934230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933694 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-run-ovn\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.934230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933693 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-run-systemd\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.934230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933729 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6204e842-fd30-4eb6-be92-04b4429887c1-system-cni-dir\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r"
Apr 16 23:50:21.934230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933731 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-systemd-units\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.934230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933761 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-systemd\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.934230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933781 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/66b93172-3f69-4425-8a38-ff386fb3d1dc-ovnkube-script-lib\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.934230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933780 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-systemd-units\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.934230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/66b93172-3f69-4425-8a38-ff386fb3d1dc-ovn-node-metrics-cert\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.934230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933894 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/66b93172-3f69-4425-8a38-ff386fb3d1dc-env-overrides\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.934230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933920 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93bb4659-7080-4d48-8726-23205456bff5-tmp\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.934230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933942 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5lvn\" (UniqueName: \"kubernetes.io/projected/93bb4659-7080-4d48-8726-23205456bff5-kube-api-access-z5lvn\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.935014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933966 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-var-lib-openvswitch\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.935014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.933991 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-sys\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.935014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934044 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-var-lib-openvswitch\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.935014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6204e842-fd30-4eb6-be92-04b4429887c1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r"
Apr 16 23:50:21.935014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934085 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-run-netns\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.935014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/66b93172-3f69-4425-8a38-ff386fb3d1dc-ovnkube-config\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:21.935014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-sysconfig\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.935014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934161 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-sysctl-conf\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5"
Apr 16 23:50:21.935014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934171 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6204e842-fd30-4eb6-be92-04b4429887c1-tuning-conf-dir\") pod
\"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r" Apr 16 23:50:21.935014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934183 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/93bb4659-7080-4d48-8726-23205456bff5-etc-tuned\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.935014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934206 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6204e842-fd30-4eb6-be92-04b4429887c1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r" Apr 16 23:50:21.935014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934230 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-sys\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.935014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934243 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-run-netns\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.935014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934259 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/66b93172-3f69-4425-8a38-ff386fb3d1dc-ovnkube-script-lib\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.935014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934342 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-sysctl-conf\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.935014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934704 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6204e842-fd30-4eb6-be92-04b4429887c1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r" Apr 16 23:50:21.935014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934718 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6204e842-fd30-4eb6-be92-04b4429887c1-cni-binary-copy\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r" Apr 16 23:50:21.935838 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934773 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-sysconfig\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.935838 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934805 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-modprobe-d\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.935838 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934835 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-lib-modules\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.935838 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934847 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6204e842-fd30-4eb6-be92-04b4429887c1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r" Apr 16 23:50:21.935838 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934861 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-run-openvswitch\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.935838 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934863 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/66b93172-3f69-4425-8a38-ff386fb3d1dc-ovnkube-config\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.935838 ip-10-0-134-103 kubenswrapper[2578]: 
I0416 23:50:21.934900 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-cni-bin\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.935838 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934929 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6204e842-fd30-4eb6-be92-04b4429887c1-os-release\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r" Apr 16 23:50:21.935838 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934951 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-modprobe-d\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.935838 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-kubelet\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.935838 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.934971 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-cni-bin\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.935838 ip-10-0-134-103 
kubenswrapper[2578]: I0416 23:50:21.934993 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-kubelet\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.935838 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.935001 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.935838 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.935010 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6204e842-fd30-4eb6-be92-04b4429887c1-os-release\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r" Apr 16 23:50:21.935838 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.935036 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-run-openvswitch\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.935838 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.935050 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-kubernetes\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 
23:50:21.935838 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.935050 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66b93172-3f69-4425-8a38-ff386fb3d1dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.936744 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.935155 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-etc-kubernetes\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.936744 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.935155 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/93bb4659-7080-4d48-8726-23205456bff5-lib-modules\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.937083 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.937061 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/66b93172-3f69-4425-8a38-ff386fb3d1dc-ovn-node-metrics-cert\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:21.937749 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.937726 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/93bb4659-7080-4d48-8726-23205456bff5-etc-tuned\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 
23:50:21.937831 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.937742 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93bb4659-7080-4d48-8726-23205456bff5-tmp\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.944945 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.944896 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5lvn\" (UniqueName: \"kubernetes.io/projected/93bb4659-7080-4d48-8726-23205456bff5-kube-api-access-z5lvn\") pod \"tuned-lxql5\" (UID: \"93bb4659-7080-4d48-8726-23205456bff5\") " pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:21.945249 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.945229 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2sxp\" (UniqueName: \"kubernetes.io/projected/6204e842-fd30-4eb6-be92-04b4429887c1-kube-api-access-k2sxp\") pod \"multus-additional-cni-plugins-5sk6r\" (UID: \"6204e842-fd30-4eb6-be92-04b4429887c1\") " pod="openshift-multus/multus-additional-cni-plugins-5sk6r" Apr 16 23:50:21.945500 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:21.945480 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgw5h\" (UniqueName: \"kubernetes.io/projected/66b93172-3f69-4425-8a38-ff386fb3d1dc-kube-api-access-qgw5h\") pod \"ovnkube-node-r4p9f\" (UID: \"66b93172-3f69-4425-8a38-ff386fb3d1dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:22.018492 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.018462 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xjgf6" Apr 16 23:50:22.024968 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.024880 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-gxjsh" Apr 16 23:50:22.034454 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.034432 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2frtz" Apr 16 23:50:22.040086 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.040066 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fgzm6" Apr 16 23:50:22.047579 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.047554 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vwghf" Apr 16 23:50:22.054078 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.054061 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" Apr 16 23:50:22.059600 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.059580 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lxql5" Apr 16 23:50:22.066074 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.066057 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5sk6r" Apr 16 23:50:22.070668 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.070650 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:50:22.337497 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.337428 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs\") pod \"network-metrics-daemon-4sczf\" (UID: \"bac2109e-d2f6-42aa-94c6-73a79a2012f0\") " pod="openshift-multus/network-metrics-daemon-4sczf" Apr 16 23:50:22.337675 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:22.337565 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:22.337675 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:22.337624 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs podName:bac2109e-d2f6-42aa-94c6-73a79a2012f0 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:23.33760832 +0000 UTC m=+4.112633184 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs") pod "network-metrics-daemon-4sczf" (UID: "bac2109e-d2f6-42aa-94c6-73a79a2012f0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:22.402943 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:22.402917 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod850ebbd6_b652_46e6_a7c5_5094395d3a52.slice/crio-cc892f9b97294f65e23e528cd72b2838fb4e377270b209baafb78ce485b4cef4 WatchSource:0}: Error finding container cc892f9b97294f65e23e528cd72b2838fb4e377270b209baafb78ce485b4cef4: Status 404 returned error can't find the container with id cc892f9b97294f65e23e528cd72b2838fb4e377270b209baafb78ce485b4cef4 Apr 16 23:50:22.405713 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:22.405680 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cda9bf7_a2b5_4873_acdd_b7f1f28e5295.slice/crio-bffca764112b409ffaa67e3db713e833354eafdec934e782da87d1af12d94232 WatchSource:0}: Error finding container bffca764112b409ffaa67e3db713e833354eafdec934e782da87d1af12d94232: Status 404 returned error can't find the container with id bffca764112b409ffaa67e3db713e833354eafdec934e782da87d1af12d94232 Apr 16 23:50:22.410167 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:22.410148 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3dc2fa1_d372_4847_9980_2930ef815461.slice/crio-86d435d131be7f66b4a13b71ad5db7c052812cb01c207f0a9eba79b747e42eeb WatchSource:0}: Error finding container 86d435d131be7f66b4a13b71ad5db7c052812cb01c207f0a9eba79b747e42eeb: Status 404 returned error can't find the container with id 86d435d131be7f66b4a13b71ad5db7c052812cb01c207f0a9eba79b747e42eeb Apr 16 23:50:22.410656 
ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:22.410635 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod175aebe0_4cfe_44a6_b935_51dd306c4982.slice/crio-61ef6cb20c2de0d357a68cc13758361be72138c9a7cd236d1ad642000101f27e WatchSource:0}: Error finding container 61ef6cb20c2de0d357a68cc13758361be72138c9a7cd236d1ad642000101f27e: Status 404 returned error can't find the container with id 61ef6cb20c2de0d357a68cc13758361be72138c9a7cd236d1ad642000101f27e Apr 16 23:50:22.411820 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:22.411697 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ad19b4d_6e37_4bb2_adb6_743cb3d95223.slice/crio-fa723681f25c6fc1f2b65147e1b4209935689edf22f2408eaeac57cb469cd8c5 WatchSource:0}: Error finding container fa723681f25c6fc1f2b65147e1b4209935689edf22f2408eaeac57cb469cd8c5: Status 404 returned error can't find the container with id fa723681f25c6fc1f2b65147e1b4209935689edf22f2408eaeac57cb469cd8c5 Apr 16 23:50:22.412813 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:22.412308 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6204e842_fd30_4eb6_be92_04b4429887c1.slice/crio-abcd3f285d052e86ca1f148997b5102f6ea0a6d06c530362f216093cb67c316f WatchSource:0}: Error finding container abcd3f285d052e86ca1f148997b5102f6ea0a6d06c530362f216093cb67c316f: Status 404 returned error can't find the container with id abcd3f285d052e86ca1f148997b5102f6ea0a6d06c530362f216093cb67c316f Apr 16 23:50:22.413250 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:22.413216 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66b93172_3f69_4425_8a38_ff386fb3d1dc.slice/crio-9886664ac87f01550f0f1710bc41e6c88e07fde499e06b9fd6640c19c6ff5066 WatchSource:0}: 
Error finding container 9886664ac87f01550f0f1710bc41e6c88e07fde499e06b9fd6640c19c6ff5066: Status 404 returned error can't find the container with id 9886664ac87f01550f0f1710bc41e6c88e07fde499e06b9fd6640c19c6ff5066 Apr 16 23:50:22.414795 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:22.414509 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93bb4659_7080_4d48_8726_23205456bff5.slice/crio-c6bac8ab60fbd3018e7133b4a383d3e1afc2e4e8f69fe950fc0f20470e3a9c7b WatchSource:0}: Error finding container c6bac8ab60fbd3018e7133b4a383d3e1afc2e4e8f69fe950fc0f20470e3a9c7b: Status 404 returned error can't find the container with id c6bac8ab60fbd3018e7133b4a383d3e1afc2e4e8f69fe950fc0f20470e3a9c7b Apr 16 23:50:22.415964 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:50:22.415819 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ecdc5db_c3e8_477a_8c96_bfa2a4fba192.slice/crio-b0ebdb7c38bfb476a638ba68efb859e2406bc5dc6c5048bf8dec0e2eaa51a533 WatchSource:0}: Error finding container b0ebdb7c38bfb476a638ba68efb859e2406bc5dc6c5048bf8dec0e2eaa51a533: Status 404 returned error can't find the container with id b0ebdb7c38bfb476a638ba68efb859e2406bc5dc6c5048bf8dec0e2eaa51a533 Apr 16 23:50:22.438012 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.437990 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tj95\" (UniqueName: \"kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95\") pod \"network-check-target-sn7xw\" (UID: \"c0ff83a8-1253-44cb-b3ea-b43cca82f094\") " pod="openshift-network-diagnostics/network-check-target-sn7xw" Apr 16 23:50:22.438109 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:22.438097 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:50:22.438166 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:22.438113 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:50:22.438166 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:22.438122 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6tj95 for pod openshift-network-diagnostics/network-check-target-sn7xw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:22.438166 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:22.438162 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95 podName:c0ff83a8-1253-44cb-b3ea-b43cca82f094 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:23.438147136 +0000 UTC m=+4.213172006 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6tj95" (UniqueName: "kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95") pod "network-check-target-sn7xw" (UID: "c0ff83a8-1253-44cb-b3ea-b43cca82f094") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:22.766399 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.766309 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 23:45:20 +0000 UTC" deadline="2028-01-22 12:29:53.841898074 +0000 UTC" Apr 16 23:50:22.766399 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.766351 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15492h39m31.075550358s" Apr 16 23:50:22.822298 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.822265 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lxql5" event={"ID":"93bb4659-7080-4d48-8726-23205456bff5","Type":"ContainerStarted","Data":"c6bac8ab60fbd3018e7133b4a383d3e1afc2e4e8f69fe950fc0f20470e3a9c7b"} Apr 16 23:50:22.824949 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.824846 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5sk6r" event={"ID":"6204e842-fd30-4eb6-be92-04b4429887c1","Type":"ContainerStarted","Data":"abcd3f285d052e86ca1f148997b5102f6ea0a6d06c530362f216093cb67c316f"} Apr 16 23:50:22.831244 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.831157 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vwghf" event={"ID":"175aebe0-4cfe-44a6-b935-51dd306c4982","Type":"ContainerStarted","Data":"61ef6cb20c2de0d357a68cc13758361be72138c9a7cd236d1ad642000101f27e"} Apr 16 23:50:22.835478 ip-10-0-134-103 
kubenswrapper[2578]: I0416 23:50:22.835451 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fgzm6" event={"ID":"5ecdc5db-c3e8-477a-8c96-bfa2a4fba192","Type":"ContainerStarted","Data":"b0ebdb7c38bfb476a638ba68efb859e2406bc5dc6c5048bf8dec0e2eaa51a533"}
Apr 16 23:50:22.838319 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.838296 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" event={"ID":"66b93172-3f69-4425-8a38-ff386fb3d1dc","Type":"ContainerStarted","Data":"9886664ac87f01550f0f1710bc41e6c88e07fde499e06b9fd6640c19c6ff5066"}
Apr 16 23:50:22.842732 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.842684 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gxjsh" event={"ID":"4ad19b4d-6e37-4bb2-adb6-743cb3d95223","Type":"ContainerStarted","Data":"fa723681f25c6fc1f2b65147e1b4209935689edf22f2408eaeac57cb469cd8c5"}
Apr 16 23:50:22.844887 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.844833 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xjgf6" event={"ID":"b3dc2fa1-d372-4847-9980-2930ef815461","Type":"ContainerStarted","Data":"86d435d131be7f66b4a13b71ad5db7c052812cb01c207f0a9eba79b747e42eeb"}
Apr 16 23:50:22.847024 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.846971 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2frtz" event={"ID":"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295","Type":"ContainerStarted","Data":"bffca764112b409ffaa67e3db713e833354eafdec934e782da87d1af12d94232"}
Apr 16 23:50:22.849035 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.848969 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" event={"ID":"850ebbd6-b652-46e6-a7c5-5094395d3a52","Type":"ContainerStarted","Data":"cc892f9b97294f65e23e528cd72b2838fb4e377270b209baafb78ce485b4cef4"}
Apr 16 23:50:22.860356 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.859684 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-103.ec2.internal" event={"ID":"136452127e3fd9ba2b831a29f8633a79","Type":"ContainerStarted","Data":"21977326fcc084caceb5427b73dd1cce9338c54af489634bfe89f5bbc4efe52c"}
Apr 16 23:50:22.888271 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:22.888246 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 23:50:23.343570 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:23.343478 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs\") pod \"network-metrics-daemon-4sczf\" (UID: \"bac2109e-d2f6-42aa-94c6-73a79a2012f0\") " pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:23.343726 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:23.343652 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:50:23.343726 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:23.343718 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs podName:bac2109e-d2f6-42aa-94c6-73a79a2012f0 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:25.343698103 +0000 UTC m=+6.118722978 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs") pod "network-metrics-daemon-4sczf" (UID: "bac2109e-d2f6-42aa-94c6-73a79a2012f0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:50:23.444136 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:23.444101 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tj95\" (UniqueName: \"kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95\") pod \"network-check-target-sn7xw\" (UID: \"c0ff83a8-1253-44cb-b3ea-b43cca82f094\") " pod="openshift-network-diagnostics/network-check-target-sn7xw"
Apr 16 23:50:23.444308 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:23.444287 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 23:50:23.444308 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:23.444306 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 23:50:23.444423 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:23.444318 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6tj95 for pod openshift-network-diagnostics/network-check-target-sn7xw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:50:23.444423 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:23.444378 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95 podName:c0ff83a8-1253-44cb-b3ea-b43cca82f094 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:25.444357405 +0000 UTC m=+6.219382284 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-6tj95" (UniqueName: "kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95") pod "network-check-target-sn7xw" (UID: "c0ff83a8-1253-44cb-b3ea-b43cca82f094") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:50:23.814178 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:23.813493 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw"
Apr 16 23:50:23.814178 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:23.813630 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sn7xw" podUID="c0ff83a8-1253-44cb-b3ea-b43cca82f094"
Apr 16 23:50:23.814178 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:23.813997 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:23.814178 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:23.814099 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0"
Apr 16 23:50:23.884330 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:23.883627 2578 generic.go:358] "Generic (PLEG): container finished" podID="8a7393f3c62b6bb14622634c9db6c5c7" containerID="626cf3011f886d049edb4535e33828956c618219f8ac8baaba655cd704ec6a4e" exitCode=0
Apr 16 23:50:23.884330 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:23.884108 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal" event={"ID":"8a7393f3c62b6bb14622634c9db6c5c7","Type":"ContainerDied","Data":"626cf3011f886d049edb4535e33828956c618219f8ac8baaba655cd704ec6a4e"}
Apr 16 23:50:23.901662 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:23.901476 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-103.ec2.internal" podStartSLOduration=2.9014595659999998 podStartE2EDuration="2.901459566s" podCreationTimestamp="2026-04-16 23:50:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:50:22.874631694 +0000 UTC m=+3.649656581" watchObservedRunningTime="2026-04-16 23:50:23.901459566 +0000 UTC m=+4.676484453"
Apr 16 23:50:24.896673 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:24.896383 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal" event={"ID":"8a7393f3c62b6bb14622634c9db6c5c7","Type":"ContainerStarted","Data":"71e73dd1dede10ed8cfa27be649f3a1c0034bfbd8354c948511bbc917c0b53ba"}
Apr 16 23:50:24.909724 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:24.909677 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-103.ec2.internal" podStartSLOduration=3.909660182 podStartE2EDuration="3.909660182s" podCreationTimestamp="2026-04-16 23:50:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:50:24.908712856 +0000 UTC m=+5.683737742" watchObservedRunningTime="2026-04-16 23:50:24.909660182 +0000 UTC m=+5.684685071"
Apr 16 23:50:25.363547 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:25.363452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs\") pod \"network-metrics-daemon-4sczf\" (UID: \"bac2109e-d2f6-42aa-94c6-73a79a2012f0\") " pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:25.363705 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:25.363661 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:50:25.363765 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:25.363731 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs podName:bac2109e-d2f6-42aa-94c6-73a79a2012f0 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:29.363711981 +0000 UTC m=+10.138736856 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs") pod "network-metrics-daemon-4sczf" (UID: "bac2109e-d2f6-42aa-94c6-73a79a2012f0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:50:25.464748 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:25.464103 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tj95\" (UniqueName: \"kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95\") pod \"network-check-target-sn7xw\" (UID: \"c0ff83a8-1253-44cb-b3ea-b43cca82f094\") " pod="openshift-network-diagnostics/network-check-target-sn7xw"
Apr 16 23:50:25.464748 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:25.464298 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 23:50:25.464748 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:25.464320 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 23:50:25.464748 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:25.464334 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6tj95 for pod openshift-network-diagnostics/network-check-target-sn7xw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:50:25.464748 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:25.464394 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95 podName:c0ff83a8-1253-44cb-b3ea-b43cca82f094 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:29.46437455 +0000 UTC m=+10.239399430 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6tj95" (UniqueName: "kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95") pod "network-check-target-sn7xw" (UID: "c0ff83a8-1253-44cb-b3ea-b43cca82f094") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:50:25.813619 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:25.813435 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw"
Apr 16 23:50:25.813619 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:25.813442 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:25.813619 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:25.813569 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sn7xw" podUID="c0ff83a8-1253-44cb-b3ea-b43cca82f094"
Apr 16 23:50:25.813875 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:25.813659 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0"
Apr 16 23:50:27.813432 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:27.813405 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw"
Apr 16 23:50:27.813891 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:27.813514 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sn7xw" podUID="c0ff83a8-1253-44cb-b3ea-b43cca82f094"
Apr 16 23:50:27.813891 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:27.813619 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:27.813891 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:27.813719 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0"
Apr 16 23:50:28.571565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:28.570772 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gzjrj"]
Apr 16 23:50:28.579273 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:28.579246 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:28.579414 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:28.579330 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gzjrj" podUID="4ec875de-2bac-4b6f-82a6-4e9a79ae830e"
Apr 16 23:50:28.688715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:28.688675 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-kubelet-config\") pod \"global-pull-secret-syncer-gzjrj\" (UID: \"4ec875de-2bac-4b6f-82a6-4e9a79ae830e\") " pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:28.688880 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:28.688729 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret\") pod \"global-pull-secret-syncer-gzjrj\" (UID: \"4ec875de-2bac-4b6f-82a6-4e9a79ae830e\") " pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:28.688880 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:28.688795 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-dbus\") pod \"global-pull-secret-syncer-gzjrj\" (UID: \"4ec875de-2bac-4b6f-82a6-4e9a79ae830e\") " pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:28.789945 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:28.789526 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-kubelet-config\") pod \"global-pull-secret-syncer-gzjrj\" (UID: \"4ec875de-2bac-4b6f-82a6-4e9a79ae830e\") " pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:28.789945 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:28.789589 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret\") pod \"global-pull-secret-syncer-gzjrj\" (UID: \"4ec875de-2bac-4b6f-82a6-4e9a79ae830e\") " pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:28.789945 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:28.789613 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-kubelet-config\") pod \"global-pull-secret-syncer-gzjrj\" (UID: \"4ec875de-2bac-4b6f-82a6-4e9a79ae830e\") " pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:28.789945 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:28.789668 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-dbus\") pod \"global-pull-secret-syncer-gzjrj\" (UID: \"4ec875de-2bac-4b6f-82a6-4e9a79ae830e\") " pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:28.789945 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:28.789712 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 23:50:28.789945 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:28.789767 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret podName:4ec875de-2bac-4b6f-82a6-4e9a79ae830e nodeName:}" failed. No retries permitted until 2026-04-16 23:50:29.289750515 +0000 UTC m=+10.064775390 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret") pod "global-pull-secret-syncer-gzjrj" (UID: "4ec875de-2bac-4b6f-82a6-4e9a79ae830e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 23:50:28.789945 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:28.789834 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-dbus\") pod \"global-pull-secret-syncer-gzjrj\" (UID: \"4ec875de-2bac-4b6f-82a6-4e9a79ae830e\") " pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:29.294387 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:29.294351 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret\") pod \"global-pull-secret-syncer-gzjrj\" (UID: \"4ec875de-2bac-4b6f-82a6-4e9a79ae830e\") " pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:29.294836 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:29.294521 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 23:50:29.294836 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:29.294598 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret podName:4ec875de-2bac-4b6f-82a6-4e9a79ae830e nodeName:}" failed. No retries permitted until 2026-04-16 23:50:30.294579034 +0000 UTC m=+11.069603915 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret") pod "global-pull-secret-syncer-gzjrj" (UID: "4ec875de-2bac-4b6f-82a6-4e9a79ae830e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 23:50:29.395182 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:29.395139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs\") pod \"network-metrics-daemon-4sczf\" (UID: \"bac2109e-d2f6-42aa-94c6-73a79a2012f0\") " pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:29.395367 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:29.395338 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:50:29.395431 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:29.395416 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs podName:bac2109e-d2f6-42aa-94c6-73a79a2012f0 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:37.395395143 +0000 UTC m=+18.170420025 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs") pod "network-metrics-daemon-4sczf" (UID: "bac2109e-d2f6-42aa-94c6-73a79a2012f0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:50:29.496157 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:29.496071 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tj95\" (UniqueName: \"kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95\") pod \"network-check-target-sn7xw\" (UID: \"c0ff83a8-1253-44cb-b3ea-b43cca82f094\") " pod="openshift-network-diagnostics/network-check-target-sn7xw"
Apr 16 23:50:29.496325 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:29.496236 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 23:50:29.496325 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:29.496253 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 23:50:29.496325 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:29.496265 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6tj95 for pod openshift-network-diagnostics/network-check-target-sn7xw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:50:29.496325 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:29.496320 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95 podName:c0ff83a8-1253-44cb-b3ea-b43cca82f094 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:37.496302575 +0000 UTC m=+18.271327444 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-6tj95" (UniqueName: "kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95") pod "network-check-target-sn7xw" (UID: "c0ff83a8-1253-44cb-b3ea-b43cca82f094") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:50:29.816352 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:29.814876 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:29.816352 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:29.814901 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw"
Apr 16 23:50:29.816352 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:29.814992 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gzjrj" podUID="4ec875de-2bac-4b6f-82a6-4e9a79ae830e"
Apr 16 23:50:29.816352 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:29.815024 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:29.816352 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:29.815115 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0"
Apr 16 23:50:29.816352 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:29.815213 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sn7xw" podUID="c0ff83a8-1253-44cb-b3ea-b43cca82f094"
Apr 16 23:50:30.303674 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:30.303626 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret\") pod \"global-pull-secret-syncer-gzjrj\" (UID: \"4ec875de-2bac-4b6f-82a6-4e9a79ae830e\") " pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:30.304162 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:30.303833 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 23:50:30.304162 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:30.303894 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret podName:4ec875de-2bac-4b6f-82a6-4e9a79ae830e nodeName:}" failed. No retries permitted until 2026-04-16 23:50:32.303875102 +0000 UTC m=+13.078899987 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret") pod "global-pull-secret-syncer-gzjrj" (UID: "4ec875de-2bac-4b6f-82a6-4e9a79ae830e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 23:50:31.812935 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:31.812851 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:31.813361 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:31.812958 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gzjrj" podUID="4ec875de-2bac-4b6f-82a6-4e9a79ae830e"
Apr 16 23:50:31.813361 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:31.812852 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:31.813361 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:31.813100 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0"
Apr 16 23:50:31.813361 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:31.812852 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw"
Apr 16 23:50:31.813361 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:31.813190 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sn7xw" podUID="c0ff83a8-1253-44cb-b3ea-b43cca82f094"
Apr 16 23:50:32.317437 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:32.317402 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret\") pod \"global-pull-secret-syncer-gzjrj\" (UID: \"4ec875de-2bac-4b6f-82a6-4e9a79ae830e\") " pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:32.317652 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:32.317571 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 23:50:32.317652 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:32.317642 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret podName:4ec875de-2bac-4b6f-82a6-4e9a79ae830e nodeName:}" failed. No retries permitted until 2026-04-16 23:50:36.317621978 +0000 UTC m=+17.092646842 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret") pod "global-pull-secret-syncer-gzjrj" (UID: "4ec875de-2bac-4b6f-82a6-4e9a79ae830e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 23:50:33.813118 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:33.813081 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw"
Apr 16 23:50:33.813659 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:33.813121 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:33.813659 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:33.813081 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:33.813659 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:33.813185 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sn7xw" podUID="c0ff83a8-1253-44cb-b3ea-b43cca82f094"
Apr 16 23:50:33.813659 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:33.813247 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gzjrj" podUID="4ec875de-2bac-4b6f-82a6-4e9a79ae830e"
Apr 16 23:50:33.813659 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:33.813333 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0"
Apr 16 23:50:35.813066 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:35.813015 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:35.813461 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:35.813017 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw"
Apr 16 23:50:35.813461 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:35.813131 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gzjrj" podUID="4ec875de-2bac-4b6f-82a6-4e9a79ae830e"
Apr 16 23:50:35.813461 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:35.813023 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:35.813461 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:35.813225 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sn7xw" podUID="c0ff83a8-1253-44cb-b3ea-b43cca82f094"
Apr 16 23:50:35.813461 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:35.813324 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0"
Apr 16 23:50:36.348360 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:36.348325 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret\") pod \"global-pull-secret-syncer-gzjrj\" (UID: \"4ec875de-2bac-4b6f-82a6-4e9a79ae830e\") " pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:36.348525 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:36.348468 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 23:50:36.348592 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:36.348530 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret podName:4ec875de-2bac-4b6f-82a6-4e9a79ae830e nodeName:}" failed. No retries permitted until 2026-04-16 23:50:44.348510059 +0000 UTC m=+25.123534941 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret") pod "global-pull-secret-syncer-gzjrj" (UID: "4ec875de-2bac-4b6f-82a6-4e9a79ae830e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 23:50:37.455814 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:37.455766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs\") pod \"network-metrics-daemon-4sczf\" (UID: \"bac2109e-d2f6-42aa-94c6-73a79a2012f0\") " pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:37.456228 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:37.455896 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:50:37.456228 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:37.455967 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs podName:bac2109e-d2f6-42aa-94c6-73a79a2012f0 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:53.455944641 +0000 UTC m=+34.230969521 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs") pod "network-metrics-daemon-4sczf" (UID: "bac2109e-d2f6-42aa-94c6-73a79a2012f0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:37.556713 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:37.556673 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tj95\" (UniqueName: \"kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95\") pod \"network-check-target-sn7xw\" (UID: \"c0ff83a8-1253-44cb-b3ea-b43cca82f094\") " pod="openshift-network-diagnostics/network-check-target-sn7xw" Apr 16 23:50:37.556883 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:37.556824 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:50:37.556883 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:37.556843 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:50:37.556883 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:37.556856 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6tj95 for pod openshift-network-diagnostics/network-check-target-sn7xw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:37.557021 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:37.556929 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95 podName:c0ff83a8-1253-44cb-b3ea-b43cca82f094 nodeName:}" failed. 
No retries permitted until 2026-04-16 23:50:53.556909672 +0000 UTC m=+34.331934536 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-6tj95" (UniqueName: "kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95") pod "network-check-target-sn7xw" (UID: "c0ff83a8-1253-44cb-b3ea-b43cca82f094") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:37.813821 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:37.813733 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw" Apr 16 23:50:37.813978 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:37.813876 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gzjrj" Apr 16 23:50:37.813978 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:37.813890 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sn7xw" podUID="c0ff83a8-1253-44cb-b3ea-b43cca82f094" Apr 16 23:50:37.814090 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:37.813988 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gzjrj" podUID="4ec875de-2bac-4b6f-82a6-4e9a79ae830e" Apr 16 23:50:37.814090 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:37.814040 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf" Apr 16 23:50:37.814195 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:37.814105 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0" Apr 16 23:50:39.813685 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.813505 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gzjrj" Apr 16 23:50:39.814400 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.813581 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw" Apr 16 23:50:39.814400 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:39.813845 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sn7xw" podUID="c0ff83a8-1253-44cb-b3ea-b43cca82f094" Apr 16 23:50:39.814400 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:39.813747 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gzjrj" podUID="4ec875de-2bac-4b6f-82a6-4e9a79ae830e" Apr 16 23:50:39.814400 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.813599 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf" Apr 16 23:50:39.814400 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:39.813981 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0" Apr 16 23:50:39.921159 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.921125 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fgzm6" event={"ID":"5ecdc5db-c3e8-477a-8c96-bfa2a4fba192","Type":"ContainerStarted","Data":"af7e7166950bbf512916ee7d48b758a906af915061a2a096dbb946030eab5666"} Apr 16 23:50:39.922959 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.922943 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log" Apr 16 23:50:39.923265 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.923244 2578 generic.go:358] "Generic (PLEG): container finished" podID="66b93172-3f69-4425-8a38-ff386fb3d1dc" containerID="b9b10ee5592269c3c3f2b4e150404e730242ddd22e4514dfdce11048a862caef" exitCode=1 Apr 16 23:50:39.923367 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.923316 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" event={"ID":"66b93172-3f69-4425-8a38-ff386fb3d1dc","Type":"ContainerStarted","Data":"12ef55d9a6eef2945fa90b03d7c53d838a5e456ba061df5baaab1246ded839a9"} Apr 16 23:50:39.923367 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.923340 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" event={"ID":"66b93172-3f69-4425-8a38-ff386fb3d1dc","Type":"ContainerDied","Data":"b9b10ee5592269c3c3f2b4e150404e730242ddd22e4514dfdce11048a862caef"} Apr 16 23:50:39.923367 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.923355 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" event={"ID":"66b93172-3f69-4425-8a38-ff386fb3d1dc","Type":"ContainerStarted","Data":"274f49316e4c1d03ad65fafb470e6d0c467fc867bb0e71282cb49783ff1273a6"} Apr 16 23:50:39.924633 
ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.924600 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gxjsh" event={"ID":"4ad19b4d-6e37-4bb2-adb6-743cb3d95223","Type":"ContainerStarted","Data":"9759b6809584b99e1c46095481ffd840177cb8f919b687b2ba28d7620d5f22cc"} Apr 16 23:50:39.925739 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.925712 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xjgf6" event={"ID":"b3dc2fa1-d372-4847-9980-2930ef815461","Type":"ContainerStarted","Data":"48203dad546a88fccb959cd5fcc3e7ec7e6236a6993665e846c5d844dea5f79f"} Apr 16 23:50:39.926772 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.926750 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2frtz" event={"ID":"3cda9bf7-a2b5-4873-acdd-b7f1f28e5295","Type":"ContainerStarted","Data":"90aba4ad9ed861d46056e3853ceed29ecaf7308e33d26fd4091bbe5620e865bc"} Apr 16 23:50:39.927853 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.927834 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" event={"ID":"850ebbd6-b652-46e6-a7c5-5094395d3a52","Type":"ContainerStarted","Data":"2b22a8e217a4d7530c0572af89a661907e6e5322986be6b78803bee1c72b0fd1"} Apr 16 23:50:39.929032 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.929002 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lxql5" event={"ID":"93bb4659-7080-4d48-8726-23205456bff5","Type":"ContainerStarted","Data":"e4cd77bf478897b2af7e904111f11bfc16237c4a6eee7b27ff9563de56180cb3"} Apr 16 23:50:39.930187 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.930167 2578 generic.go:358] "Generic (PLEG): container finished" podID="6204e842-fd30-4eb6-be92-04b4429887c1" containerID="b67d5c1f06c4b770a573823603a8e31dc5bd4e3e72c09a125469057c9e40997d" exitCode=0 Apr 16 23:50:39.930275 ip-10-0-134-103 
kubenswrapper[2578]: I0416 23:50:39.930195 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5sk6r" event={"ID":"6204e842-fd30-4eb6-be92-04b4429887c1","Type":"ContainerDied","Data":"b67d5c1f06c4b770a573823603a8e31dc5bd4e3e72c09a125469057c9e40997d"} Apr 16 23:50:39.934385 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.934348 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fgzm6" podStartSLOduration=3.997294551 podStartE2EDuration="20.934337661s" podCreationTimestamp="2026-04-16 23:50:19 +0000 UTC" firstStartedPulling="2026-04-16 23:50:22.417810109 +0000 UTC m=+3.192834978" lastFinishedPulling="2026-04-16 23:50:39.35485321 +0000 UTC m=+20.129878088" observedRunningTime="2026-04-16 23:50:39.933972068 +0000 UTC m=+20.708996954" watchObservedRunningTime="2026-04-16 23:50:39.934337661 +0000 UTC m=+20.709362546" Apr 16 23:50:39.977968 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.977923 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xjgf6" podStartSLOduration=4.034986682 podStartE2EDuration="20.977908933s" podCreationTimestamp="2026-04-16 23:50:19 +0000 UTC" firstStartedPulling="2026-04-16 23:50:22.411960907 +0000 UTC m=+3.186985770" lastFinishedPulling="2026-04-16 23:50:39.354883145 +0000 UTC m=+20.129908021" observedRunningTime="2026-04-16 23:50:39.965964583 +0000 UTC m=+20.740989472" watchObservedRunningTime="2026-04-16 23:50:39.977908933 +0000 UTC m=+20.752933818" Apr 16 23:50:39.996820 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:39.996775 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gxjsh" podStartSLOduration=11.996630049 podStartE2EDuration="20.996759395s" podCreationTimestamp="2026-04-16 23:50:19 +0000 UTC" firstStartedPulling="2026-04-16 23:50:22.414170913 +0000 UTC m=+3.189195778" 
lastFinishedPulling="2026-04-16 23:50:31.414300249 +0000 UTC m=+12.189325124" observedRunningTime="2026-04-16 23:50:39.978945839 +0000 UTC m=+20.753970725" watchObservedRunningTime="2026-04-16 23:50:39.996759395 +0000 UTC m=+20.771784272" Apr 16 23:50:40.012135 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:40.012093 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lxql5" podStartSLOduration=4.073749774 podStartE2EDuration="21.012078246s" podCreationTimestamp="2026-04-16 23:50:19 +0000 UTC" firstStartedPulling="2026-04-16 23:50:22.416885187 +0000 UTC m=+3.191910054" lastFinishedPulling="2026-04-16 23:50:39.355213662 +0000 UTC m=+20.130238526" observedRunningTime="2026-04-16 23:50:40.011885797 +0000 UTC m=+20.786910683" watchObservedRunningTime="2026-04-16 23:50:40.012078246 +0000 UTC m=+20.787103134" Apr 16 23:50:40.012590 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:40.012554 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2frtz" podStartSLOduration=4.028259691 podStartE2EDuration="21.012532353s" podCreationTimestamp="2026-04-16 23:50:19 +0000 UTC" firstStartedPulling="2026-04-16 23:50:22.408509664 +0000 UTC m=+3.183534538" lastFinishedPulling="2026-04-16 23:50:39.39278232 +0000 UTC m=+20.167807200" observedRunningTime="2026-04-16 23:50:39.99689875 +0000 UTC m=+20.771923629" watchObservedRunningTime="2026-04-16 23:50:40.012532353 +0000 UTC m=+20.787557241" Apr 16 23:50:40.934705 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:40.934518 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log" Apr 16 23:50:40.935127 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:40.935100 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" 
event={"ID":"66b93172-3f69-4425-8a38-ff386fb3d1dc","Type":"ContainerStarted","Data":"bcf19fb55b44ae728ddd364c9710759e4678f76f992c5cc3a1181e1f9322375a"} Apr 16 23:50:40.935194 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:40.935140 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" event={"ID":"66b93172-3f69-4425-8a38-ff386fb3d1dc","Type":"ContainerStarted","Data":"00bd1a1c7e2b01c589063e722606618e6c50656c0aa348195722d26af17ce42b"} Apr 16 23:50:40.935194 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:40.935154 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" event={"ID":"66b93172-3f69-4425-8a38-ff386fb3d1dc","Type":"ContainerStarted","Data":"e6213b60712ec0f2ea227f9ff0f0ba18716f21cccff189ea7faa4d3602c295ee"} Apr 16 23:50:40.972887 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:40.972855 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 23:50:41.634309 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:41.634278 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fgzm6" Apr 16 23:50:41.634910 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:41.634892 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fgzm6" Apr 16 23:50:41.789066 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:41.788968 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T23:50:40.972872738Z","UUID":"ff52a255-555a-4f18-999d-e6cfdbd011d8","Handler":null,"Name":"","Endpoint":""} Apr 16 23:50:41.792314 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:41.792280 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with 
name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 23:50:41.792314 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:41.792322 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 23:50:41.812823 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:41.812784 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gzjrj" Apr 16 23:50:41.812943 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:41.812853 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf" Apr 16 23:50:41.812943 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:41.812854 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw" Apr 16 23:50:41.813069 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:41.812965 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gzjrj" podUID="4ec875de-2bac-4b6f-82a6-4e9a79ae830e" Apr 16 23:50:41.813147 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:41.813121 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0" Apr 16 23:50:41.813224 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:41.813210 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sn7xw" podUID="c0ff83a8-1253-44cb-b3ea-b43cca82f094" Apr 16 23:50:41.938211 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:41.938145 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" event={"ID":"850ebbd6-b652-46e6-a7c5-5094395d3a52","Type":"ContainerStarted","Data":"2618211bab607be47cadadac97ba4da9e3e206a051116cf35a20237083fa8bf6"} Apr 16 23:50:41.939883 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:41.939842 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vwghf" event={"ID":"175aebe0-4cfe-44a6-b935-51dd306c4982","Type":"ContainerStarted","Data":"ba55ea311af834ff3ea461c9c82a4026656cd1b5907d5780b01ffb834580c5ba"} Apr 16 23:50:41.940356 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:41.940290 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fgzm6" Apr 16 23:50:41.940675 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:41.940653 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fgzm6" Apr 16 23:50:41.953747 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:41.953696 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-vwghf" podStartSLOduration=6.011617269 podStartE2EDuration="22.953682984s" 
podCreationTimestamp="2026-04-16 23:50:19 +0000 UTC" firstStartedPulling="2026-04-16 23:50:22.412824875 +0000 UTC m=+3.187849751" lastFinishedPulling="2026-04-16 23:50:39.354890598 +0000 UTC m=+20.129915466" observedRunningTime="2026-04-16 23:50:41.953257112 +0000 UTC m=+22.728281997" watchObservedRunningTime="2026-04-16 23:50:41.953682984 +0000 UTC m=+22.728707871" Apr 16 23:50:42.944518 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:42.944485 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log" Apr 16 23:50:42.945076 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:42.944878 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" event={"ID":"66b93172-3f69-4425-8a38-ff386fb3d1dc","Type":"ContainerStarted","Data":"c50eda881f9f29ac37deabd35dda6f1d5b49d3547e0190a5fbc1c3cc3499504a"} Apr 16 23:50:42.946857 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:42.946830 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" event={"ID":"850ebbd6-b652-46e6-a7c5-5094395d3a52","Type":"ContainerStarted","Data":"b2d35f11087b87ed20494ecbac279114e7b94421be40b518949f50af1002684a"} Apr 16 23:50:42.964688 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:42.964652 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ttgjm" podStartSLOduration=4.281445389 podStartE2EDuration="23.964641198s" podCreationTimestamp="2026-04-16 23:50:19 +0000 UTC" firstStartedPulling="2026-04-16 23:50:22.405875072 +0000 UTC m=+3.180899950" lastFinishedPulling="2026-04-16 23:50:42.089070891 +0000 UTC m=+22.864095759" observedRunningTime="2026-04-16 23:50:42.964291501 +0000 UTC m=+23.739316390" watchObservedRunningTime="2026-04-16 23:50:42.964641198 +0000 UTC m=+23.739666083" Apr 16 23:50:43.813507 
ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:43.813475 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw" Apr 16 23:50:43.813701 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:43.813476 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gzjrj" Apr 16 23:50:43.813701 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:43.813604 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sn7xw" podUID="c0ff83a8-1253-44cb-b3ea-b43cca82f094" Apr 16 23:50:43.813701 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:43.813475 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf" Apr 16 23:50:43.813701 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:43.813689 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gzjrj" podUID="4ec875de-2bac-4b6f-82a6-4e9a79ae830e" Apr 16 23:50:43.813902 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:43.813778 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0" Apr 16 23:50:44.414303 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:44.414274 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret\") pod \"global-pull-secret-syncer-gzjrj\" (UID: \"4ec875de-2bac-4b6f-82a6-4e9a79ae830e\") " pod="kube-system/global-pull-secret-syncer-gzjrj" Apr 16 23:50:44.414628 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:44.414442 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 23:50:44.414628 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:44.414519 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret podName:4ec875de-2bac-4b6f-82a6-4e9a79ae830e nodeName:}" failed. No retries permitted until 2026-04-16 23:51:00.414499534 +0000 UTC m=+41.189524405 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret") pod "global-pull-secret-syncer-gzjrj" (UID: "4ec875de-2bac-4b6f-82a6-4e9a79ae830e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 23:50:44.951939 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:44.951719 2578 generic.go:358] "Generic (PLEG): container finished" podID="6204e842-fd30-4eb6-be92-04b4429887c1" containerID="c1ebb1e105466581be755c9f93d30b4df6b1dc935e8e7879a6bba978e47cd108" exitCode=0
Apr 16 23:50:44.952106 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:44.951808 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5sk6r" event={"ID":"6204e842-fd30-4eb6-be92-04b4429887c1","Type":"ContainerDied","Data":"c1ebb1e105466581be755c9f93d30b4df6b1dc935e8e7879a6bba978e47cd108"}
Apr 16 23:50:44.957940 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:44.957920 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log"
Apr 16 23:50:44.958220 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:44.958199 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" event={"ID":"66b93172-3f69-4425-8a38-ff386fb3d1dc","Type":"ContainerStarted","Data":"4a0d1e05a5e1447017cf6891d79f7e469e6ebdaee575c551b3aff1dabee7ee32"}
Apr 16 23:50:44.958429 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:44.958412 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:44.958518 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:44.958503 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:44.958614 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:44.958598 2578 scope.go:117] "RemoveContainer" containerID="b9b10ee5592269c3c3f2b4e150404e730242ddd22e4514dfdce11048a862caef"
Apr 16 23:50:44.974343 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:44.974328 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:45.812965 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:45.812944 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:45.813272 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:45.812944 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw"
Apr 16 23:50:45.813272 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:45.813038 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0"
Apr 16 23:50:45.813272 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:45.813065 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:45.813272 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:45.813182 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sn7xw" podUID="c0ff83a8-1253-44cb-b3ea-b43cca82f094"
Apr 16 23:50:45.813401 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:45.813292 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gzjrj" podUID="4ec875de-2bac-4b6f-82a6-4e9a79ae830e"
Apr 16 23:50:45.962760 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:45.962469 2578 generic.go:358] "Generic (PLEG): container finished" podID="6204e842-fd30-4eb6-be92-04b4429887c1" containerID="e2ed0cccfd3d58de5f709807a49e0b999cfe0f5c9bd5136d778a042e0ff09eda" exitCode=0
Apr 16 23:50:45.962760 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:45.962565 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5sk6r" event={"ID":"6204e842-fd30-4eb6-be92-04b4429887c1","Type":"ContainerDied","Data":"e2ed0cccfd3d58de5f709807a49e0b999cfe0f5c9bd5136d778a042e0ff09eda"}
Apr 16 23:50:45.966666 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:45.966644 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log"
Apr 16 23:50:45.967296 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:45.967268 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" event={"ID":"66b93172-3f69-4425-8a38-ff386fb3d1dc","Type":"ContainerStarted","Data":"7246bdbead69ac23b1230d6c39aef22e5454da39f3ab872c33654ea46772492e"}
Apr 16 23:50:45.967663 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:45.967629 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:45.981180 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:45.981163 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f"
Apr 16 23:50:46.423787 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:46.423633 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" podStartSLOduration=10.393326111 podStartE2EDuration="27.423614571s" podCreationTimestamp="2026-04-16 23:50:19 +0000 UTC" firstStartedPulling="2026-04-16 23:50:22.415530059 +0000 UTC m=+3.190554925" lastFinishedPulling="2026-04-16 23:50:39.445818507 +0000 UTC m=+20.220843385" observedRunningTime="2026-04-16 23:50:46.008167761 +0000 UTC m=+26.783192646" watchObservedRunningTime="2026-04-16 23:50:46.423614571 +0000 UTC m=+27.198639458"
Apr 16 23:50:46.424569 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:46.424523 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gzjrj"]
Apr 16 23:50:46.424699 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:46.424669 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:46.424807 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:46.424781 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gzjrj" podUID="4ec875de-2bac-4b6f-82a6-4e9a79ae830e"
Apr 16 23:50:46.427033 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:46.427006 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sn7xw"]
Apr 16 23:50:46.427133 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:46.427118 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw"
Apr 16 23:50:46.427249 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:46.427225 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sn7xw" podUID="c0ff83a8-1253-44cb-b3ea-b43cca82f094"
Apr 16 23:50:46.427566 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:46.427523 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4sczf"]
Apr 16 23:50:46.427666 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:46.427653 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:46.427805 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:46.427760 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0"
Apr 16 23:50:46.970616 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:46.970524 2578 generic.go:358] "Generic (PLEG): container finished" podID="6204e842-fd30-4eb6-be92-04b4429887c1" containerID="a01e8c738fc173163a07fd884f58b572c84926e3dc4bd02660233c2ded49077b" exitCode=0
Apr 16 23:50:46.970616 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:46.970600 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5sk6r" event={"ID":"6204e842-fd30-4eb6-be92-04b4429887c1","Type":"ContainerDied","Data":"a01e8c738fc173163a07fd884f58b572c84926e3dc4bd02660233c2ded49077b"}
Apr 16 23:50:47.813194 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:47.813162 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:47.813365 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:47.813172 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:47.813365 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:47.813266 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gzjrj" podUID="4ec875de-2bac-4b6f-82a6-4e9a79ae830e"
Apr 16 23:50:47.813365 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:47.813170 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw"
Apr 16 23:50:47.813532 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:47.813375 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0"
Apr 16 23:50:47.813532 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:47.813421 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sn7xw" podUID="c0ff83a8-1253-44cb-b3ea-b43cca82f094"
Apr 16 23:50:49.813955 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:49.813887 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:49.813955 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:49.813906 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw"
Apr 16 23:50:49.814583 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:49.813993 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:49.814583 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:49.813996 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sn7xw" podUID="c0ff83a8-1253-44cb-b3ea-b43cca82f094"
Apr 16 23:50:49.814583 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:49.814118 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0"
Apr 16 23:50:49.814583 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:49.814197 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gzjrj" podUID="4ec875de-2bac-4b6f-82a6-4e9a79ae830e"
Apr 16 23:50:51.813363 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:51.813333 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw"
Apr 16 23:50:51.813829 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:51.813334 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:51.813829 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:51.813445 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sn7xw" podUID="c0ff83a8-1253-44cb-b3ea-b43cca82f094"
Apr 16 23:50:51.813829 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:51.813339 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:51.813829 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:51.813561 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gzjrj" podUID="4ec875de-2bac-4b6f-82a6-4e9a79ae830e"
Apr 16 23:50:51.813829 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:51.813661 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0"
Apr 16 23:50:52.575219 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.575044 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-103.ec2.internal" event="NodeReady"
Apr 16 23:50:52.575380 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.575289 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 23:50:52.613414 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.613392 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xcndm"]
Apr 16 23:50:52.630128 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.630013 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-w4vbz"]
Apr 16 23:50:52.630245 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.630176 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xcndm"
Apr 16 23:50:52.632567 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.632523 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 23:50:52.632567 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.632555 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 23:50:52.632845 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.632826 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jh2wm\""
Apr 16 23:50:52.642181 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.642157 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xcndm"]
Apr 16 23:50:52.642281 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.642191 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w4vbz"]
Apr 16 23:50:52.642343 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.642286 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w4vbz"
Apr 16 23:50:52.644917 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.644693 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 23:50:52.644917 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.644705 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 23:50:52.644917 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.644705 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6grx4\""
Apr 16 23:50:52.644917 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.644727 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 23:50:52.782032 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.781997 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm"
Apr 16 23:50:52.782186 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.782073 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e003096-f002-43cb-9237-3811ca14f285-tmp-dir\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm"
Apr 16 23:50:52.782186 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.782134 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bkgw\" (UniqueName: \"kubernetes.io/projected/48793279-1866-40db-8e3c-e2c46e4d6f6d-kube-api-access-2bkgw\") pod \"ingress-canary-w4vbz\" (UID: \"48793279-1866-40db-8e3c-e2c46e4d6f6d\") " pod="openshift-ingress-canary/ingress-canary-w4vbz"
Apr 16 23:50:52.782186 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.782171 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert\") pod \"ingress-canary-w4vbz\" (UID: \"48793279-1866-40db-8e3c-e2c46e4d6f6d\") " pod="openshift-ingress-canary/ingress-canary-w4vbz"
Apr 16 23:50:52.782320 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.782214 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e003096-f002-43cb-9237-3811ca14f285-config-volume\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm"
Apr 16 23:50:52.782320 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.782241 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chjvj\" (UniqueName: \"kubernetes.io/projected/2e003096-f002-43cb-9237-3811ca14f285-kube-api-access-chjvj\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm"
Apr 16 23:50:52.882592 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.882562 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e003096-f002-43cb-9237-3811ca14f285-tmp-dir\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm"
Apr 16 23:50:52.882922 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.882639 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bkgw\" (UniqueName: \"kubernetes.io/projected/48793279-1866-40db-8e3c-e2c46e4d6f6d-kube-api-access-2bkgw\") pod \"ingress-canary-w4vbz\" (UID: \"48793279-1866-40db-8e3c-e2c46e4d6f6d\") " pod="openshift-ingress-canary/ingress-canary-w4vbz"
Apr 16 23:50:52.882922 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.882674 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert\") pod \"ingress-canary-w4vbz\" (UID: \"48793279-1866-40db-8e3c-e2c46e4d6f6d\") " pod="openshift-ingress-canary/ingress-canary-w4vbz"
Apr 16 23:50:52.882922 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.882720 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e003096-f002-43cb-9237-3811ca14f285-config-volume\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm"
Apr 16 23:50:52.882922 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.882745 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chjvj\" (UniqueName: \"kubernetes.io/projected/2e003096-f002-43cb-9237-3811ca14f285-kube-api-access-chjvj\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm"
Apr 16 23:50:52.882922 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.882781 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm"
Apr 16 23:50:52.882922 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:52.882818 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:50:52.882922 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.882871 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e003096-f002-43cb-9237-3811ca14f285-tmp-dir\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm"
Apr 16 23:50:52.882922 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:52.882899 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:50:52.882922 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:52.882919 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert podName:48793279-1866-40db-8e3c-e2c46e4d6f6d nodeName:}" failed. No retries permitted until 2026-04-16 23:50:53.382883207 +0000 UTC m=+34.157908085 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert") pod "ingress-canary-w4vbz" (UID: "48793279-1866-40db-8e3c-e2c46e4d6f6d") : secret "canary-serving-cert" not found
Apr 16 23:50:52.883279 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:52.882953 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls podName:2e003096-f002-43cb-9237-3811ca14f285 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:53.382939142 +0000 UTC m=+34.157964014 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls") pod "dns-default-xcndm" (UID: "2e003096-f002-43cb-9237-3811ca14f285") : secret "dns-default-metrics-tls" not found
Apr 16 23:50:52.883318 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.883286 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e003096-f002-43cb-9237-3811ca14f285-config-volume\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm"
Apr 16 23:50:52.892352 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.892327 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chjvj\" (UniqueName: \"kubernetes.io/projected/2e003096-f002-43cb-9237-3811ca14f285-kube-api-access-chjvj\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm"
Apr 16 23:50:52.892587 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.892572 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bkgw\" (UniqueName: \"kubernetes.io/projected/48793279-1866-40db-8e3c-e2c46e4d6f6d-kube-api-access-2bkgw\") pod \"ingress-canary-w4vbz\" (UID: \"48793279-1866-40db-8e3c-e2c46e4d6f6d\") " pod="openshift-ingress-canary/ingress-canary-w4vbz"
Apr 16 23:50:52.983095 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.983072 2578 generic.go:358] "Generic (PLEG): container finished" podID="6204e842-fd30-4eb6-be92-04b4429887c1" containerID="f0cbf2c9c29b8374a3af3d741d98101aa0df70e8b9d628a2abe69bb167e04b8c" exitCode=0
Apr 16 23:50:52.983221 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:52.983109 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5sk6r" event={"ID":"6204e842-fd30-4eb6-be92-04b4429887c1","Type":"ContainerDied","Data":"f0cbf2c9c29b8374a3af3d741d98101aa0df70e8b9d628a2abe69bb167e04b8c"}
Apr 16 23:50:53.386471 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:53.386403 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert\") pod \"ingress-canary-w4vbz\" (UID: \"48793279-1866-40db-8e3c-e2c46e4d6f6d\") " pod="openshift-ingress-canary/ingress-canary-w4vbz"
Apr 16 23:50:53.386471 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:53.386453 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm"
Apr 16 23:50:53.386723 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:53.386534 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:50:53.386723 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:53.386556 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:50:53.386723 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:53.386604 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls podName:2e003096-f002-43cb-9237-3811ca14f285 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:54.386590787 +0000 UTC m=+35.161615651 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls") pod "dns-default-xcndm" (UID: "2e003096-f002-43cb-9237-3811ca14f285") : secret "dns-default-metrics-tls" not found
Apr 16 23:50:53.386723 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:53.386617 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert podName:48793279-1866-40db-8e3c-e2c46e4d6f6d nodeName:}" failed. No retries permitted until 2026-04-16 23:50:54.386611105 +0000 UTC m=+35.161635969 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert") pod "ingress-canary-w4vbz" (UID: "48793279-1866-40db-8e3c-e2c46e4d6f6d") : secret "canary-serving-cert" not found
Apr 16 23:50:53.487278 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:53.487258 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs\") pod \"network-metrics-daemon-4sczf\" (UID: \"bac2109e-d2f6-42aa-94c6-73a79a2012f0\") " pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:53.487363 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:53.487351 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:50:53.487402 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:53.487396 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs podName:bac2109e-d2f6-42aa-94c6-73a79a2012f0 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:25.48738059 +0000 UTC m=+66.262405472 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs") pod "network-metrics-daemon-4sczf" (UID: "bac2109e-d2f6-42aa-94c6-73a79a2012f0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:50:53.588216 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:53.588195 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tj95\" (UniqueName: \"kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95\") pod \"network-check-target-sn7xw\" (UID: \"c0ff83a8-1253-44cb-b3ea-b43cca82f094\") " pod="openshift-network-diagnostics/network-check-target-sn7xw"
Apr 16 23:50:53.588311 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:53.588304 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 23:50:53.588348 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:53.588316 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 23:50:53.588348 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:53.588325 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6tj95 for pod openshift-network-diagnostics/network-check-target-sn7xw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:50:53.588411 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:53.588361 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95 podName:c0ff83a8-1253-44cb-b3ea-b43cca82f094 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:25.58835066 +0000 UTC m=+66.363375524 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-6tj95" (UniqueName: "kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95") pod "network-check-target-sn7xw" (UID: "c0ff83a8-1253-44cb-b3ea-b43cca82f094") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:50:53.812924 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:53.812901 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf"
Apr 16 23:50:53.813054 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:53.812908 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw"
Apr 16 23:50:53.813218 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:53.812908 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gzjrj"
Apr 16 23:50:53.815526 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:53.815502 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 23:50:53.815640 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:53.815528 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 23:50:53.815640 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:53.815549 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 23:50:53.816705 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:53.816686 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 23:50:53.816817 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:53.816799 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x54vt\""
Apr 16 23:50:53.816871 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:53.816855 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xjj7v\""
Apr 16 23:50:53.990699 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:53.990670 2578 generic.go:358] "Generic (PLEG): container finished" podID="6204e842-fd30-4eb6-be92-04b4429887c1" containerID="f088100442cf0de5b440dfbec877e6d02ddb1f30b0b1abb1fdd8115a58d099d5" exitCode=0
Apr 16 23:50:53.991004 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:53.990729 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5sk6r" event={"ID":"6204e842-fd30-4eb6-be92-04b4429887c1","Type":"ContainerDied","Data":"f088100442cf0de5b440dfbec877e6d02ddb1f30b0b1abb1fdd8115a58d099d5"}
Apr 16 23:50:54.393303 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:54.393242 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm"
Apr 16 23:50:54.393303 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:54.393299 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert\") pod \"ingress-canary-w4vbz\" (UID: \"48793279-1866-40db-8e3c-e2c46e4d6f6d\") " pod="openshift-ingress-canary/ingress-canary-w4vbz"
Apr 16 23:50:54.393440 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:54.393373 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:50:54.393440 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:54.393383 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:50:54.393440 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:54.393429 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert podName:48793279-1866-40db-8e3c-e2c46e4d6f6d nodeName:}" failed. No retries permitted until 2026-04-16 23:50:56.393415877 +0000 UTC m=+37.168440742 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert") pod "ingress-canary-w4vbz" (UID: "48793279-1866-40db-8e3c-e2c46e4d6f6d") : secret "canary-serving-cert" not found
Apr 16 23:50:54.393440 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:54.393441 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls podName:2e003096-f002-43cb-9237-3811ca14f285 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:56.393436208 +0000 UTC m=+37.168461072 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls") pod "dns-default-xcndm" (UID: "2e003096-f002-43cb-9237-3811ca14f285") : secret "dns-default-metrics-tls" not found
Apr 16 23:50:54.995424 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:54.995397 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5sk6r" event={"ID":"6204e842-fd30-4eb6-be92-04b4429887c1","Type":"ContainerStarted","Data":"d36e0409c19e0b929d45b0275f7b74098c23332b671725ac59c87463a1d5ea2e"}
Apr 16 23:50:55.019790 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:55.019737 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5sk6r" podStartSLOduration=5.846264262 podStartE2EDuration="36.019724223s" podCreationTimestamp="2026-04-16 23:50:19 +0000 UTC" firstStartedPulling="2026-04-16 23:50:22.414658061 +0000 UTC m=+3.189682925" lastFinishedPulling="2026-04-16 23:50:52.588118021 +0000 UTC m=+33.363142886" observedRunningTime="2026-04-16 23:50:55.019385275 +0000 UTC m=+35.794410161" watchObservedRunningTime="2026-04-16 23:50:55.019724223 +0000 UTC m=+35.794749107"
Apr 16 23:50:56.407473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:56.407440 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert\") pod \"ingress-canary-w4vbz\" (UID: \"48793279-1866-40db-8e3c-e2c46e4d6f6d\") " pod="openshift-ingress-canary/ingress-canary-w4vbz" Apr 16 23:50:56.407878 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:50:56.407487 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm" Apr 16 23:50:56.407878 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:56.407603 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:50:56.407878 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:56.407609 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:50:56.407878 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:56.407653 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls podName:2e003096-f002-43cb-9237-3811ca14f285 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:00.407639913 +0000 UTC m=+41.182664778 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls") pod "dns-default-xcndm" (UID: "2e003096-f002-43cb-9237-3811ca14f285") : secret "dns-default-metrics-tls" not found Apr 16 23:50:56.407878 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:50:56.407667 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert podName:48793279-1866-40db-8e3c-e2c46e4d6f6d nodeName:}" failed. No retries permitted until 2026-04-16 23:51:00.407660536 +0000 UTC m=+41.182685399 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert") pod "ingress-canary-w4vbz" (UID: "48793279-1866-40db-8e3c-e2c46e4d6f6d") : secret "canary-serving-cert" not found Apr 16 23:51:00.434685 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:00.434640 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret\") pod \"global-pull-secret-syncer-gzjrj\" (UID: \"4ec875de-2bac-4b6f-82a6-4e9a79ae830e\") " pod="kube-system/global-pull-secret-syncer-gzjrj" Apr 16 23:51:00.434685 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:00.434688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert\") pod \"ingress-canary-w4vbz\" (UID: \"48793279-1866-40db-8e3c-e2c46e4d6f6d\") " pod="openshift-ingress-canary/ingress-canary-w4vbz" Apr 16 23:51:00.435096 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:00.434721 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls\") pod \"dns-default-xcndm\" (UID: 
\"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm" Apr 16 23:51:00.435096 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:00.434805 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:51:00.435096 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:00.434847 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:51:00.435096 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:00.434854 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls podName:2e003096-f002-43cb-9237-3811ca14f285 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:08.434841265 +0000 UTC m=+49.209866130 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls") pod "dns-default-xcndm" (UID: "2e003096-f002-43cb-9237-3811ca14f285") : secret "dns-default-metrics-tls" not found Apr 16 23:51:00.435096 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:00.434907 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert podName:48793279-1866-40db-8e3c-e2c46e4d6f6d nodeName:}" failed. No retries permitted until 2026-04-16 23:51:08.434891542 +0000 UTC m=+49.209916407 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert") pod "ingress-canary-w4vbz" (UID: "48793279-1866-40db-8e3c-e2c46e4d6f6d") : secret "canary-serving-cert" not found Apr 16 23:51:00.437724 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:00.437702 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ec875de-2bac-4b6f-82a6-4e9a79ae830e-original-pull-secret\") pod \"global-pull-secret-syncer-gzjrj\" (UID: \"4ec875de-2bac-4b6f-82a6-4e9a79ae830e\") " pod="kube-system/global-pull-secret-syncer-gzjrj" Apr 16 23:51:00.732222 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:00.732127 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gzjrj" Apr 16 23:51:00.852049 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:00.851984 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gzjrj"] Apr 16 23:51:00.856007 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:51:00.855977 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ec875de_2bac_4b6f_82a6_4e9a79ae830e.slice/crio-81ba0800d62e90e8efd3a5af41f74822d438b5507586981cdfb0d7ff22a3c50d WatchSource:0}: Error finding container 81ba0800d62e90e8efd3a5af41f74822d438b5507586981cdfb0d7ff22a3c50d: Status 404 returned error can't find the container with id 81ba0800d62e90e8efd3a5af41f74822d438b5507586981cdfb0d7ff22a3c50d Apr 16 23:51:01.007260 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:01.007217 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gzjrj" event={"ID":"4ec875de-2bac-4b6f-82a6-4e9a79ae830e","Type":"ContainerStarted","Data":"81ba0800d62e90e8efd3a5af41f74822d438b5507586981cdfb0d7ff22a3c50d"} Apr 16 23:51:06.018114 ip-10-0-134-103 
kubenswrapper[2578]: I0416 23:51:06.018078 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gzjrj" event={"ID":"4ec875de-2bac-4b6f-82a6-4e9a79ae830e","Type":"ContainerStarted","Data":"6a3232b4fbfe5643af862d89846d51ecd1aaddeecfa86c824511c54d5e6e5e9b"} Apr 16 23:51:06.031605 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:06.031527 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gzjrj" podStartSLOduration=33.904452957 podStartE2EDuration="38.031515215s" podCreationTimestamp="2026-04-16 23:50:28 +0000 UTC" firstStartedPulling="2026-04-16 23:51:00.857570603 +0000 UTC m=+41.632595467" lastFinishedPulling="2026-04-16 23:51:04.984632857 +0000 UTC m=+45.759657725" observedRunningTime="2026-04-16 23:51:06.031154969 +0000 UTC m=+46.806179833" watchObservedRunningTime="2026-04-16 23:51:06.031515215 +0000 UTC m=+46.806540100" Apr 16 23:51:08.494714 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:08.494678 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm" Apr 16 23:51:08.495059 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:08.494738 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert\") pod \"ingress-canary-w4vbz\" (UID: \"48793279-1866-40db-8e3c-e2c46e4d6f6d\") " pod="openshift-ingress-canary/ingress-canary-w4vbz" Apr 16 23:51:08.495059 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:08.494817 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:51:08.495059 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:08.494819 
2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:51:08.495059 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:08.494868 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert podName:48793279-1866-40db-8e3c-e2c46e4d6f6d nodeName:}" failed. No retries permitted until 2026-04-16 23:51:24.494853168 +0000 UTC m=+65.269878033 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert") pod "ingress-canary-w4vbz" (UID: "48793279-1866-40db-8e3c-e2c46e4d6f6d") : secret "canary-serving-cert" not found Apr 16 23:51:08.495059 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:08.494881 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls podName:2e003096-f002-43cb-9237-3811ca14f285 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:24.494875783 +0000 UTC m=+65.269900646 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls") pod "dns-default-xcndm" (UID: "2e003096-f002-43cb-9237-3811ca14f285") : secret "dns-default-metrics-tls" not found Apr 16 23:51:17.983259 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:17.983233 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r4p9f" Apr 16 23:51:24.495846 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:24.495802 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert\") pod \"ingress-canary-w4vbz\" (UID: \"48793279-1866-40db-8e3c-e2c46e4d6f6d\") " pod="openshift-ingress-canary/ingress-canary-w4vbz" Apr 16 23:51:24.495846 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:24.495862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm" Apr 16 23:51:24.496377 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:24.495948 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:51:24.496377 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:24.495962 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:51:24.496377 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:24.496006 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls podName:2e003096-f002-43cb-9237-3811ca14f285 nodeName:}" failed. 
No retries permitted until 2026-04-16 23:51:56.495992485 +0000 UTC m=+97.271017350 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls") pod "dns-default-xcndm" (UID: "2e003096-f002-43cb-9237-3811ca14f285") : secret "dns-default-metrics-tls" not found Apr 16 23:51:24.496377 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:24.496053 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert podName:48793279-1866-40db-8e3c-e2c46e4d6f6d nodeName:}" failed. No retries permitted until 2026-04-16 23:51:56.496035829 +0000 UTC m=+97.271060705 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert") pod "ingress-canary-w4vbz" (UID: "48793279-1866-40db-8e3c-e2c46e4d6f6d") : secret "canary-serving-cert" not found Apr 16 23:51:25.501443 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:25.501396 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs\") pod \"network-metrics-daemon-4sczf\" (UID: \"bac2109e-d2f6-42aa-94c6-73a79a2012f0\") " pod="openshift-multus/network-metrics-daemon-4sczf" Apr 16 23:51:25.503952 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:25.503925 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 23:51:25.511804 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:25.511789 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 23:51:25.511887 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:25.511842 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs podName:bac2109e-d2f6-42aa-94c6-73a79a2012f0 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:29.511825819 +0000 UTC m=+130.286850685 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs") pod "network-metrics-daemon-4sczf" (UID: "bac2109e-d2f6-42aa-94c6-73a79a2012f0") : secret "metrics-daemon-secret" not found Apr 16 23:51:25.602544 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:25.602511 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tj95\" (UniqueName: \"kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95\") pod \"network-check-target-sn7xw\" (UID: \"c0ff83a8-1253-44cb-b3ea-b43cca82f094\") " pod="openshift-network-diagnostics/network-check-target-sn7xw" Apr 16 23:51:25.604984 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:25.604968 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 23:51:25.614923 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:25.614904 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 23:51:25.626152 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:25.626132 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tj95\" (UniqueName: \"kubernetes.io/projected/c0ff83a8-1253-44cb-b3ea-b43cca82f094-kube-api-access-6tj95\") pod \"network-check-target-sn7xw\" (UID: \"c0ff83a8-1253-44cb-b3ea-b43cca82f094\") " pod="openshift-network-diagnostics/network-check-target-sn7xw" Apr 16 23:51:25.629895 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:25.629876 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x54vt\"" Apr 16 23:51:25.638756 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:25.638739 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sn7xw" Apr 16 23:51:25.758732 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:25.758666 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sn7xw"] Apr 16 23:51:25.762344 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:51:25.762315 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0ff83a8_1253_44cb_b3ea_b43cca82f094.slice/crio-e94190e66fe44ede5bccb8b57c58513d0d20d64ef2b6ef78ef472bb021e1d310 WatchSource:0}: Error finding container e94190e66fe44ede5bccb8b57c58513d0d20d64ef2b6ef78ef472bb021e1d310: Status 404 returned error can't find the container with id e94190e66fe44ede5bccb8b57c58513d0d20d64ef2b6ef78ef472bb021e1d310 Apr 16 23:51:26.057208 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:26.057128 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sn7xw" event={"ID":"c0ff83a8-1253-44cb-b3ea-b43cca82f094","Type":"ContainerStarted","Data":"e94190e66fe44ede5bccb8b57c58513d0d20d64ef2b6ef78ef472bb021e1d310"} Apr 16 23:51:29.063335 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:29.063300 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sn7xw" event={"ID":"c0ff83a8-1253-44cb-b3ea-b43cca82f094","Type":"ContainerStarted","Data":"68d026c49356007fbdb35b9524a3f1e69dc12581ed708522157abf828f582fb4"} Apr 16 23:51:29.063710 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:29.063453 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-sn7xw" Apr 16 
23:51:29.078189 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:29.078059 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-sn7xw" podStartSLOduration=67.562931275 podStartE2EDuration="1m10.078044963s" podCreationTimestamp="2026-04-16 23:50:19 +0000 UTC" firstStartedPulling="2026-04-16 23:51:25.764139927 +0000 UTC m=+66.539164792" lastFinishedPulling="2026-04-16 23:51:28.279253612 +0000 UTC m=+69.054278480" observedRunningTime="2026-04-16 23:51:29.077933274 +0000 UTC m=+69.852958161" watchObservedRunningTime="2026-04-16 23:51:29.078044963 +0000 UTC m=+69.853069850" Apr 16 23:51:56.506419 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:56.506386 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert\") pod \"ingress-canary-w4vbz\" (UID: \"48793279-1866-40db-8e3c-e2c46e4d6f6d\") " pod="openshift-ingress-canary/ingress-canary-w4vbz" Apr 16 23:51:56.506819 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:51:56.506427 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm" Apr 16 23:51:56.506819 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:56.506510 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:51:56.506819 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:56.506513 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:51:56.506819 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:56.506583 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls podName:2e003096-f002-43cb-9237-3811ca14f285 nodeName:}" failed. No retries permitted until 2026-04-16 23:53:00.506570158 +0000 UTC m=+161.281595022 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls") pod "dns-default-xcndm" (UID: "2e003096-f002-43cb-9237-3811ca14f285") : secret "dns-default-metrics-tls" not found Apr 16 23:51:56.506819 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:51:56.506598 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert podName:48793279-1866-40db-8e3c-e2c46e4d6f6d nodeName:}" failed. No retries permitted until 2026-04-16 23:53:00.506591983 +0000 UTC m=+161.281616846 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert") pod "ingress-canary-w4vbz" (UID: "48793279-1866-40db-8e3c-e2c46e4d6f6d") : secret "canary-serving-cert" not found Apr 16 23:52:00.067486 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:00.067454 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-sn7xw" Apr 16 23:52:16.131559 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.131515 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w8z9z"] Apr 16 23:52:16.133092 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.133076 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w8z9z" Apr 16 23:52:16.135436 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.135414 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 23:52:16.136524 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.136499 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 23:52:16.136729 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.136551 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-2br2d\"" Apr 16 23:52:16.176268 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.176243 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w8z9z"] Apr 16 23:52:16.231424 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.231401 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zvcf\" (UniqueName: \"kubernetes.io/projected/222e7080-b3ea-4699-8dc0-7f118d6c305f-kube-api-access-8zvcf\") pod \"volume-data-source-validator-7c6cbb6c87-w8z9z\" (UID: \"222e7080-b3ea-4699-8dc0-7f118d6c305f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w8z9z" Apr 16 23:52:16.236522 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.236502 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2"] Apr 16 23:52:16.238102 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.238085 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2" Apr 16 23:52:16.240686 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.240662 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-69ddbcffcb-4q64j"] Apr 16 23:52:16.240780 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.240752 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 23:52:16.244180 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.241401 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 23:52:16.244180 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.241486 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-cltfn\"" Apr 16 23:52:16.244180 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.241943 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 23:52:16.244180 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.242855 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 23:52:16.244180 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.243702 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:16.246835 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.246807 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 23:52:16.246835 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.246808 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 23:52:16.246993 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.246894 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 23:52:16.246993 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.246947 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 23:52:16.247141 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.247128 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 23:52:16.247201 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.247189 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 23:52:16.247249 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.247211 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-bpg9j\""
Apr 16 23:52:16.252046 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.252026 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2"]
Apr 16 23:52:16.258032 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.258013 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-69ddbcffcb-4q64j"]
Apr 16 23:52:16.332566 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.332526 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/675389d4-0616-4e3c-8d9d-a1d6f5247035-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-v78q2\" (UID: \"675389d4-0616-4e3c-8d9d-a1d6f5247035\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2"
Apr 16 23:52:16.332686 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.332583 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-v78q2\" (UID: \"675389d4-0616-4e3c-8d9d-a1d6f5247035\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2"
Apr 16 23:52:16.332686 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.332602 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-default-certificate\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:16.332686 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.332658 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7q6t\" (UniqueName: \"kubernetes.io/projected/675389d4-0616-4e3c-8d9d-a1d6f5247035-kube-api-access-l7q6t\") pod \"cluster-monitoring-operator-75587bd455-v78q2\" (UID: \"675389d4-0616-4e3c-8d9d-a1d6f5247035\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2"
Apr 16 23:52:16.332789 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.332689 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:16.332789 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.332718 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zvcf\" (UniqueName: \"kubernetes.io/projected/222e7080-b3ea-4699-8dc0-7f118d6c305f-kube-api-access-8zvcf\") pod \"volume-data-source-validator-7c6cbb6c87-w8z9z\" (UID: \"222e7080-b3ea-4699-8dc0-7f118d6c305f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w8z9z"
Apr 16 23:52:16.332789 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.332753 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-stats-auth\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:16.332789 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.332777 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95b56\" (UniqueName: \"kubernetes.io/projected/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-kube-api-access-95b56\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:16.332902 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.332799 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:16.334739 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.334716 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh"]
Apr 16 23:52:16.336246 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.336231 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh"
Apr 16 23:52:16.338375 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.338355 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 23:52:16.338375 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.338372 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-qh88c\""
Apr 16 23:52:16.338556 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.338438 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 23:52:16.338687 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.338669 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 23:52:16.338687 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.338680 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 23:52:16.346762 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.346743 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh"]
Apr 16 23:52:16.353265 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.353247 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zvcf\" (UniqueName: \"kubernetes.io/projected/222e7080-b3ea-4699-8dc0-7f118d6c305f-kube-api-access-8zvcf\") pod \"volume-data-source-validator-7c6cbb6c87-w8z9z\" (UID: \"222e7080-b3ea-4699-8dc0-7f118d6c305f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w8z9z"
Apr 16 23:52:16.433768 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.433714 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:16.433768 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.433743 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bskdv\" (UniqueName: \"kubernetes.io/projected/63de64d5-ece3-4665-8f9c-1e5bc54f3018-kube-api-access-bskdv\") pod \"service-ca-operator-d6fc45fc5-zclxh\" (UID: \"63de64d5-ece3-4665-8f9c-1e5bc54f3018\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh"
Apr 16 23:52:16.433922 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.433772 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-stats-auth\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:16.433922 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.433789 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95b56\" (UniqueName: \"kubernetes.io/projected/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-kube-api-access-95b56\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:16.433922 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.433837 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:16.433922 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.433872 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63de64d5-ece3-4665-8f9c-1e5bc54f3018-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zclxh\" (UID: \"63de64d5-ece3-4665-8f9c-1e5bc54f3018\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh"
Apr 16 23:52:16.433922 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:16.433871 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 23:52:16.433922 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.433899 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/675389d4-0616-4e3c-8d9d-a1d6f5247035-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-v78q2\" (UID: \"675389d4-0616-4e3c-8d9d-a1d6f5247035\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2"
Apr 16 23:52:16.434189 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:16.434013 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle podName:df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c nodeName:}" failed. No retries permitted until 2026-04-16 23:52:16.93399178 +0000 UTC m=+117.709016666 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle") pod "router-default-69ddbcffcb-4q64j" (UID: "df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c") : configmap references non-existent config key: service-ca.crt
Apr 16 23:52:16.434189 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:16.434044 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs podName:df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c nodeName:}" failed. No retries permitted until 2026-04-16 23:52:16.934028658 +0000 UTC m=+117.709053523 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs") pod "router-default-69ddbcffcb-4q64j" (UID: "df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c") : secret "router-metrics-certs-default" not found
Apr 16 23:52:16.434189 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.434074 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-v78q2\" (UID: \"675389d4-0616-4e3c-8d9d-a1d6f5247035\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2"
Apr 16 23:52:16.434189 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.434104 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-default-certificate\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:16.434189 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.434143 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7q6t\" (UniqueName: \"kubernetes.io/projected/675389d4-0616-4e3c-8d9d-a1d6f5247035-kube-api-access-l7q6t\") pod \"cluster-monitoring-operator-75587bd455-v78q2\" (UID: \"675389d4-0616-4e3c-8d9d-a1d6f5247035\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2"
Apr 16 23:52:16.434189 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:16.434183 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 23:52:16.434484 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.434183 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63de64d5-ece3-4665-8f9c-1e5bc54f3018-config\") pod \"service-ca-operator-d6fc45fc5-zclxh\" (UID: \"63de64d5-ece3-4665-8f9c-1e5bc54f3018\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh"
Apr 16 23:52:16.434484 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:16.434232 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls podName:675389d4-0616-4e3c-8d9d-a1d6f5247035 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:16.934216963 +0000 UTC m=+117.709241835 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-v78q2" (UID: "675389d4-0616-4e3c-8d9d-a1d6f5247035") : secret "cluster-monitoring-operator-tls" not found
Apr 16 23:52:16.434610 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.434593 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/675389d4-0616-4e3c-8d9d-a1d6f5247035-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-v78q2\" (UID: \"675389d4-0616-4e3c-8d9d-a1d6f5247035\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2"
Apr 16 23:52:16.436073 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.436055 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-stats-auth\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:16.436521 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.436501 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-default-certificate\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:16.441512 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.441495 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w8z9z"
Apr 16 23:52:16.443984 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.443957 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95b56\" (UniqueName: \"kubernetes.io/projected/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-kube-api-access-95b56\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:16.444195 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.444177 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7q6t\" (UniqueName: \"kubernetes.io/projected/675389d4-0616-4e3c-8d9d-a1d6f5247035-kube-api-access-l7q6t\") pod \"cluster-monitoring-operator-75587bd455-v78q2\" (UID: \"675389d4-0616-4e3c-8d9d-a1d6f5247035\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2"
Apr 16 23:52:16.535004 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.534968 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bskdv\" (UniqueName: \"kubernetes.io/projected/63de64d5-ece3-4665-8f9c-1e5bc54f3018-kube-api-access-bskdv\") pod \"service-ca-operator-d6fc45fc5-zclxh\" (UID: \"63de64d5-ece3-4665-8f9c-1e5bc54f3018\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh"
Apr 16 23:52:16.535116 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.535055 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63de64d5-ece3-4665-8f9c-1e5bc54f3018-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zclxh\" (UID: \"63de64d5-ece3-4665-8f9c-1e5bc54f3018\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh"
Apr 16 23:52:16.535246 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.535215 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63de64d5-ece3-4665-8f9c-1e5bc54f3018-config\") pod \"service-ca-operator-d6fc45fc5-zclxh\" (UID: \"63de64d5-ece3-4665-8f9c-1e5bc54f3018\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh"
Apr 16 23:52:16.535698 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.535680 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63de64d5-ece3-4665-8f9c-1e5bc54f3018-config\") pod \"service-ca-operator-d6fc45fc5-zclxh\" (UID: \"63de64d5-ece3-4665-8f9c-1e5bc54f3018\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh"
Apr 16 23:52:16.537129 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.537110 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63de64d5-ece3-4665-8f9c-1e5bc54f3018-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zclxh\" (UID: \"63de64d5-ece3-4665-8f9c-1e5bc54f3018\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh"
Apr 16 23:52:16.542484 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.542462 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bskdv\" (UniqueName: \"kubernetes.io/projected/63de64d5-ece3-4665-8f9c-1e5bc54f3018-kube-api-access-bskdv\") pod \"service-ca-operator-d6fc45fc5-zclxh\" (UID: \"63de64d5-ece3-4665-8f9c-1e5bc54f3018\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh"
Apr 16 23:52:16.546982 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.546962 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w8z9z"]
Apr 16 23:52:16.549959 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:52:16.549936 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod222e7080_b3ea_4699_8dc0_7f118d6c305f.slice/crio-49689304f90f71f0ecd47df3a6d9a8454cc5650c996c969d4600b45abe3e0505 WatchSource:0}: Error finding container 49689304f90f71f0ecd47df3a6d9a8454cc5650c996c969d4600b45abe3e0505: Status 404 returned error can't find the container with id 49689304f90f71f0ecd47df3a6d9a8454cc5650c996c969d4600b45abe3e0505
Apr 16 23:52:16.644643 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.644619 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh"
Apr 16 23:52:16.751557 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.751518 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh"]
Apr 16 23:52:16.755228 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:52:16.755204 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63de64d5_ece3_4665_8f9c_1e5bc54f3018.slice/crio-acb18b25d4b80fe241c52992652c590c28eb896df9a2cde265e35a2ae7458873 WatchSource:0}: Error finding container acb18b25d4b80fe241c52992652c590c28eb896df9a2cde265e35a2ae7458873: Status 404 returned error can't find the container with id acb18b25d4b80fe241c52992652c590c28eb896df9a2cde265e35a2ae7458873
Apr 16 23:52:16.937553 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.937514 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-v78q2\" (UID: \"675389d4-0616-4e3c-8d9d-a1d6f5247035\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2"
Apr 16 23:52:16.937664 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.937578 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:16.937664 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:16.937606 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:16.937664 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:16.937651 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 23:52:16.937833 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:16.937663 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 23:52:16.937833 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:16.937709 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls podName:675389d4-0616-4e3c-8d9d-a1d6f5247035 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:17.937692776 +0000 UTC m=+118.712717641 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-v78q2" (UID: "675389d4-0616-4e3c-8d9d-a1d6f5247035") : secret "cluster-monitoring-operator-tls" not found
Apr 16 23:52:16.937833 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:16.937725 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs podName:df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c nodeName:}" failed. No retries permitted until 2026-04-16 23:52:17.937718781 +0000 UTC m=+118.712743645 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs") pod "router-default-69ddbcffcb-4q64j" (UID: "df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c") : secret "router-metrics-certs-default" not found
Apr 16 23:52:16.937833 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:16.937736 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle podName:df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c nodeName:}" failed. No retries permitted until 2026-04-16 23:52:17.937730906 +0000 UTC m=+118.712755770 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle") pod "router-default-69ddbcffcb-4q64j" (UID: "df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c") : configmap references non-existent config key: service-ca.crt
Apr 16 23:52:17.150610 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:17.150579 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w8z9z" event={"ID":"222e7080-b3ea-4699-8dc0-7f118d6c305f","Type":"ContainerStarted","Data":"49689304f90f71f0ecd47df3a6d9a8454cc5650c996c969d4600b45abe3e0505"}
Apr 16 23:52:17.151732 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:17.151695 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh" event={"ID":"63de64d5-ece3-4665-8f9c-1e5bc54f3018","Type":"ContainerStarted","Data":"acb18b25d4b80fe241c52992652c590c28eb896df9a2cde265e35a2ae7458873"}
Apr 16 23:52:17.890844 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:17.890808 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-hz2rt"]
Apr 16 23:52:17.892778 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:17.892755 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hz2rt"
Apr 16 23:52:17.894964 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:17.894944 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-lk6g9\""
Apr 16 23:52:17.899959 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:17.899935 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-hz2rt"]
Apr 16 23:52:17.944451 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:17.944119 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-v78q2\" (UID: \"675389d4-0616-4e3c-8d9d-a1d6f5247035\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2"
Apr 16 23:52:17.944451 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:17.944195 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:17.944451 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:17.944240 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 23:52:17.944451 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:17.944304 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls podName:675389d4-0616-4e3c-8d9d-a1d6f5247035 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:19.944284071 +0000 UTC m=+120.719308938 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-v78q2" (UID: "675389d4-0616-4e3c-8d9d-a1d6f5247035") : secret "cluster-monitoring-operator-tls" not found
Apr 16 23:52:17.944451 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:17.944245 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:17.944451 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:17.944342 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle podName:df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c nodeName:}" failed. No retries permitted until 2026-04-16 23:52:19.944326188 +0000 UTC m=+120.719351068 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle") pod "router-default-69ddbcffcb-4q64j" (UID: "df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c") : configmap references non-existent config key: service-ca.crt
Apr 16 23:52:17.944451 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:17.944410 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 23:52:17.944451 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:17.944449 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs podName:df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c nodeName:}" failed. No retries permitted until 2026-04-16 23:52:19.944436153 +0000 UTC m=+120.719461038 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs") pod "router-default-69ddbcffcb-4q64j" (UID: "df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c") : secret "router-metrics-certs-default" not found
Apr 16 23:52:18.044875 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:18.044845 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwtg4\" (UniqueName: \"kubernetes.io/projected/fb08396b-4c9e-4eca-b73b-579e76eaebc7-kube-api-access-pwtg4\") pod \"network-check-source-8894fc9bd-hz2rt\" (UID: \"fb08396b-4c9e-4eca-b73b-579e76eaebc7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hz2rt"
Apr 16 23:52:18.145779 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:18.145702 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwtg4\" (UniqueName: \"kubernetes.io/projected/fb08396b-4c9e-4eca-b73b-579e76eaebc7-kube-api-access-pwtg4\") pod \"network-check-source-8894fc9bd-hz2rt\" (UID: \"fb08396b-4c9e-4eca-b73b-579e76eaebc7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hz2rt"
Apr 16 23:52:18.154301 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:18.154272 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwtg4\" (UniqueName: \"kubernetes.io/projected/fb08396b-4c9e-4eca-b73b-579e76eaebc7-kube-api-access-pwtg4\") pod \"network-check-source-8894fc9bd-hz2rt\" (UID: \"fb08396b-4c9e-4eca-b73b-579e76eaebc7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hz2rt"
Apr 16 23:52:18.154983 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:18.154951 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w8z9z" event={"ID":"222e7080-b3ea-4699-8dc0-7f118d6c305f","Type":"ContainerStarted","Data":"591b44784ad6a26101362b3b3e9da905210e02422184c27aa27b6683d8554d0d"}
Apr 16 23:52:18.169384 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:18.169338 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w8z9z" podStartSLOduration=0.793630373 podStartE2EDuration="2.169323389s" podCreationTimestamp="2026-04-16 23:52:16 +0000 UTC" firstStartedPulling="2026-04-16 23:52:16.551642047 +0000 UTC m=+117.326666911" lastFinishedPulling="2026-04-16 23:52:17.927335046 +0000 UTC m=+118.702359927" observedRunningTime="2026-04-16 23:52:18.16796738 +0000 UTC m=+118.942992266" watchObservedRunningTime="2026-04-16 23:52:18.169323389 +0000 UTC m=+118.944348277"
Apr 16 23:52:18.204238 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:18.204213 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hz2rt"
Apr 16 23:52:18.671272 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:18.671179 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-hz2rt"]
Apr 16 23:52:18.676800 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:52:18.676773 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb08396b_4c9e_4eca_b73b_579e76eaebc7.slice/crio-f4a5f16081849aa5ab45ff20d3790165686171efde6ad8188ad9bbc16d1cdb81 WatchSource:0}: Error finding container f4a5f16081849aa5ab45ff20d3790165686171efde6ad8188ad9bbc16d1cdb81: Status 404 returned error can't find the container with id f4a5f16081849aa5ab45ff20d3790165686171efde6ad8188ad9bbc16d1cdb81
Apr 16 23:52:19.158874 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:19.158834 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh" event={"ID":"63de64d5-ece3-4665-8f9c-1e5bc54f3018","Type":"ContainerStarted","Data":"e50aee50d1b11fe98d0b02060f40902ac495e8c8f25cd2ca71e24c4b2eef6801"}
Apr 16 23:52:19.160263 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:19.160235 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hz2rt" event={"ID":"fb08396b-4c9e-4eca-b73b-579e76eaebc7","Type":"ContainerStarted","Data":"fabbf8899584a14d641874ad5746ef9e62130b35ca900f084b3c087bd7c917ef"}
Apr 16 23:52:19.160376 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:19.160269 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hz2rt" event={"ID":"fb08396b-4c9e-4eca-b73b-579e76eaebc7","Type":"ContainerStarted","Data":"f4a5f16081849aa5ab45ff20d3790165686171efde6ad8188ad9bbc16d1cdb81"}
Apr 16 23:52:19.172623 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:19.172576 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh" podStartSLOduration=1.3418858710000001 podStartE2EDuration="3.17256166s" podCreationTimestamp="2026-04-16 23:52:16 +0000 UTC" firstStartedPulling="2026-04-16 23:52:16.757026435 +0000 UTC m=+117.532051298" lastFinishedPulling="2026-04-16 23:52:18.587702219 +0000 UTC m=+119.362727087" observedRunningTime="2026-04-16 23:52:19.172036119 +0000 UTC m=+119.947061006" watchObservedRunningTime="2026-04-16 23:52:19.17256166 +0000 UTC m=+119.947586539"
Apr 16 23:52:19.184380 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:19.184334 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hz2rt" podStartSLOduration=2.1843193579999998 podStartE2EDuration="2.184319358s" podCreationTimestamp="2026-04-16 23:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:52:19.184260122 +0000 UTC m=+119.959285009" watchObservedRunningTime="2026-04-16 23:52:19.184319358 +0000 UTC m=+119.959344245"
Apr 16 23:52:19.962059 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:19.962020 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-v78q2\" (UID: \"675389d4-0616-4e3c-8d9d-a1d6f5247035\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2"
Apr 16 23:52:19.962273 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:19.962110 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:19.962273 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:19.962153 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j"
Apr 16 23:52:19.962273 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:19.962171 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 23:52:19.962273 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:19.962237 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls podName:675389d4-0616-4e3c-8d9d-a1d6f5247035 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:23.962222008 +0000 UTC m=+124.737246873 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-v78q2" (UID: "675389d4-0616-4e3c-8d9d-a1d6f5247035") : secret "cluster-monitoring-operator-tls" not found Apr 16 23:52:19.962273 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:19.962260 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 23:52:19.962464 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:19.962283 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle podName:df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c nodeName:}" failed. No retries permitted until 2026-04-16 23:52:23.962268672 +0000 UTC m=+124.737293548 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle") pod "router-default-69ddbcffcb-4q64j" (UID: "df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c") : configmap references non-existent config key: service-ca.crt Apr 16 23:52:19.962464 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:19.962317 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs podName:df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c nodeName:}" failed. No retries permitted until 2026-04-16 23:52:23.962305825 +0000 UTC m=+124.737330690 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs") pod "router-default-69ddbcffcb-4q64j" (UID: "df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c") : secret "router-metrics-certs-default" not found Apr 16 23:52:22.648417 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:22.648381 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zpl4v"] Apr 16 23:52:22.650551 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:22.650520 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-zpl4v" Apr 16 23:52:22.652773 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:22.652745 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 23:52:22.652892 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:22.652744 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 23:52:22.652892 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:22.652804 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 23:52:22.653958 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:22.653941 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 23:52:22.653958 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:22.653963 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-qg9q7\"" Apr 16 23:52:22.657890 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:22.657870 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zpl4v"] Apr 16 23:52:22.752642 ip-10-0-134-103 kubenswrapper[2578]: I0416 
23:52:22.752613 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-xjgf6_b3dc2fa1-d372-4847-9980-2930ef815461/dns-node-resolver/0.log" Apr 16 23:52:22.782953 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:22.782922 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qm6n\" (UniqueName: \"kubernetes.io/projected/1cf47c2c-b266-4d56-8276-adcf3036101e-kube-api-access-9qm6n\") pod \"service-ca-865cb79987-zpl4v\" (UID: \"1cf47c2c-b266-4d56-8276-adcf3036101e\") " pod="openshift-service-ca/service-ca-865cb79987-zpl4v" Apr 16 23:52:22.783072 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:22.782967 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1cf47c2c-b266-4d56-8276-adcf3036101e-signing-key\") pod \"service-ca-865cb79987-zpl4v\" (UID: \"1cf47c2c-b266-4d56-8276-adcf3036101e\") " pod="openshift-service-ca/service-ca-865cb79987-zpl4v" Apr 16 23:52:22.783072 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:22.783019 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1cf47c2c-b266-4d56-8276-adcf3036101e-signing-cabundle\") pod \"service-ca-865cb79987-zpl4v\" (UID: \"1cf47c2c-b266-4d56-8276-adcf3036101e\") " pod="openshift-service-ca/service-ca-865cb79987-zpl4v" Apr 16 23:52:22.883730 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:22.883693 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qm6n\" (UniqueName: \"kubernetes.io/projected/1cf47c2c-b266-4d56-8276-adcf3036101e-kube-api-access-9qm6n\") pod \"service-ca-865cb79987-zpl4v\" (UID: \"1cf47c2c-b266-4d56-8276-adcf3036101e\") " pod="openshift-service-ca/service-ca-865cb79987-zpl4v" Apr 16 23:52:22.883888 ip-10-0-134-103 kubenswrapper[2578]: I0416 
23:52:22.883753 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1cf47c2c-b266-4d56-8276-adcf3036101e-signing-key\") pod \"service-ca-865cb79987-zpl4v\" (UID: \"1cf47c2c-b266-4d56-8276-adcf3036101e\") " pod="openshift-service-ca/service-ca-865cb79987-zpl4v" Apr 16 23:52:22.883888 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:22.883817 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1cf47c2c-b266-4d56-8276-adcf3036101e-signing-cabundle\") pod \"service-ca-865cb79987-zpl4v\" (UID: \"1cf47c2c-b266-4d56-8276-adcf3036101e\") " pod="openshift-service-ca/service-ca-865cb79987-zpl4v" Apr 16 23:52:22.884466 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:22.884443 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1cf47c2c-b266-4d56-8276-adcf3036101e-signing-cabundle\") pod \"service-ca-865cb79987-zpl4v\" (UID: \"1cf47c2c-b266-4d56-8276-adcf3036101e\") " pod="openshift-service-ca/service-ca-865cb79987-zpl4v" Apr 16 23:52:22.886105 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:22.886083 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1cf47c2c-b266-4d56-8276-adcf3036101e-signing-key\") pod \"service-ca-865cb79987-zpl4v\" (UID: \"1cf47c2c-b266-4d56-8276-adcf3036101e\") " pod="openshift-service-ca/service-ca-865cb79987-zpl4v" Apr 16 23:52:22.891158 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:22.891137 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qm6n\" (UniqueName: \"kubernetes.io/projected/1cf47c2c-b266-4d56-8276-adcf3036101e-kube-api-access-9qm6n\") pod \"service-ca-865cb79987-zpl4v\" (UID: \"1cf47c2c-b266-4d56-8276-adcf3036101e\") " pod="openshift-service-ca/service-ca-865cb79987-zpl4v" 
Apr 16 23:52:22.959478 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:22.959419 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-zpl4v" Apr 16 23:52:23.066115 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:23.066087 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zpl4v"] Apr 16 23:52:23.068920 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:52:23.068893 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cf47c2c_b266_4d56_8276_adcf3036101e.slice/crio-3a5b5798adb579951f795bf50394cfe9b117a648f3defa61191007223cefde3a WatchSource:0}: Error finding container 3a5b5798adb579951f795bf50394cfe9b117a648f3defa61191007223cefde3a: Status 404 returned error can't find the container with id 3a5b5798adb579951f795bf50394cfe9b117a648f3defa61191007223cefde3a Apr 16 23:52:23.171740 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:23.171712 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-zpl4v" event={"ID":"1cf47c2c-b266-4d56-8276-adcf3036101e","Type":"ContainerStarted","Data":"5d50c6ca8121a56ebe950031b0d274efad2b7d68fefcdf60fcdc75586792f741"} Apr 16 23:52:23.171853 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:23.171749 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-zpl4v" event={"ID":"1cf47c2c-b266-4d56-8276-adcf3036101e","Type":"ContainerStarted","Data":"3a5b5798adb579951f795bf50394cfe9b117a648f3defa61191007223cefde3a"} Apr 16 23:52:23.189123 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:23.189078 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-zpl4v" podStartSLOduration=1.189061055 podStartE2EDuration="1.189061055s" podCreationTimestamp="2026-04-16 23:52:22 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:52:23.187412054 +0000 UTC m=+123.962436939" watchObservedRunningTime="2026-04-16 23:52:23.189061055 +0000 UTC m=+123.964085942" Apr 16 23:52:23.551916 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:23.551891 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gxjsh_4ad19b4d-6e37-4bb2-adb6-743cb3d95223/node-ca/0.log" Apr 16 23:52:23.992770 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:23.992686 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-v78q2\" (UID: \"675389d4-0616-4e3c-8d9d-a1d6f5247035\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2" Apr 16 23:52:23.992770 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:23.992755 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j" Apr 16 23:52:23.993194 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:23.992798 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j" Apr 16 23:52:23.993194 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:23.992824 2578 secret.go:189] Couldn't get secret 
openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 23:52:23.993194 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:23.992882 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls podName:675389d4-0616-4e3c-8d9d-a1d6f5247035 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:31.992867249 +0000 UTC m=+132.767892113 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-v78q2" (UID: "675389d4-0616-4e3c-8d9d-a1d6f5247035") : secret "cluster-monitoring-operator-tls" not found Apr 16 23:52:23.993194 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:23.992895 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 23:52:23.993194 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:23.992951 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs podName:df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c nodeName:}" failed. No retries permitted until 2026-04-16 23:52:31.992939236 +0000 UTC m=+132.767964101 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs") pod "router-default-69ddbcffcb-4q64j" (UID: "df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c") : secret "router-metrics-certs-default" not found Apr 16 23:52:23.993194 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:23.992968 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle podName:df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c nodeName:}" failed. No retries permitted until 2026-04-16 23:52:31.992961198 +0000 UTC m=+132.767986063 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle") pod "router-default-69ddbcffcb-4q64j" (UID: "df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c") : configmap references non-existent config key: service-ca.crt Apr 16 23:52:29.536723 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:29.536682 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs\") pod \"network-metrics-daemon-4sczf\" (UID: \"bac2109e-d2f6-42aa-94c6-73a79a2012f0\") " pod="openshift-multus/network-metrics-daemon-4sczf" Apr 16 23:52:29.537118 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:29.536827 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 23:52:29.537118 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:29.536900 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs podName:bac2109e-d2f6-42aa-94c6-73a79a2012f0 nodeName:}" failed. No retries permitted until 2026-04-16 23:54:31.536883696 +0000 UTC m=+252.311908561 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs") pod "network-metrics-daemon-4sczf" (UID: "bac2109e-d2f6-42aa-94c6-73a79a2012f0") : secret "metrics-daemon-secret" not found Apr 16 23:52:32.055273 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:32.055223 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j" Apr 16 23:52:32.055776 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:32.055300 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j" Apr 16 23:52:32.055776 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:32.055370 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-v78q2\" (UID: \"675389d4-0616-4e3c-8d9d-a1d6f5247035\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2" Apr 16 23:52:32.056034 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:32.056007 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-service-ca-bundle\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " 
pod="openshift-ingress/router-default-69ddbcffcb-4q64j" Apr 16 23:52:32.057473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:32.057451 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c-metrics-certs\") pod \"router-default-69ddbcffcb-4q64j\" (UID: \"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c\") " pod="openshift-ingress/router-default-69ddbcffcb-4q64j" Apr 16 23:52:32.057700 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:32.057681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/675389d4-0616-4e3c-8d9d-a1d6f5247035-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-v78q2\" (UID: \"675389d4-0616-4e3c-8d9d-a1d6f5247035\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2" Apr 16 23:52:32.150640 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:32.150618 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-cltfn\"" Apr 16 23:52:32.157602 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:32.157584 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-bpg9j\"" Apr 16 23:52:32.158594 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:32.158579 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2" Apr 16 23:52:32.166650 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:32.166627 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-69ddbcffcb-4q64j" Apr 16 23:52:32.287306 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:32.287210 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2"] Apr 16 23:52:32.291824 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:52:32.291797 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod675389d4_0616_4e3c_8d9d_a1d6f5247035.slice/crio-9ca66a2e4176ed8688ea7ba71662cf1c43a05a81d3ca2545a3e508b85064d078 WatchSource:0}: Error finding container 9ca66a2e4176ed8688ea7ba71662cf1c43a05a81d3ca2545a3e508b85064d078: Status 404 returned error can't find the container with id 9ca66a2e4176ed8688ea7ba71662cf1c43a05a81d3ca2545a3e508b85064d078 Apr 16 23:52:32.302897 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:32.302832 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-69ddbcffcb-4q64j"] Apr 16 23:52:32.304987 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:52:32.304965 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf3cf5af_f3bd_4f12_a5b0_d3f03341ef5c.slice/crio-4d5fc913f7a988815b785682871777d54ec5896527933189bd8a9335070ff1b7 WatchSource:0}: Error finding container 4d5fc913f7a988815b785682871777d54ec5896527933189bd8a9335070ff1b7: Status 404 returned error can't find the container with id 4d5fc913f7a988815b785682871777d54ec5896527933189bd8a9335070ff1b7 Apr 16 23:52:33.195490 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:33.195405 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2" event={"ID":"675389d4-0616-4e3c-8d9d-a1d6f5247035","Type":"ContainerStarted","Data":"9ca66a2e4176ed8688ea7ba71662cf1c43a05a81d3ca2545a3e508b85064d078"} Apr 16 23:52:33.196876 
ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:33.196850 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-69ddbcffcb-4q64j" event={"ID":"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c","Type":"ContainerStarted","Data":"373d1319e5e61994c9cfcba9bc8914e5d420d35386afbdc7a550593be63020c2"} Apr 16 23:52:33.197015 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:33.196882 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-69ddbcffcb-4q64j" event={"ID":"df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c","Type":"ContainerStarted","Data":"4d5fc913f7a988815b785682871777d54ec5896527933189bd8a9335070ff1b7"} Apr 16 23:52:33.214780 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:33.214739 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-69ddbcffcb-4q64j" podStartSLOduration=17.214726464 podStartE2EDuration="17.214726464s" podCreationTimestamp="2026-04-16 23:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:52:33.213222474 +0000 UTC m=+133.988247369" watchObservedRunningTime="2026-04-16 23:52:33.214726464 +0000 UTC m=+133.989751353" Apr 16 23:52:34.167619 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:34.167576 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-69ddbcffcb-4q64j" Apr 16 23:52:34.169367 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:34.169345 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-69ddbcffcb-4q64j" Apr 16 23:52:34.200532 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:34.200500 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2" 
event={"ID":"675389d4-0616-4e3c-8d9d-a1d6f5247035","Type":"ContainerStarted","Data":"292f1a03ce5de11e22fd8fcd7b412be71219faf3eda30e0d2463af5e122a830e"} Apr 16 23:52:34.200960 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:34.200715 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-69ddbcffcb-4q64j" Apr 16 23:52:34.201888 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:34.201864 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-69ddbcffcb-4q64j" Apr 16 23:52:34.215145 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:34.215096 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v78q2" podStartSLOduration=16.459656814 podStartE2EDuration="18.215081625s" podCreationTimestamp="2026-04-16 23:52:16 +0000 UTC" firstStartedPulling="2026-04-16 23:52:32.293858234 +0000 UTC m=+133.068883099" lastFinishedPulling="2026-04-16 23:52:34.049283033 +0000 UTC m=+134.824307910" observedRunningTime="2026-04-16 23:52:34.213733626 +0000 UTC m=+134.988758512" watchObservedRunningTime="2026-04-16 23:52:34.215081625 +0000 UTC m=+134.990106512" Apr 16 23:52:44.980804 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:44.980772 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-c8z7v"] Apr 16 23:52:44.986623 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:44.986595 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c8z7v" Apr 16 23:52:44.988891 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:44.988865 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 23:52:44.989034 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:44.989017 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 23:52:44.990118 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:44.990093 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 23:52:44.990249 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:44.990135 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 23:52:44.990249 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:44.990166 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-kxrl6\"" Apr 16 23:52:44.993930 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:44.993906 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c8z7v"] Apr 16 23:52:45.028527 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.028496 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wwg7j"] Apr 16 23:52:45.032341 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.032317 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w4ff9"] Apr 16 23:52:45.032504 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.032483 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwg7j"
Apr 16 23:52:45.035073 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.035049 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 16 23:52:45.035190 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.035145 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 16 23:52:45.035190 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.035163 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-hbvg9\""
Apr 16 23:52:45.036282 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.036264 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w4ff9"
Apr 16 23:52:45.039162 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.039139 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-xkgp7\""
Apr 16 23:52:45.039256 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.039139 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 23:52:45.042111 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.042092 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wwg7j"]
Apr 16 23:52:45.043375 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.043355 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w4ff9"]
Apr 16 23:52:45.066320 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.066297 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7748d6467c-lnj85"]
Apr 16 23:52:45.069977 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.069958 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.072305 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.072284 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 23:52:45.072374 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.072331 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nf568\""
Apr 16 23:52:45.072516 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.072494 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 23:52:45.072516 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.072513 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 23:52:45.077008 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.076970 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 23:52:45.087561 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.087522 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7748d6467c-lnj85"]
Apr 16 23:52:45.147478 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.147449 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/86a1f3db-caf2-421b-8605-dd1617f1cc05-crio-socket\") pod \"insights-runtime-extractor-c8z7v\" (UID: \"86a1f3db-caf2-421b-8605-dd1617f1cc05\") " pod="openshift-insights/insights-runtime-extractor-c8z7v"
Apr 16 23:52:45.147629 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.147494 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h77v\" (UniqueName: \"kubernetes.io/projected/86a1f3db-caf2-421b-8605-dd1617f1cc05-kube-api-access-8h77v\") pod \"insights-runtime-extractor-c8z7v\" (UID: \"86a1f3db-caf2-421b-8605-dd1617f1cc05\") " pod="openshift-insights/insights-runtime-extractor-c8z7v"
Apr 16 23:52:45.147629 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.147586 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/67418dd9-9c9a-4599-849e-9013809fd4d0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wwg7j\" (UID: \"67418dd9-9c9a-4599-849e-9013809fd4d0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwg7j"
Apr 16 23:52:45.147629 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.147619 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/67418dd9-9c9a-4599-849e-9013809fd4d0-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wwg7j\" (UID: \"67418dd9-9c9a-4599-849e-9013809fd4d0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwg7j"
Apr 16 23:52:45.147739 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.147658 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/86a1f3db-caf2-421b-8605-dd1617f1cc05-data-volume\") pod \"insights-runtime-extractor-c8z7v\" (UID: \"86a1f3db-caf2-421b-8605-dd1617f1cc05\") " pod="openshift-insights/insights-runtime-extractor-c8z7v"
Apr 16 23:52:45.147739 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.147682 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fe885835-2420-443f-9c28-6fae79714fb1-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-w4ff9\" (UID: \"fe885835-2420-443f-9c28-6fae79714fb1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w4ff9"
Apr 16 23:52:45.147739 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.147712 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/86a1f3db-caf2-421b-8605-dd1617f1cc05-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c8z7v\" (UID: \"86a1f3db-caf2-421b-8605-dd1617f1cc05\") " pod="openshift-insights/insights-runtime-extractor-c8z7v"
Apr 16 23:52:45.147835 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.147765 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/86a1f3db-caf2-421b-8605-dd1617f1cc05-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c8z7v\" (UID: \"86a1f3db-caf2-421b-8605-dd1617f1cc05\") " pod="openshift-insights/insights-runtime-extractor-c8z7v"
Apr 16 23:52:45.249084 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249054 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fe885835-2420-443f-9c28-6fae79714fb1-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-w4ff9\" (UID: \"fe885835-2420-443f-9c28-6fae79714fb1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w4ff9"
Apr 16 23:52:45.249262 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249097 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/86a1f3db-caf2-421b-8605-dd1617f1cc05-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c8z7v\" (UID: \"86a1f3db-caf2-421b-8605-dd1617f1cc05\") " pod="openshift-insights/insights-runtime-extractor-c8z7v"
Apr 16 23:52:45.249262 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249119 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/86a1f3db-caf2-421b-8605-dd1617f1cc05-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c8z7v\" (UID: \"86a1f3db-caf2-421b-8605-dd1617f1cc05\") " pod="openshift-insights/insights-runtime-extractor-c8z7v"
Apr 16 23:52:45.249262 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249140 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24b53389-f90a-49e4-bddc-da64abb7be4d-trusted-ca\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.249262 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249167 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/86a1f3db-caf2-421b-8605-dd1617f1cc05-crio-socket\") pod \"insights-runtime-extractor-c8z7v\" (UID: \"86a1f3db-caf2-421b-8605-dd1617f1cc05\") " pod="openshift-insights/insights-runtime-extractor-c8z7v"
Apr 16 23:52:45.249262 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249192 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8h77v\" (UniqueName: \"kubernetes.io/projected/86a1f3db-caf2-421b-8605-dd1617f1cc05-kube-api-access-8h77v\") pod \"insights-runtime-extractor-c8z7v\" (UID: \"86a1f3db-caf2-421b-8605-dd1617f1cc05\") " pod="openshift-insights/insights-runtime-extractor-c8z7v"
Apr 16 23:52:45.249262 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249211 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24b53389-f90a-49e4-bddc-da64abb7be4d-bound-sa-token\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.249565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249279 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/24b53389-f90a-49e4-bddc-da64abb7be4d-installation-pull-secrets\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.249565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249290 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/86a1f3db-caf2-421b-8605-dd1617f1cc05-crio-socket\") pod \"insights-runtime-extractor-c8z7v\" (UID: \"86a1f3db-caf2-421b-8605-dd1617f1cc05\") " pod="openshift-insights/insights-runtime-extractor-c8z7v"
Apr 16 23:52:45.249565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249313 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/67418dd9-9c9a-4599-849e-9013809fd4d0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wwg7j\" (UID: \"67418dd9-9c9a-4599-849e-9013809fd4d0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwg7j"
Apr 16 23:52:45.249565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249338 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/24b53389-f90a-49e4-bddc-da64abb7be4d-image-registry-private-configuration\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.249565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249358 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/67418dd9-9c9a-4599-849e-9013809fd4d0-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wwg7j\" (UID: \"67418dd9-9c9a-4599-849e-9013809fd4d0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwg7j"
Apr 16 23:52:45.249565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249377 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/24b53389-f90a-49e4-bddc-da64abb7be4d-ca-trust-extracted\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.249565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249452 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/24b53389-f90a-49e4-bddc-da64abb7be4d-registry-certificates\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.249565 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249522 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/86a1f3db-caf2-421b-8605-dd1617f1cc05-data-volume\") pod \"insights-runtime-extractor-c8z7v\" (UID: \"86a1f3db-caf2-421b-8605-dd1617f1cc05\") " pod="openshift-insights/insights-runtime-extractor-c8z7v"
Apr 16 23:52:45.249930 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249599 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/24b53389-f90a-49e4-bddc-da64abb7be4d-registry-tls\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.249930 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249641 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n46mp\" (UniqueName: \"kubernetes.io/projected/24b53389-f90a-49e4-bddc-da64abb7be4d-kube-api-access-n46mp\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.249930 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249733 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/86a1f3db-caf2-421b-8605-dd1617f1cc05-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c8z7v\" (UID: \"86a1f3db-caf2-421b-8605-dd1617f1cc05\") " pod="openshift-insights/insights-runtime-extractor-c8z7v"
Apr 16 23:52:45.249930 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.249899 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/86a1f3db-caf2-421b-8605-dd1617f1cc05-data-volume\") pod \"insights-runtime-extractor-c8z7v\" (UID: \"86a1f3db-caf2-421b-8605-dd1617f1cc05\") " pod="openshift-insights/insights-runtime-extractor-c8z7v"
Apr 16 23:52:45.250119 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.250055 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/67418dd9-9c9a-4599-849e-9013809fd4d0-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wwg7j\" (UID: \"67418dd9-9c9a-4599-849e-9013809fd4d0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwg7j"
Apr 16 23:52:45.251673 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.251647 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/86a1f3db-caf2-421b-8605-dd1617f1cc05-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c8z7v\" (UID: \"86a1f3db-caf2-421b-8605-dd1617f1cc05\") " pod="openshift-insights/insights-runtime-extractor-c8z7v"
Apr 16 23:52:45.251750 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.251698 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fe885835-2420-443f-9c28-6fae79714fb1-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-w4ff9\" (UID: \"fe885835-2420-443f-9c28-6fae79714fb1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w4ff9"
Apr 16 23:52:45.251750 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.251730 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/67418dd9-9c9a-4599-849e-9013809fd4d0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wwg7j\" (UID: \"67418dd9-9c9a-4599-849e-9013809fd4d0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwg7j"
Apr 16 23:52:45.256402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.256380 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h77v\" (UniqueName: \"kubernetes.io/projected/86a1f3db-caf2-421b-8605-dd1617f1cc05-kube-api-access-8h77v\") pod \"insights-runtime-extractor-c8z7v\" (UID: \"86a1f3db-caf2-421b-8605-dd1617f1cc05\") " pod="openshift-insights/insights-runtime-extractor-c8z7v"
Apr 16 23:52:45.296611 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.296572 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c8z7v"
Apr 16 23:52:45.344513 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.344442 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwg7j"
Apr 16 23:52:45.350446 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.350420 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w4ff9"
Apr 16 23:52:45.350794 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.350766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/24b53389-f90a-49e4-bddc-da64abb7be4d-registry-certificates\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.350904 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.350815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/24b53389-f90a-49e4-bddc-da64abb7be4d-registry-tls\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.350904 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.350848 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n46mp\" (UniqueName: \"kubernetes.io/projected/24b53389-f90a-49e4-bddc-da64abb7be4d-kube-api-access-n46mp\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.351011 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.350921 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24b53389-f90a-49e4-bddc-da64abb7be4d-trusted-ca\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.351011 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.350961 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24b53389-f90a-49e4-bddc-da64abb7be4d-bound-sa-token\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.351306 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.351009 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/24b53389-f90a-49e4-bddc-da64abb7be4d-installation-pull-secrets\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.351306 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.351041 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/24b53389-f90a-49e4-bddc-da64abb7be4d-image-registry-private-configuration\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.351306 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.351072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/24b53389-f90a-49e4-bddc-da64abb7be4d-ca-trust-extracted\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.351647 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.351487 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/24b53389-f90a-49e4-bddc-da64abb7be4d-ca-trust-extracted\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.352747 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.352456 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/24b53389-f90a-49e4-bddc-da64abb7be4d-registry-certificates\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.352879 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.352855 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24b53389-f90a-49e4-bddc-da64abb7be4d-trusted-ca\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.356030 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.355569 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/24b53389-f90a-49e4-bddc-da64abb7be4d-registry-tls\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.356030 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.355983 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/24b53389-f90a-49e4-bddc-da64abb7be4d-installation-pull-secrets\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.357344 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.357307 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/24b53389-f90a-49e4-bddc-da64abb7be4d-image-registry-private-configuration\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.362720 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.362675 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n46mp\" (UniqueName: \"kubernetes.io/projected/24b53389-f90a-49e4-bddc-da64abb7be4d-kube-api-access-n46mp\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.362807 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.362718 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24b53389-f90a-49e4-bddc-da64abb7be4d-bound-sa-token\") pod \"image-registry-7748d6467c-lnj85\" (UID: \"24b53389-f90a-49e4-bddc-da64abb7be4d\") " pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.380913 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.380887 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:45.458126 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.458069 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c8z7v"]
Apr 16 23:52:45.510433 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.510408 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wwg7j"]
Apr 16 23:52:45.519607 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:52:45.519580 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67418dd9_9c9a_4599_849e_9013809fd4d0.slice/crio-c6053c1063dfb9f58120acb89dba5d6e54d3155f351d205cb3306987852c1a7e WatchSource:0}: Error finding container c6053c1063dfb9f58120acb89dba5d6e54d3155f351d205cb3306987852c1a7e: Status 404 returned error can't find the container with id c6053c1063dfb9f58120acb89dba5d6e54d3155f351d205cb3306987852c1a7e
Apr 16 23:52:45.528442 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.528418 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w4ff9"]
Apr 16 23:52:45.531393 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:52:45.531372 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe885835_2420_443f_9c28_6fae79714fb1.slice/crio-6c513d89f5718ea5b7aa6273518c5c312d58d97396c2b0856b226be361d8f148 WatchSource:0}: Error finding container 6c513d89f5718ea5b7aa6273518c5c312d58d97396c2b0856b226be361d8f148: Status 404 returned error can't find the container with id 6c513d89f5718ea5b7aa6273518c5c312d58d97396c2b0856b226be361d8f148
Apr 16 23:52:45.543577 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:45.543553 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7748d6467c-lnj85"]
Apr 16 23:52:45.546797 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:52:45.546763 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24b53389_f90a_49e4_bddc_da64abb7be4d.slice/crio-e256e7247b7bbe772594803460661b93ce61e02a15dbfcbcba3acf6b8c7e42c5 WatchSource:0}: Error finding container e256e7247b7bbe772594803460661b93ce61e02a15dbfcbcba3acf6b8c7e42c5: Status 404 returned error can't find the container with id e256e7247b7bbe772594803460661b93ce61e02a15dbfcbcba3acf6b8c7e42c5
Apr 16 23:52:46.229234 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:46.229175 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7748d6467c-lnj85" event={"ID":"24b53389-f90a-49e4-bddc-da64abb7be4d","Type":"ContainerStarted","Data":"2e9c44ff6734205de1222a038f3275304be7eea1f540e6ccf680f46e7957e835"}
Apr 16 23:52:46.229234 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:46.229214 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7748d6467c-lnj85" event={"ID":"24b53389-f90a-49e4-bddc-da64abb7be4d","Type":"ContainerStarted","Data":"e256e7247b7bbe772594803460661b93ce61e02a15dbfcbcba3acf6b8c7e42c5"}
Apr 16 23:52:46.229677 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:46.229353 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7748d6467c-lnj85"
Apr 16 23:52:46.230765 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:46.230731 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w4ff9" event={"ID":"fe885835-2420-443f-9c28-6fae79714fb1","Type":"ContainerStarted","Data":"6c513d89f5718ea5b7aa6273518c5c312d58d97396c2b0856b226be361d8f148"}
Apr 16 23:52:46.232067 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:46.232032 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwg7j" event={"ID":"67418dd9-9c9a-4599-849e-9013809fd4d0","Type":"ContainerStarted","Data":"c6053c1063dfb9f58120acb89dba5d6e54d3155f351d205cb3306987852c1a7e"}
Apr 16 23:52:46.233792 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:46.233763 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c8z7v" event={"ID":"86a1f3db-caf2-421b-8605-dd1617f1cc05","Type":"ContainerStarted","Data":"802b5dc81ff778ec993a56ee0642844af65e50211f2e562e16aaa31ec61bd0f7"}
Apr 16 23:52:46.233792 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:46.233798 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c8z7v" event={"ID":"86a1f3db-caf2-421b-8605-dd1617f1cc05","Type":"ContainerStarted","Data":"1fcf2e6f17f3226165720fda019d1431776909c784469afd472e2400198f79cf"}
Apr 16 23:52:46.249597 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:46.249530 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7748d6467c-lnj85" podStartSLOduration=1.249519655 podStartE2EDuration="1.249519655s" podCreationTimestamp="2026-04-16 23:52:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:52:46.246933071 +0000 UTC m=+147.021957957" watchObservedRunningTime="2026-04-16 23:52:46.249519655 +0000 UTC m=+147.024544531"
Apr 16 23:52:47.237967 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:47.237866 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w4ff9" event={"ID":"fe885835-2420-443f-9c28-6fae79714fb1","Type":"ContainerStarted","Data":"10e23a1564fed5f33de44b3bd90999e27b9e98d3f9015cf6f671c2070c2d4325"}
Apr 16 23:52:47.238494 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:47.238045 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w4ff9"
Apr 16 23:52:47.239520 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:47.239459 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwg7j" event={"ID":"67418dd9-9c9a-4599-849e-9013809fd4d0","Type":"ContainerStarted","Data":"26604f59af4c5b4df06fadeef7bc8df67c60b9f1315f26c87d4d930978dd2bd6"}
Apr 16 23:52:47.242358 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:47.242331 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c8z7v" event={"ID":"86a1f3db-caf2-421b-8605-dd1617f1cc05","Type":"ContainerStarted","Data":"8c34cfe501d1c4d1e752ce79c3b2c9a52417aba56db19345ca6c0e1f6e7b3b00"}
Apr 16 23:52:47.244408 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:47.244386 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w4ff9"
Apr 16 23:52:47.255462 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:47.254218 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w4ff9" podStartSLOduration=0.897493076 podStartE2EDuration="2.254199095s" podCreationTimestamp="2026-04-16 23:52:45 +0000 UTC" firstStartedPulling="2026-04-16 23:52:45.533274329 +0000 UTC m=+146.308299195" lastFinishedPulling="2026-04-16 23:52:46.889980347 +0000 UTC m=+147.665005214" observedRunningTime="2026-04-16 23:52:47.25188548 +0000 UTC m=+148.026910369" watchObservedRunningTime="2026-04-16 23:52:47.254199095 +0000 UTC m=+148.029223982"
Apr 16 23:52:47.268418 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:47.268372 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwg7j" podStartSLOduration=0.901632659 podStartE2EDuration="2.268356094s" podCreationTimestamp="2026-04-16 23:52:45 +0000 UTC" firstStartedPulling="2026-04-16 23:52:45.521529 +0000 UTC m=+146.296553867" lastFinishedPulling="2026-04-16 23:52:46.888252439 +0000 UTC m=+147.663277302" observedRunningTime="2026-04-16 23:52:47.266902558 +0000 UTC m=+148.041927444" watchObservedRunningTime="2026-04-16 23:52:47.268356094 +0000 UTC m=+148.043381022"
Apr 16 23:52:48.246767 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:48.246678 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c8z7v" event={"ID":"86a1f3db-caf2-421b-8605-dd1617f1cc05","Type":"ContainerStarted","Data":"44f3673d5c845be2a68ee6e76ab36bf88060239b520fc8dd24ea7fa4c69cd7b5"}
Apr 16 23:52:48.265412 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:48.265366 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-c8z7v" podStartSLOduration=1.96583154 podStartE2EDuration="4.26535317s" podCreationTimestamp="2026-04-16 23:52:44 +0000 UTC" firstStartedPulling="2026-04-16 23:52:45.543854858 +0000 UTC m=+146.318879729" lastFinishedPulling="2026-04-16 23:52:47.843376482 +0000 UTC m=+148.618401359" observedRunningTime="2026-04-16 23:52:48.263773851 +0000 UTC m=+149.038798736" watchObservedRunningTime="2026-04-16 23:52:48.26535317 +0000 UTC m=+149.040378055"
Apr 16 23:52:51.952581 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:51.952533 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tjxxv"]
Apr 16 23:52:51.957950 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:51.957930 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tjxxv"
Apr 16 23:52:51.960159 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:51.960136 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 23:52:51.960552 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:51.960515 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 23:52:51.960671 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:51.960641 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-fhzzn\""
Apr 16 23:52:51.961638 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:51.961618 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 23:52:51.961776 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:51.961662 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 23:52:52.002690 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.002657 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-root\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv"
Apr 16 23:52:52.002868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.002722 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-sys\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv"
Apr 16 23:52:52.002868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.002752 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-node-exporter-accelerators-collector-config\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv"
Apr 16 23:52:52.002868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.002796 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-node-exporter-wtmp\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv"
Apr 16 23:52:52.002868 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.002832 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-metrics-client-ca\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv"
Apr 16 23:52:52.003124 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.002889 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-node-exporter-tls\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv"
Apr 16 23:52:52.003124 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.002906 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59gh4\" (UniqueName: \"kubernetes.io/projected/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-kube-api-access-59gh4\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv"
Apr 16 23:52:52.003124 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.002923 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv"
Apr 16 23:52:52.003124 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.002948 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-node-exporter-textfile\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv"
Apr 16 23:52:52.104036 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.103986 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-sys\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv"
Apr 16 23:52:52.104227 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.104045 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-node-exporter-accelerators-collector-config\") pod \"node-exporter-tjxxv\" (UID: 
\"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.104227 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.104103 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-node-exporter-wtmp\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.104227 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.104119 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-sys\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.104227 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.104131 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-metrics-client-ca\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.104411 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.104286 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-node-exporter-wtmp\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.104701 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.104679 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-node-exporter-tls\") pod \"node-exporter-tjxxv\" (UID: 
\"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.104768 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.104712 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-node-exporter-accelerators-collector-config\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.104768 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.104726 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59gh4\" (UniqueName: \"kubernetes.io/projected/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-kube-api-access-59gh4\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.104768 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.104741 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-metrics-client-ca\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.104924 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.104774 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.104924 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.104815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" 
(UniqueName: \"kubernetes.io/empty-dir/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-node-exporter-textfile\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.104924 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.104844 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-root\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.105068 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.104954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-root\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.105210 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.105189 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-node-exporter-textfile\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.106875 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.106844 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-node-exporter-tls\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.107283 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.107257 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.112885 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.112859 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59gh4\" (UniqueName: \"kubernetes.io/projected/22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2-kube-api-access-59gh4\") pod \"node-exporter-tjxxv\" (UID: \"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2\") " pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.268404 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.268365 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tjxxv" Apr 16 23:52:52.278427 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:52:52.278394 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22ab6cc5_0d79_4ce3_9fee_cea0d3291ab2.slice/crio-964c15bb8fe73eac9a7d89c901eb81d30576339982c49449315e4823b249657d WatchSource:0}: Error finding container 964c15bb8fe73eac9a7d89c901eb81d30576339982c49449315e4823b249657d: Status 404 returned error can't find the container with id 964c15bb8fe73eac9a7d89c901eb81d30576339982c49449315e4823b249657d Apr 16 23:52:52.992009 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.991642 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 23:52:52.995501 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.995479 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:52.998021 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.997827 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 23:52:52.998562 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.998256 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 23:52:52.998562 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.998266 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 23:52:52.998562 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.998435 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 23:52:52.998562 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.998495 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 23:52:52.998562 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.998525 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 23:52:52.998854 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.998746 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-8s72g\"" Apr 16 23:52:52.998905 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.998885 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 23:52:52.998987 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.998971 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 23:52:52.999154 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:52.999131 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 23:52:53.008188 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.008160 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 23:52:53.013347 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.013320 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-config-volume\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.013474 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.013367 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.013474 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.013429 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.013607 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.013482 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" 
(UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.013607 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.013576 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-tls-assets\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.013703 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.013610 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.013703 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.013650 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.013703 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.013689 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-config-out\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.013846 ip-10-0-134-103 
kubenswrapper[2578]: I0416 23:52:53.013738 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.013846 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.013768 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.013846 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.013796 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw6zc\" (UniqueName: \"kubernetes.io/projected/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-kube-api-access-hw6zc\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.013988 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.013845 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.013988 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.013871 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-web-config\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.114601 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.114509 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.114601 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.114557 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.114818 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.114653 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hw6zc\" (UniqueName: \"kubernetes.io/projected/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-kube-api-access-hw6zc\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.114818 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:53.114757 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-alertmanager-trusted-ca-bundle podName:adb0350f-8ad2-441c-b9ae-f4ed5e0f1503 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:53.614736894 +0000 UTC m=+154.389761781 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503") : configmap references non-existent config key: ca-bundle.crt Apr 16 23:52:53.114818 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.114790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.114989 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.114819 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-web-config\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.114989 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.114853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-config-volume\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.114989 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.114885 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.114989 ip-10-0-134-103 
kubenswrapper[2578]: I0416 23:52:53.114926 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.114989 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.114984 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.115240 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.115020 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-tls-assets\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.115240 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.115044 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.115240 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.115086 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.115240 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.115091 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.115240 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.115135 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-config-out\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.115992 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:53.115964 2578 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 23:52:53.116107 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:53.116058 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-main-tls podName:adb0350f-8ad2-441c-b9ae-f4ed5e0f1503 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:53.616038726 +0000 UTC m=+154.391063596 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503") : secret "alertmanager-main-tls" not found Apr 16 23:52:53.118123 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.118022 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-config-volume\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.118495 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.118473 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-web-config\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.118920 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.118880 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-config-out\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.119161 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.119118 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:52:53.119435 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.119358 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:52:53.119435 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.119420 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:52:53.119691 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.119660 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:52:53.120050 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.120030 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:52:53.120147 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.120130 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-tls-assets\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:52:53.122695 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.122671 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw6zc\" (UniqueName: \"kubernetes.io/projected/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-kube-api-access-hw6zc\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:52:53.260733 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.260700 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tjxxv" event={"ID":"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2","Type":"ContainerStarted","Data":"55319d116241df60c9d67e9ce84b5d28dd3c1a038b6d0deb12f55fff85ab2e6d"}
Apr 16 23:52:53.260874 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.260743 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tjxxv" event={"ID":"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2","Type":"ContainerStarted","Data":"964c15bb8fe73eac9a7d89c901eb81d30576339982c49449315e4823b249657d"}
Apr 16 23:52:53.621500 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.621414 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:52:53.621651 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.621603 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:52:53.622226 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.622201 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:52:53.623999 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.623975 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:52:53.907147 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:53.907060 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:52:54.040144 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:54.040111 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 23:52:54.041185 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:52:54.041157 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadb0350f_8ad2_441c_b9ae_f4ed5e0f1503.slice/crio-196ad8259e035b553998470e28f2b2871f419a6a8f65638fb6a0a21577bd49a2 WatchSource:0}: Error finding container 196ad8259e035b553998470e28f2b2871f419a6a8f65638fb6a0a21577bd49a2: Status 404 returned error can't find the container with id 196ad8259e035b553998470e28f2b2871f419a6a8f65638fb6a0a21577bd49a2
Apr 16 23:52:54.264198 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:54.264166 2578 generic.go:358] "Generic (PLEG): container finished" podID="22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2" containerID="55319d116241df60c9d67e9ce84b5d28dd3c1a038b6d0deb12f55fff85ab2e6d" exitCode=0
Apr 16 23:52:54.264382 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:54.264239 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tjxxv" event={"ID":"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2","Type":"ContainerDied","Data":"55319d116241df60c9d67e9ce84b5d28dd3c1a038b6d0deb12f55fff85ab2e6d"}
Apr 16 23:52:54.265255 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:54.265235 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503","Type":"ContainerStarted","Data":"196ad8259e035b553998470e28f2b2871f419a6a8f65638fb6a0a21577bd49a2"}
Apr 16 23:52:55.269186 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:55.269157 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503","Type":"ContainerStarted","Data":"91060771927fb128069cf4796261c4f865112fecabf516eb0c2a5e78fff8e95b"}
Apr 16 23:52:55.270948 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:55.270927 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tjxxv" event={"ID":"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2","Type":"ContainerStarted","Data":"e6102c6abef0178548badbfac0425a8af8009eb5d07166823d8e91a268d1b09b"}
Apr 16 23:52:55.271043 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:55.270958 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tjxxv" event={"ID":"22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2","Type":"ContainerStarted","Data":"976f91a0277f9a7d16883b41e244543d53709047d83802822d92e78c2e0d873b"}
Apr 16 23:52:55.289341 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:55.289295 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tjxxv" podStartSLOduration=3.402122083 podStartE2EDuration="4.289279929s" podCreationTimestamp="2026-04-16 23:52:51 +0000 UTC" firstStartedPulling="2026-04-16 23:52:52.280749274 +0000 UTC m=+153.055774143" lastFinishedPulling="2026-04-16 23:52:53.167907115 +0000 UTC m=+153.942931989" observedRunningTime="2026-04-16 23:52:55.287675767 +0000 UTC m=+156.062700655" watchObservedRunningTime="2026-04-16 23:52:55.289279929 +0000 UTC m=+156.064304814"
Apr 16 23:52:55.641989 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:55.641943 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-xcndm" podUID="2e003096-f002-43cb-9237-3811ca14f285"
Apr 16 23:52:55.653242 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:55.653206 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-w4vbz" podUID="48793279-1866-40db-8e3c-e2c46e4d6f6d"
Apr 16 23:52:56.275842 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.275810 2578 generic.go:358] "Generic (PLEG): container finished" podID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerID="91060771927fb128069cf4796261c4f865112fecabf516eb0c2a5e78fff8e95b" exitCode=0
Apr 16 23:52:56.276283 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.275898 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503","Type":"ContainerDied","Data":"91060771927fb128069cf4796261c4f865112fecabf516eb0c2a5e78fff8e95b"}
Apr 16 23:52:56.276283 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.275950 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xcndm"
Apr 16 23:52:56.327596 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.327562 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"]
Apr 16 23:52:56.330742 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.330727 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.333683 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.333651 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 23:52:56.333818 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.333793 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-jjk8q\""
Apr 16 23:52:56.334379 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.334138 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-8voacf7hcpbnj\""
Apr 16 23:52:56.336035 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.334604 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 23:52:56.336035 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.334822 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 23:52:56.336035 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.335666 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 23:52:56.340662 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.340639 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"]
Apr 16 23:52:56.449262 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.449218 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-secret-metrics-server-tls\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.449262 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.449260 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-client-ca-bundle\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.449518 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.449282 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-metrics-server-audit-profiles\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.449518 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.449334 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-audit-log\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.449518 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.449439 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.449518 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.449488 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-secret-metrics-server-client-certs\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.449716 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.449519 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t96l9\" (UniqueName: \"kubernetes.io/projected/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-kube-api-access-t96l9\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.550725 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.550641 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-secret-metrics-server-tls\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.550725 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.550680 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-client-ca-bundle\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.550725 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.550709 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-metrics-server-audit-profiles\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.550926 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.550740 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-audit-log\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.550971 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.550916 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.551024 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.550976 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-secret-metrics-server-client-certs\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.551024 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.551006 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t96l9\" (UniqueName: \"kubernetes.io/projected/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-kube-api-access-t96l9\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.551230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.551208 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-audit-log\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.551677 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.551649 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.551840 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.551821 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-metrics-server-audit-profiles\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.553249 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.553229 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-client-ca-bundle\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.553385 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.553367 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-secret-metrics-server-tls\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.553438 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.553384 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-secret-metrics-server-client-certs\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.559048 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.559030 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t96l9\" (UniqueName: \"kubernetes.io/projected/8e5ccfea-1770-41d3-bf4c-610f04b0b7e0-kube-api-access-t96l9\") pod \"metrics-server-68ff4cfd57-zmv8h\" (UID: \"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0\") " pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.645253 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.645215 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:52:56.705603 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.705566 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-55j2w"]
Apr 16 23:52:56.710502 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.710480 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-55j2w"
Apr 16 23:52:56.715633 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.715420 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 23:52:56.715633 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.715522 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-55j2w"]
Apr 16 23:52:56.715994 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.715918 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-b5xsj\""
Apr 16 23:52:56.753299 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.753254 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1c04e984-f558-41cc-852c-fd03622e44c3-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-55j2w\" (UID: \"1c04e984-f558-41cc-852c-fd03622e44c3\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-55j2w"
Apr 16 23:52:56.786443 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.786411 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"]
Apr 16 23:52:56.789770 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:52:56.789737 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e5ccfea_1770_41d3_bf4c_610f04b0b7e0.slice/crio-ed9ea63a488d12030e553d06c057e66f9df3d563b455b86259eeeb7dbb792c05 WatchSource:0}: Error finding container ed9ea63a488d12030e553d06c057e66f9df3d563b455b86259eeeb7dbb792c05: Status 404 returned error can't find the container with id ed9ea63a488d12030e553d06c057e66f9df3d563b455b86259eeeb7dbb792c05
Apr 16 23:52:56.822447 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:56.822354 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-4sczf" podUID="bac2109e-d2f6-42aa-94c6-73a79a2012f0"
Apr 16 23:52:56.853842 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:56.853807 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1c04e984-f558-41cc-852c-fd03622e44c3-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-55j2w\" (UID: \"1c04e984-f558-41cc-852c-fd03622e44c3\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-55j2w"
Apr 16 23:52:56.853975 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:56.853951 2578 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 16 23:52:56.854032 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:52:56.854021 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c04e984-f558-41cc-852c-fd03622e44c3-monitoring-plugin-cert podName:1c04e984-f558-41cc-852c-fd03622e44c3 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:57.354004557 +0000 UTC m=+158.129029421 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/1c04e984-f558-41cc-852c-fd03622e44c3-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-55j2w" (UID: "1c04e984-f558-41cc-852c-fd03622e44c3") : secret "monitoring-plugin-cert" not found
Apr 16 23:52:57.279834 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:57.279802 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h" event={"ID":"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0","Type":"ContainerStarted","Data":"ed9ea63a488d12030e553d06c057e66f9df3d563b455b86259eeeb7dbb792c05"}
Apr 16 23:52:57.358783 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:57.358741 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1c04e984-f558-41cc-852c-fd03622e44c3-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-55j2w\" (UID: \"1c04e984-f558-41cc-852c-fd03622e44c3\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-55j2w"
Apr 16 23:52:57.361153 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:57.361130 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1c04e984-f558-41cc-852c-fd03622e44c3-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-55j2w\" (UID: \"1c04e984-f558-41cc-852c-fd03622e44c3\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-55j2w"
Apr 16 23:52:57.625257 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:57.625172 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-55j2w"
Apr 16 23:52:58.477383 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:58.477355 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-55j2w"]
Apr 16 23:52:58.479971 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:52:58.479944 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c04e984_f558_41cc_852c_fd03622e44c3.slice/crio-59f123c1610185e6e3d973723e90ec31e11c11a8d7a7e79a824cd027875948c1 WatchSource:0}: Error finding container 59f123c1610185e6e3d973723e90ec31e11c11a8d7a7e79a824cd027875948c1: Status 404 returned error can't find the container with id 59f123c1610185e6e3d973723e90ec31e11c11a8d7a7e79a824cd027875948c1
Apr 16 23:52:59.287323 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:59.287262 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h" event={"ID":"8e5ccfea-1770-41d3-bf4c-610f04b0b7e0","Type":"ContainerStarted","Data":"eda05c45bdc095b0817a08be1e31edb86103b5a6f3b81d4f6e3ac1d5a1838194"}
Apr 16 23:52:59.289013 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:59.288975 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-55j2w" event={"ID":"1c04e984-f558-41cc-852c-fd03622e44c3","Type":"ContainerStarted","Data":"59f123c1610185e6e3d973723e90ec31e11c11a8d7a7e79a824cd027875948c1"}
Apr 16 23:52:59.292408 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:59.292373 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503","Type":"ContainerStarted","Data":"c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226"}
Apr 16 23:52:59.292564 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:59.292412 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503","Type":"ContainerStarted","Data":"d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f"}
Apr 16 23:52:59.292564 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:59.292427 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503","Type":"ContainerStarted","Data":"14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d"}
Apr 16 23:52:59.292564 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:59.292440 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503","Type":"ContainerStarted","Data":"1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f"}
Apr 16 23:52:59.292564 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:59.292453 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503","Type":"ContainerStarted","Data":"5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292"}
Apr 16 23:52:59.305026 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:52:59.304951 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h" podStartSLOduration=1.742879122 podStartE2EDuration="3.304931672s" podCreationTimestamp="2026-04-16 23:52:56 +0000 UTC" firstStartedPulling="2026-04-16 23:52:56.791704071 +0000 UTC m=+157.566728934" lastFinishedPulling="2026-04-16 23:52:58.353756601 +0000 UTC m=+159.128781484" observedRunningTime="2026-04-16 23:52:59.303236774 +0000 UTC m=+160.078261661" watchObservedRunningTime="2026-04-16 23:52:59.304931672 +0000 UTC m=+160.079956559"
Apr 16 23:53:00.298000 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:00.297964 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503","Type":"ContainerStarted","Data":"ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511"}
Apr 16 23:53:00.299272 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:00.299243 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-55j2w" event={"ID":"1c04e984-f558-41cc-852c-fd03622e44c3","Type":"ContainerStarted","Data":"2b39a2f94a5f8c9fe85a57938473a845c0bce21983827c9a8a6c7fe7675654f4"}
Apr 16 23:53:00.299553 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:00.299518 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-55j2w"
Apr 16 23:53:00.304498 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:00.304479 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-55j2w"
Apr 16 23:53:00.325260 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:00.325214 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.661549836 podStartE2EDuration="8.325198583s" podCreationTimestamp="2026-04-16 23:52:52 +0000 UTC" firstStartedPulling="2026-04-16 23:52:54.043063863 +0000 UTC m=+154.818088726" lastFinishedPulling="2026-04-16 23:52:59.70671261 +0000 UTC m=+160.481737473" observedRunningTime="2026-04-16 23:53:00.322918451 +0000 UTC m=+161.097943338" watchObservedRunningTime="2026-04-16 23:53:00.325198583 +0000 UTC m=+161.100223525"
Apr 16 23:53:00.338884 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:00.338800 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-55j2w" podStartSLOduration=3.071161093 podStartE2EDuration="4.338780566s" podCreationTimestamp="2026-04-16 23:52:56 +0000 UTC" firstStartedPulling="2026-04-16 23:52:58.481787592 +0000 UTC m=+159.256812456" lastFinishedPulling="2026-04-16 23:52:59.749407062 +0000 UTC m=+160.524431929" observedRunningTime="2026-04-16 23:53:00.337700009 +0000 UTC m=+161.112724893" watchObservedRunningTime="2026-04-16 23:53:00.338780566 +0000 UTC m=+161.113805452"
Apr 16 23:53:00.591324 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:00.591219 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert\") pod \"ingress-canary-w4vbz\" (UID: \"48793279-1866-40db-8e3c-e2c46e4d6f6d\") " pod="openshift-ingress-canary/ingress-canary-w4vbz"
Apr 16 23:53:00.591324 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:00.591265 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm"
Apr 16 23:53:00.593698 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:00.593671 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e003096-f002-43cb-9237-3811ca14f285-metrics-tls\") pod \"dns-default-xcndm\" (UID: \"2e003096-f002-43cb-9237-3811ca14f285\") " pod="openshift-dns/dns-default-xcndm"
Apr 16 23:53:00.593800 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:00.593707 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48793279-1866-40db-8e3c-e2c46e4d6f6d-cert\") pod \"ingress-canary-w4vbz\" (UID: \"48793279-1866-40db-8e3c-e2c46e4d6f6d\") " pod="openshift-ingress-canary/ingress-canary-w4vbz"
Apr 16 23:53:00.779630 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:00.779595 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jh2wm\""
Apr 16 23:53:00.787613 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:00.787591 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xcndm"
Apr 16 23:53:00.902308 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:00.902273 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xcndm"]
Apr 16 23:53:00.906732 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:53:00.906700 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e003096_f002_43cb_9237_3811ca14f285.slice/crio-791247bdeeb6cc93f38a907f04384215ebb3531b3d991e2e8bc4923dde9a2940 WatchSource:0}: Error finding container 791247bdeeb6cc93f38a907f04384215ebb3531b3d991e2e8bc4923dde9a2940: Status 404 returned error can't find the container with id 791247bdeeb6cc93f38a907f04384215ebb3531b3d991e2e8bc4923dde9a2940
Apr 16 23:53:01.308819 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:01.308779 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xcndm" event={"ID":"2e003096-f002-43cb-9237-3811ca14f285","Type":"ContainerStarted","Data":"791247bdeeb6cc93f38a907f04384215ebb3531b3d991e2e8bc4923dde9a2940"}
Apr 16 23:53:02.313916 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:02.313842 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xcndm" event={"ID":"2e003096-f002-43cb-9237-3811ca14f285","Type":"ContainerStarted","Data":"6e12aa4964904714a43ffd45a805a1124e76f4701962dac3077cfbc558a3e98b"}
Apr 16 23:53:02.313916 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:02.313878 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xcndm" event={"ID":"2e003096-f002-43cb-9237-3811ca14f285","Type":"ContainerStarted","Data":"d8d4fb8d3a66563aba902d0548dd848669d2eec44320ba22145c2c7c9f7dea08"}
Apr 16 23:53:02.314268 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:02.314014 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-xcndm"
Apr 16 23:53:02.329167 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:02.329118 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xcndm" podStartSLOduration=129.172895577 podStartE2EDuration="2m10.329102672s" podCreationTimestamp="2026-04-16 23:50:52 +0000 UTC" firstStartedPulling="2026-04-16 23:53:00.90873304 +0000 UTC m=+161.683757912" lastFinishedPulling="2026-04-16 23:53:02.064940142 +0000 UTC m=+162.839965007" observedRunningTime="2026-04-16 23:53:02.328123947 +0000 UTC m=+163.103148832" watchObservedRunningTime="2026-04-16 23:53:02.329102672 +0000 UTC m=+163.104127558"
Apr 16 23:53:05.384842 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:05.384808 2578 patch_prober.go:28] interesting pod/image-registry-7748d6467c-lnj85 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 23:53:05.385215 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:05.384859 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7748d6467c-lnj85" podUID="24b53389-f90a-49e4-bddc-da64abb7be4d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 23:53:07.246001 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:07.245964 2578 patch_prober.go:28] interesting pod/image-registry-7748d6467c-lnj85 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 23:53:07.246365 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:07.246026 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7748d6467c-lnj85" podUID="24b53389-f90a-49e4-bddc-da64abb7be4d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 23:53:07.813554 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:07.813499 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w4vbz"
Apr 16 23:53:07.816448 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:07.816427 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6grx4\""
Apr 16 23:53:07.824522 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:07.824505 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w4vbz"
Apr 16 23:53:07.940286 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:07.940264 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w4vbz"]
Apr 16 23:53:07.942743 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:53:07.942715 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48793279_1866_40db_8e3c_e2c46e4d6f6d.slice/crio-7cd20dc7dfc3cb90d531dbcda1f1410d2d42fb4bcb9dc59940b15a225fe2c7cd WatchSource:0}: Error finding container 7cd20dc7dfc3cb90d531dbcda1f1410d2d42fb4bcb9dc59940b15a225fe2c7cd: Status 404 returned error can't find the container with id 7cd20dc7dfc3cb90d531dbcda1f1410d2d42fb4bcb9dc59940b15a225fe2c7cd
Apr 16 23:53:08.333033 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:08.332995 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w4vbz"
event={"ID":"48793279-1866-40db-8e3c-e2c46e4d6f6d","Type":"ContainerStarted","Data":"7cd20dc7dfc3cb90d531dbcda1f1410d2d42fb4bcb9dc59940b15a225fe2c7cd"} Apr 16 23:53:10.341659 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:10.341612 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w4vbz" event={"ID":"48793279-1866-40db-8e3c-e2c46e4d6f6d","Type":"ContainerStarted","Data":"45ebfa5ea92cd5fbb4f76da263514bd79485f50580fea803a053a44481db6324"} Apr 16 23:53:10.357083 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:10.357033 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-w4vbz" podStartSLOduration=136.844641446 podStartE2EDuration="2m18.357020178s" podCreationTimestamp="2026-04-16 23:50:52 +0000 UTC" firstStartedPulling="2026-04-16 23:53:07.944709617 +0000 UTC m=+168.719734480" lastFinishedPulling="2026-04-16 23:53:09.457088332 +0000 UTC m=+170.232113212" observedRunningTime="2026-04-16 23:53:10.354943615 +0000 UTC m=+171.129968490" watchObservedRunningTime="2026-04-16 23:53:10.357020178 +0000 UTC m=+171.132045063" Apr 16 23:53:11.813783 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:11.813743 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf" Apr 16 23:53:12.319129 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:12.319093 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xcndm" Apr 16 23:53:15.083734 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:15.083699 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-88kbv"] Apr 16 23:53:15.086992 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:15.086976 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-88kbv" Apr 16 23:53:15.089496 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:15.089470 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-mtkjh\"" Apr 16 23:53:15.089496 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:15.089484 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 23:53:15.089656 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:15.089520 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 23:53:15.096106 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:15.096086 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-88kbv"] Apr 16 23:53:15.222737 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:15.222701 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzgp2\" (UniqueName: \"kubernetes.io/projected/e3a85d1e-0965-4d09-95ed-fe07834583c7-kube-api-access-bzgp2\") pod \"downloads-6bcc868b7-88kbv\" (UID: \"e3a85d1e-0965-4d09-95ed-fe07834583c7\") " pod="openshift-console/downloads-6bcc868b7-88kbv" Apr 16 23:53:15.323381 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:15.323354 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzgp2\" (UniqueName: \"kubernetes.io/projected/e3a85d1e-0965-4d09-95ed-fe07834583c7-kube-api-access-bzgp2\") pod \"downloads-6bcc868b7-88kbv\" (UID: \"e3a85d1e-0965-4d09-95ed-fe07834583c7\") " pod="openshift-console/downloads-6bcc868b7-88kbv" Apr 16 23:53:15.330854 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:15.330824 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzgp2\" (UniqueName: 
\"kubernetes.io/projected/e3a85d1e-0965-4d09-95ed-fe07834583c7-kube-api-access-bzgp2\") pod \"downloads-6bcc868b7-88kbv\" (UID: \"e3a85d1e-0965-4d09-95ed-fe07834583c7\") " pod="openshift-console/downloads-6bcc868b7-88kbv" Apr 16 23:53:15.384897 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:15.384833 2578 patch_prober.go:28] interesting pod/image-registry-7748d6467c-lnj85 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 23:53:15.384897 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:15.384874 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7748d6467c-lnj85" podUID="24b53389-f90a-49e4-bddc-da64abb7be4d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 23:53:15.397321 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:15.397303 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-88kbv" Apr 16 23:53:15.510313 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:15.510163 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-88kbv"] Apr 16 23:53:15.513025 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:53:15.512993 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3a85d1e_0965_4d09_95ed_fe07834583c7.slice/crio-36fadc5a964bcda9b64555c37a50c55bd8dbef7dac52e57fefd53847ffef10fd WatchSource:0}: Error finding container 36fadc5a964bcda9b64555c37a50c55bd8dbef7dac52e57fefd53847ffef10fd: Status 404 returned error can't find the container with id 36fadc5a964bcda9b64555c37a50c55bd8dbef7dac52e57fefd53847ffef10fd Apr 16 23:53:16.361531 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:16.361490 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-88kbv" event={"ID":"e3a85d1e-0965-4d09-95ed-fe07834583c7","Type":"ContainerStarted","Data":"36fadc5a964bcda9b64555c37a50c55bd8dbef7dac52e57fefd53847ffef10fd"} Apr 16 23:53:16.645694 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:16.645619 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h" Apr 16 23:53:16.645694 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:16.645667 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h" Apr 16 23:53:17.246527 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:17.246499 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7748d6467c-lnj85" Apr 16 23:53:21.028205 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.028157 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6cb9d4d657-whlnw"] 
Apr 16 23:53:21.034650 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.034625 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.037139 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.037115 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 23:53:21.037243 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.037209 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 23:53:21.037385 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.037364 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 23:53:21.037465 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.037432 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 23:53:21.039487 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.039117 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 23:53:21.039742 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.039661 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8qz7b\"" Apr 16 23:53:21.040843 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.040821 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cb9d4d657-whlnw"] Apr 16 23:53:21.174460 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.174428 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-oauth-config\") pod \"console-6cb9d4d657-whlnw\" (UID: 
\"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.174663 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.174480 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-config\") pod \"console-6cb9d4d657-whlnw\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.174663 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.174597 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-oauth-serving-cert\") pod \"console-6cb9d4d657-whlnw\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.174785 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.174661 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-serving-cert\") pod \"console-6cb9d4d657-whlnw\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.174785 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.174694 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-service-ca\") pod \"console-6cb9d4d657-whlnw\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.174785 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.174732 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-x4mq2\" (UniqueName: \"kubernetes.io/projected/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-kube-api-access-x4mq2\") pod \"console-6cb9d4d657-whlnw\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.275865 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.275834 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-oauth-config\") pod \"console-6cb9d4d657-whlnw\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.276032 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.275890 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-config\") pod \"console-6cb9d4d657-whlnw\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.276032 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.275930 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-oauth-serving-cert\") pod \"console-6cb9d4d657-whlnw\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.276032 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.275958 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-serving-cert\") pod \"console-6cb9d4d657-whlnw\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.276032 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.275975 
2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-service-ca\") pod \"console-6cb9d4d657-whlnw\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.276032 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.276005 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4mq2\" (UniqueName: \"kubernetes.io/projected/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-kube-api-access-x4mq2\") pod \"console-6cb9d4d657-whlnw\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.276814 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.276781 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-config\") pod \"console-6cb9d4d657-whlnw\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.276924 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.276793 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-oauth-serving-cert\") pod \"console-6cb9d4d657-whlnw\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.276991 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.276923 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-service-ca\") pod \"console-6cb9d4d657-whlnw\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.279064 ip-10-0-134-103 kubenswrapper[2578]: 
I0416 23:53:21.279009 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-serving-cert\") pod \"console-6cb9d4d657-whlnw\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.279171 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.279060 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-oauth-config\") pod \"console-6cb9d4d657-whlnw\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.284708 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.284681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4mq2\" (UniqueName: \"kubernetes.io/projected/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-kube-api-access-x4mq2\") pod \"console-6cb9d4d657-whlnw\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") " pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.346909 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.346885 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6cb9d4d657-whlnw" Apr 16 23:53:21.479272 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:21.479244 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cb9d4d657-whlnw"] Apr 16 23:53:21.482804 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:53:21.482776 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda8d9d31_0e6c_47ee_829f_b6cad151a3bf.slice/crio-9ea6e316927e83951516a40a95acc84531fa26f827e717932c2ab2cb239ff9b7 WatchSource:0}: Error finding container 9ea6e316927e83951516a40a95acc84531fa26f827e717932c2ab2cb239ff9b7: Status 404 returned error can't find the container with id 9ea6e316927e83951516a40a95acc84531fa26f827e717932c2ab2cb239ff9b7 Apr 16 23:53:22.379756 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:22.379708 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb9d4d657-whlnw" event={"ID":"da8d9d31-0e6c-47ee-829f-b6cad151a3bf","Type":"ContainerStarted","Data":"9ea6e316927e83951516a40a95acc84531fa26f827e717932c2ab2cb239ff9b7"} Apr 16 23:53:31.176378 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.176345 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f8bb65b6b-s8dcp"] Apr 16 23:53:31.181012 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.180988 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:53:31.189824 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.189623 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 23:53:31.191266 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.191243 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f8bb65b6b-s8dcp"] Apr 16 23:53:31.262985 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.262950 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-config\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:53:31.263130 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.262989 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-trusted-ca-bundle\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:53:31.263130 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.263012 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-oauth-config\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:53:31.263130 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.263105 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-serving-cert\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:53:31.263262 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.263165 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-service-ca\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:53:31.263262 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.263183 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-oauth-serving-cert\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:53:31.263262 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.263209 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqtl9\" (UniqueName: \"kubernetes.io/projected/8c715ee4-b5f2-476e-9215-7d90e925a5a7-kube-api-access-cqtl9\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:53:31.364513 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.364485 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-serving-cert\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:53:31.364699 ip-10-0-134-103 kubenswrapper[2578]: I0416 
23:53:31.364674 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-service-ca\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:53:31.364801 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.364713 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-oauth-serving-cert\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:53:31.364801 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.364741 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqtl9\" (UniqueName: \"kubernetes.io/projected/8c715ee4-b5f2-476e-9215-7d90e925a5a7-kube-api-access-cqtl9\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:53:31.364801 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.364786 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-config\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:53:31.364964 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.364823 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-trusted-ca-bundle\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 
16 23:53:31.364964 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.364858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-oauth-config\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:53:31.365415 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.365386 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-service-ca\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:53:31.366017 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.365989 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-oauth-serving-cert\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:53:31.366157 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.366078 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-config\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:53:31.366392 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.366370 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-trusted-ca-bundle\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " 
pod="openshift-console/console-6f8bb65b6b-s8dcp"
Apr 16 23:53:31.367368 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.367344 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-serving-cert\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp"
Apr 16 23:53:31.367943 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.367920 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-oauth-config\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp"
Apr 16 23:53:31.372349 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.372330 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqtl9\" (UniqueName: \"kubernetes.io/projected/8c715ee4-b5f2-476e-9215-7d90e925a5a7-kube-api-access-cqtl9\") pod \"console-6f8bb65b6b-s8dcp\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " pod="openshift-console/console-6f8bb65b6b-s8dcp"
Apr 16 23:53:31.407713 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.407689 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb9d4d657-whlnw" event={"ID":"da8d9d31-0e6c-47ee-829f-b6cad151a3bf","Type":"ContainerStarted","Data":"66b331fe0169d4a60ba8e647326e5fa311fdd911200673c328616969947fced1"}
Apr 16 23:53:31.425766 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.425714 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6cb9d4d657-whlnw" podStartSLOduration=0.605755174 podStartE2EDuration="10.425696661s" podCreationTimestamp="2026-04-16 23:53:21 +0000 UTC" firstStartedPulling="2026-04-16 23:53:21.485160897 +0000 UTC m=+182.260185767" lastFinishedPulling="2026-04-16 23:53:31.305102386 +0000 UTC m=+192.080127254" observedRunningTime="2026-04-16 23:53:31.423749713 +0000 UTC m=+192.198774601" watchObservedRunningTime="2026-04-16 23:53:31.425696661 +0000 UTC m=+192.200721548"
Apr 16 23:53:31.493517 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.493448 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f8bb65b6b-s8dcp"
Apr 16 23:53:31.610701 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:31.610671 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f8bb65b6b-s8dcp"]
Apr 16 23:53:31.613384 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:53:31.613355 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c715ee4_b5f2_476e_9215_7d90e925a5a7.slice/crio-1e087635c74dfa3bfe74d6258b06b3a7991da4e2232507c6b0caae9f66ff64a6 WatchSource:0}: Error finding container 1e087635c74dfa3bfe74d6258b06b3a7991da4e2232507c6b0caae9f66ff64a6: Status 404 returned error can't find the container with id 1e087635c74dfa3bfe74d6258b06b3a7991da4e2232507c6b0caae9f66ff64a6
Apr 16 23:53:32.413293 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:32.413254 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-88kbv" event={"ID":"e3a85d1e-0965-4d09-95ed-fe07834583c7","Type":"ContainerStarted","Data":"4de96424946c36641db6fde39596b0b1e8a9d2e33707336d9f856094f797f184"}
Apr 16 23:53:32.413774 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:32.413396 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-88kbv"
Apr 16 23:53:32.415394 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:32.415365 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f8bb65b6b-s8dcp" event={"ID":"8c715ee4-b5f2-476e-9215-7d90e925a5a7","Type":"ContainerStarted","Data":"5bd1f1129fc0e1977af6fb065abdcd5505868ebf2783d4c2b5e0a0d426a7d1e3"}
Apr 16 23:53:32.415496 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:32.415397 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f8bb65b6b-s8dcp" event={"ID":"8c715ee4-b5f2-476e-9215-7d90e925a5a7","Type":"ContainerStarted","Data":"1e087635c74dfa3bfe74d6258b06b3a7991da4e2232507c6b0caae9f66ff64a6"}
Apr 16 23:53:32.430680 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:32.430634 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-88kbv" podStartSLOduration=1.601741524 podStartE2EDuration="17.43062292s" podCreationTimestamp="2026-04-16 23:53:15 +0000 UTC" firstStartedPulling="2026-04-16 23:53:15.514891938 +0000 UTC m=+176.289916802" lastFinishedPulling="2026-04-16 23:53:31.343773335 +0000 UTC m=+192.118798198" observedRunningTime="2026-04-16 23:53:32.428945164 +0000 UTC m=+193.203970054" watchObservedRunningTime="2026-04-16 23:53:32.43062292 +0000 UTC m=+193.205647805"
Apr 16 23:53:32.442008 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:32.441986 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-88kbv"
Apr 16 23:53:32.448740 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:32.448691 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f8bb65b6b-s8dcp" podStartSLOduration=1.44867767 podStartE2EDuration="1.44867767s" podCreationTimestamp="2026-04-16 23:53:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:53:32.447195706 +0000 UTC m=+193.222220595" watchObservedRunningTime="2026-04-16 23:53:32.44867767 +0000 UTC m=+193.223702556"
Apr 16 23:53:36.651672 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:36.651638 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:53:36.655963 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:36.655934 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-68ff4cfd57-zmv8h"
Apr 16 23:53:40.441947 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:40.441902 2578 generic.go:358] "Generic (PLEG): container finished" podID="63de64d5-ece3-4665-8f9c-1e5bc54f3018" containerID="e50aee50d1b11fe98d0b02060f40902ac495e8c8f25cd2ca71e24c4b2eef6801" exitCode=0
Apr 16 23:53:40.441947 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:40.441938 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh" event={"ID":"63de64d5-ece3-4665-8f9c-1e5bc54f3018","Type":"ContainerDied","Data":"e50aee50d1b11fe98d0b02060f40902ac495e8c8f25cd2ca71e24c4b2eef6801"}
Apr 16 23:53:40.442526 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:40.442306 2578 scope.go:117] "RemoveContainer" containerID="e50aee50d1b11fe98d0b02060f40902ac495e8c8f25cd2ca71e24c4b2eef6801"
Apr 16 23:53:41.347381 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:41.347346 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6cb9d4d657-whlnw"
Apr 16 23:53:41.347381 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:41.347387 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6cb9d4d657-whlnw"
Apr 16 23:53:41.351608 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:41.351587 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6cb9d4d657-whlnw"
Apr 16 23:53:41.446476 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:41.446445 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zclxh" event={"ID":"63de64d5-ece3-4665-8f9c-1e5bc54f3018","Type":"ContainerStarted","Data":"b27cc9bd77bca49af9607474f2f4844b70b6fe03f3f73d00a293557d51694e9a"}
Apr 16 23:53:41.450185 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:41.450164 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6cb9d4d657-whlnw"
Apr 16 23:53:41.494436 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:41.494411 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f8bb65b6b-s8dcp"
Apr 16 23:53:41.494578 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:41.494450 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6f8bb65b6b-s8dcp"
Apr 16 23:53:41.500441 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:41.500422 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f8bb65b6b-s8dcp"
Apr 16 23:53:42.454203 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:42.454171 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f8bb65b6b-s8dcp"
Apr 16 23:53:42.498403 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:42.498367 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cb9d4d657-whlnw"]
Apr 16 23:53:46.814942 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:46.814906 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_adb0350f-8ad2-441c-b9ae-f4ed5e0f1503/init-config-reloader/0.log"
Apr 16 23:53:46.820337 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:46.820318 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_adb0350f-8ad2-441c-b9ae-f4ed5e0f1503/alertmanager/0.log"
Apr 16 23:53:46.978069 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:46.978047 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_adb0350f-8ad2-441c-b9ae-f4ed5e0f1503/config-reloader/0.log"
Apr 16 23:53:47.178190 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:47.178130 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_adb0350f-8ad2-441c-b9ae-f4ed5e0f1503/kube-rbac-proxy-web/0.log"
Apr 16 23:53:47.377891 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:47.377866 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_adb0350f-8ad2-441c-b9ae-f4ed5e0f1503/kube-rbac-proxy/0.log"
Apr 16 23:53:47.578519 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:47.578477 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_adb0350f-8ad2-441c-b9ae-f4ed5e0f1503/kube-rbac-proxy-metric/0.log"
Apr 16 23:53:47.778225 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:47.778198 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_adb0350f-8ad2-441c-b9ae-f4ed5e0f1503/prom-label-proxy/0.log"
Apr 16 23:53:47.979680 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:47.979624 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-v78q2_675389d4-0616-4e3c-8d9d-a1d6f5247035/cluster-monitoring-operator/0.log"
Apr 16 23:53:48.777633 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:48.777604 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-68ff4cfd57-zmv8h_8e5ccfea-1770-41d3-bf4c-610f04b0b7e0/metrics-server/0.log"
Apr 16 23:53:48.978239 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:48.978210 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-55j2w_1c04e984-f558-41cc-852c-fd03622e44c3/monitoring-plugin/0.log"
Apr 16 23:53:49.777596 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:49.777572 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tjxxv_22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2/init-textfile/0.log"
Apr 16 23:53:49.979764 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:49.979739 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tjxxv_22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2/node-exporter/0.log"
Apr 16 23:53:50.178797 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:50.178730 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tjxxv_22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2/kube-rbac-proxy/0.log"
Apr 16 23:53:53.377577 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:53.377532 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-w4ff9_fe885835-2420-443f-9c28-6fae79714fb1/prometheus-operator-admission-webhook/0.log"
Apr 16 23:53:55.377583 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:55.377557 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-wwg7j_67418dd9-9c9a-4599-849e-9013809fd4d0/networking-console-plugin/0.log"
Apr 16 23:53:55.977855 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:55.977831 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cb9d4d657-whlnw_da8d9d31-0e6c-47ee-829f-b6cad151a3bf/console/0.log"
Apr 16 23:53:56.178028 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:56.178003 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f8bb65b6b-s8dcp_8c715ee4-b5f2-476e-9215-7d90e925a5a7/console/0.log"
Apr 16 23:53:56.379587 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:53:56.379562 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-88kbv_e3a85d1e-0965-4d09-95ed-fe07834583c7/download-server/0.log"
Apr 16 23:54:08.472111 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.472062 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6cb9d4d657-whlnw" podUID="da8d9d31-0e6c-47ee-829f-b6cad151a3bf" containerName="console" containerID="cri-o://66b331fe0169d4a60ba8e647326e5fa311fdd911200673c328616969947fced1" gracePeriod=15
Apr 16 23:54:08.730790 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.730738 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cb9d4d657-whlnw_da8d9d31-0e6c-47ee-829f-b6cad151a3bf/console/0.log"
Apr 16 23:54:08.730889 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.730805 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cb9d4d657-whlnw"
Apr 16 23:54:08.876571 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.876530 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-serving-cert\") pod \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") "
Apr 16 23:54:08.876690 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.876629 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-oauth-config\") pod \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") "
Apr 16 23:54:08.876690 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.876656 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-config\") pod \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") "
Apr 16 23:54:08.876690 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.876680 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4mq2\" (UniqueName: \"kubernetes.io/projected/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-kube-api-access-x4mq2\") pod \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") "
Apr 16 23:54:08.876839 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.876785 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-service-ca\") pod \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") "
Apr 16 23:54:08.876888 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.876842 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-oauth-serving-cert\") pod \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\" (UID: \"da8d9d31-0e6c-47ee-829f-b6cad151a3bf\") "
Apr 16 23:54:08.877111 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.877079 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-config" (OuterVolumeSpecName: "console-config") pod "da8d9d31-0e6c-47ee-829f-b6cad151a3bf" (UID: "da8d9d31-0e6c-47ee-829f-b6cad151a3bf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:54:08.877111 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.877101 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-service-ca" (OuterVolumeSpecName: "service-ca") pod "da8d9d31-0e6c-47ee-829f-b6cad151a3bf" (UID: "da8d9d31-0e6c-47ee-829f-b6cad151a3bf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:54:08.877259 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.877185 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "da8d9d31-0e6c-47ee-829f-b6cad151a3bf" (UID: "da8d9d31-0e6c-47ee-829f-b6cad151a3bf"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:54:08.878762 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.878744 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "da8d9d31-0e6c-47ee-829f-b6cad151a3bf" (UID: "da8d9d31-0e6c-47ee-829f-b6cad151a3bf"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:54:08.879118 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.879102 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-kube-api-access-x4mq2" (OuterVolumeSpecName: "kube-api-access-x4mq2") pod "da8d9d31-0e6c-47ee-829f-b6cad151a3bf" (UID: "da8d9d31-0e6c-47ee-829f-b6cad151a3bf"). InnerVolumeSpecName "kube-api-access-x4mq2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:54:08.879194 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.879111 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "da8d9d31-0e6c-47ee-829f-b6cad151a3bf" (UID: "da8d9d31-0e6c-47ee-829f-b6cad151a3bf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:54:08.977818 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.977796 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-service-ca\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:54:08.977818 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.977816 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-oauth-serving-cert\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:54:08.977952 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.977826 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-serving-cert\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:54:08.977952 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.977835 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-oauth-config\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:54:08.977952 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.977844 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-console-config\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:54:08.977952 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:08.977853 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x4mq2\" (UniqueName: \"kubernetes.io/projected/da8d9d31-0e6c-47ee-829f-b6cad151a3bf-kube-api-access-x4mq2\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:54:09.530413 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:09.530381 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cb9d4d657-whlnw_da8d9d31-0e6c-47ee-829f-b6cad151a3bf/console/0.log"
Apr 16 23:54:09.530881 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:09.530422 2578 generic.go:358] "Generic (PLEG): container finished" podID="da8d9d31-0e6c-47ee-829f-b6cad151a3bf" containerID="66b331fe0169d4a60ba8e647326e5fa311fdd911200673c328616969947fced1" exitCode=2
Apr 16 23:54:09.530881 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:09.530481 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cb9d4d657-whlnw"
Apr 16 23:54:09.530881 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:09.530512 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb9d4d657-whlnw" event={"ID":"da8d9d31-0e6c-47ee-829f-b6cad151a3bf","Type":"ContainerDied","Data":"66b331fe0169d4a60ba8e647326e5fa311fdd911200673c328616969947fced1"}
Apr 16 23:54:09.530881 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:09.530564 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb9d4d657-whlnw" event={"ID":"da8d9d31-0e6c-47ee-829f-b6cad151a3bf","Type":"ContainerDied","Data":"9ea6e316927e83951516a40a95acc84531fa26f827e717932c2ab2cb239ff9b7"}
Apr 16 23:54:09.530881 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:09.530581 2578 scope.go:117] "RemoveContainer" containerID="66b331fe0169d4a60ba8e647326e5fa311fdd911200673c328616969947fced1"
Apr 16 23:54:09.538662 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:09.538639 2578 scope.go:117] "RemoveContainer" containerID="66b331fe0169d4a60ba8e647326e5fa311fdd911200673c328616969947fced1"
Apr 16 23:54:09.538917 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:54:09.538888 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66b331fe0169d4a60ba8e647326e5fa311fdd911200673c328616969947fced1\": container with ID starting with 66b331fe0169d4a60ba8e647326e5fa311fdd911200673c328616969947fced1 not found: ID does not exist" containerID="66b331fe0169d4a60ba8e647326e5fa311fdd911200673c328616969947fced1"
Apr 16 23:54:09.538990 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:09.538928 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66b331fe0169d4a60ba8e647326e5fa311fdd911200673c328616969947fced1"} err="failed to get container status \"66b331fe0169d4a60ba8e647326e5fa311fdd911200673c328616969947fced1\": rpc error: code = NotFound desc = could not find container \"66b331fe0169d4a60ba8e647326e5fa311fdd911200673c328616969947fced1\": container with ID starting with 66b331fe0169d4a60ba8e647326e5fa311fdd911200673c328616969947fced1 not found: ID does not exist"
Apr 16 23:54:09.549889 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:09.549870 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cb9d4d657-whlnw"]
Apr 16 23:54:09.553285 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:09.553269 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6cb9d4d657-whlnw"]
Apr 16 23:54:09.816863 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:09.816780 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da8d9d31-0e6c-47ee-829f-b6cad151a3bf" path="/var/lib/kubelet/pods/da8d9d31-0e6c-47ee-829f-b6cad151a3bf/volumes"
Apr 16 23:54:12.206899 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:12.206863 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 23:54:12.207333 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:12.207280 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="alertmanager" containerID="cri-o://5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292" gracePeriod=120
Apr 16 23:54:12.207475 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:12.207343 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="kube-rbac-proxy-web" containerID="cri-o://14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d" gracePeriod=120
Apr 16 23:54:12.207475 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:12.207391 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="kube-rbac-proxy" containerID="cri-o://d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f" gracePeriod=120
Apr 16 23:54:12.207475 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:12.207390 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="config-reloader" containerID="cri-o://1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f" gracePeriod=120
Apr 16 23:54:12.207475 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:12.207339 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="kube-rbac-proxy-metric" containerID="cri-o://c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226" gracePeriod=120
Apr 16 23:54:12.207475 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:12.207400 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="prom-label-proxy" containerID="cri-o://ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511" gracePeriod=120
Apr 16 23:54:12.542397 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:12.542370 2578 generic.go:358] "Generic (PLEG): container finished" podID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerID="ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511" exitCode=0
Apr 16 23:54:12.542397 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:12.542394 2578 generic.go:358] "Generic (PLEG): container finished" podID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerID="d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f" exitCode=0
Apr 16 23:54:12.542397 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:12.542402 2578 generic.go:358] "Generic (PLEG): container finished" podID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerID="1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f" exitCode=0
Apr 16 23:54:12.542593 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:12.542408 2578 generic.go:358] "Generic (PLEG): container finished" podID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerID="5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292" exitCode=0
Apr 16 23:54:12.542593 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:12.542437 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503","Type":"ContainerDied","Data":"ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511"}
Apr 16 23:54:12.542593 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:12.542467 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503","Type":"ContainerDied","Data":"d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f"}
Apr 16 23:54:12.542593 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:12.542479 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503","Type":"ContainerDied","Data":"1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f"}
Apr 16 23:54:12.542593 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:12.542491 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503","Type":"ContainerDied","Data":"5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292"}
Apr 16 23:54:13.458297 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.458276 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:13.547692 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.547616 2578 generic.go:358] "Generic (PLEG): container finished" podID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerID="c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226" exitCode=0
Apr 16 23:54:13.547692 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.547640 2578 generic.go:358] "Generic (PLEG): container finished" podID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerID="14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d" exitCode=0
Apr 16 23:54:13.547861 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.547707 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503","Type":"ContainerDied","Data":"c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226"}
Apr 16 23:54:13.547861 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.547722 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:13.547861 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.547742 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503","Type":"ContainerDied","Data":"14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d"}
Apr 16 23:54:13.547861 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.547753 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503","Type":"ContainerDied","Data":"196ad8259e035b553998470e28f2b2871f419a6a8f65638fb6a0a21577bd49a2"}
Apr 16 23:54:13.547861 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.547767 2578 scope.go:117] "RemoveContainer" containerID="ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511"
Apr 16 23:54:13.556234 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.556204 2578 scope.go:117] "RemoveContainer" containerID="c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226"
Apr 16 23:54:13.562675 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.562658 2578 scope.go:117] "RemoveContainer" containerID="d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f"
Apr 16 23:54:13.568635 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.568608 2578 scope.go:117] "RemoveContainer" containerID="14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d"
Apr 16 23:54:13.574469 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.574454 2578 scope.go:117] "RemoveContainer" containerID="1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f"
Apr 16 23:54:13.580387 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.580371 2578 scope.go:117] "RemoveContainer" containerID="5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292"
Apr 16 23:54:13.586249 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.586231 2578 scope.go:117] "RemoveContainer" containerID="91060771927fb128069cf4796261c4f865112fecabf516eb0c2a5e78fff8e95b"
Apr 16 23:54:13.592230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.592212 2578 scope.go:117] "RemoveContainer" containerID="ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511"
Apr 16 23:54:13.592485 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:54:13.592467 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511\": container with ID starting with ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511 not found: ID does not exist" containerID="ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511"
Apr 16 23:54:13.592577 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.592492 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511"} err="failed to get container status \"ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511\": rpc error: code = NotFound desc = could not find container \"ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511\": container with ID starting with ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511 not found: ID does not exist"
Apr 16 23:54:13.592577 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.592510 2578 scope.go:117] "RemoveContainer" containerID="c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226"
Apr 16 23:54:13.592761 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:54:13.592743 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226\": container with ID starting with c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226 not found: ID does not exist" containerID="c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226"
Apr 16 23:54:13.592804 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.592767 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226"} err="failed to get container status \"c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226\": rpc error: code = NotFound desc = could not find container \"c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226\": container with ID starting with c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226 not found: ID does not exist"
Apr 16 23:54:13.592804 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.592781 2578 scope.go:117] "RemoveContainer" containerID="d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f"
Apr 16 23:54:13.592999 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:54:13.592985 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f\": container with ID starting with d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f not found: ID does not exist" containerID="d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f"
Apr 16 23:54:13.593034 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.593002 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f"} err="failed to get container status \"d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f\": rpc error: code = NotFound desc = could not find container \"d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f\": container with ID starting with d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f not found: ID does not exist"
Apr 16 23:54:13.593034 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.593014 2578 scope.go:117] "RemoveContainer" containerID="14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d"
Apr 16 23:54:13.593234 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:54:13.593218 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d\": container with ID starting with 14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d not found: ID does not exist" containerID="14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d"
Apr 16 23:54:13.593268 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.593239 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d"} err="failed to get container status \"14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d\": rpc error: code = NotFound desc = could not find container \"14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d\": container with ID starting with 14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d not found: ID does not exist"
Apr 16 23:54:13.593268 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.593254 2578 scope.go:117] "RemoveContainer" containerID="1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f"
Apr 16 23:54:13.593483 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:54:13.593457 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f\": container with ID starting with 1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f not found: ID does not exist" containerID="1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f" Apr 16
23:54:13.593527 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.593491 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f"} err="failed to get container status \"1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f\": rpc error: code = NotFound desc = could not find container \"1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f\": container with ID starting with 1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f not found: ID does not exist" Apr 16 23:54:13.593527 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.593507 2578 scope.go:117] "RemoveContainer" containerID="5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292" Apr 16 23:54:13.593811 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:54:13.593791 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292\": container with ID starting with 5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292 not found: ID does not exist" containerID="5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292" Apr 16 23:54:13.593864 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.593819 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292"} err="failed to get container status \"5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292\": rpc error: code = NotFound desc = could not find container \"5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292\": container with ID starting with 5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292 not found: ID does not exist" Apr 16 23:54:13.593864 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.593833 2578 scope.go:117] 
"RemoveContainer" containerID="91060771927fb128069cf4796261c4f865112fecabf516eb0c2a5e78fff8e95b" Apr 16 23:54:13.594021 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:54:13.594005 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91060771927fb128069cf4796261c4f865112fecabf516eb0c2a5e78fff8e95b\": container with ID starting with 91060771927fb128069cf4796261c4f865112fecabf516eb0c2a5e78fff8e95b not found: ID does not exist" containerID="91060771927fb128069cf4796261c4f865112fecabf516eb0c2a5e78fff8e95b" Apr 16 23:54:13.594061 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.594024 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91060771927fb128069cf4796261c4f865112fecabf516eb0c2a5e78fff8e95b"} err="failed to get container status \"91060771927fb128069cf4796261c4f865112fecabf516eb0c2a5e78fff8e95b\": rpc error: code = NotFound desc = could not find container \"91060771927fb128069cf4796261c4f865112fecabf516eb0c2a5e78fff8e95b\": container with ID starting with 91060771927fb128069cf4796261c4f865112fecabf516eb0c2a5e78fff8e95b not found: ID does not exist" Apr 16 23:54:13.594061 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.594036 2578 scope.go:117] "RemoveContainer" containerID="ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511" Apr 16 23:54:13.594249 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.594231 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511"} err="failed to get container status \"ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511\": rpc error: code = NotFound desc = could not find container \"ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511\": container with ID starting with ae9ef72809a3b8f03ba969f29503df476d33e60ed2d6844dce1af0c89894e511 not found: ID does not 
exist" Apr 16 23:54:13.594291 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.594250 2578 scope.go:117] "RemoveContainer" containerID="c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226" Apr 16 23:54:13.594462 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.594443 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226"} err="failed to get container status \"c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226\": rpc error: code = NotFound desc = could not find container \"c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226\": container with ID starting with c22ca418592b0587d053e419fbd955ac4d15c3a338d21d3e75226c3f37e61226 not found: ID does not exist" Apr 16 23:54:13.594531 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.594465 2578 scope.go:117] "RemoveContainer" containerID="d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f" Apr 16 23:54:13.594694 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.594677 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f"} err="failed to get container status \"d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f\": rpc error: code = NotFound desc = could not find container \"d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f\": container with ID starting with d9a371464b66041d711891c71c361cef211a9e2b754f285ffeb5635ed2a73c9f not found: ID does not exist" Apr 16 23:54:13.594748 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.594693 2578 scope.go:117] "RemoveContainer" containerID="14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d" Apr 16 23:54:13.594899 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.594878 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d"} err="failed to get container status \"14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d\": rpc error: code = NotFound desc = could not find container \"14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d\": container with ID starting with 14f4e84a384aa9f5a785963f123c0970df8b959120081c7fbbac4ebc4a4b2a6d not found: ID does not exist" Apr 16 23:54:13.594899 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.594897 2578 scope.go:117] "RemoveContainer" containerID="1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f" Apr 16 23:54:13.595122 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.595107 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f"} err="failed to get container status \"1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f\": rpc error: code = NotFound desc = could not find container \"1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f\": container with ID starting with 1bd7919fb7b8181e3c3efa9acab47ca746bfc61c95ab4d413bd354c3f2595f2f not found: ID does not exist" Apr 16 23:54:13.595172 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.595122 2578 scope.go:117] "RemoveContainer" containerID="5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292" Apr 16 23:54:13.595327 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.595311 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292"} err="failed to get container status \"5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292\": rpc error: code = NotFound desc = could not find container \"5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292\": container with ID starting with 
5409e7f842a5db4dbddc8a42f0894400ee85924724095a6e2c62d19a56be2292 not found: ID does not exist" Apr 16 23:54:13.595374 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.595327 2578 scope.go:117] "RemoveContainer" containerID="91060771927fb128069cf4796261c4f865112fecabf516eb0c2a5e78fff8e95b" Apr 16 23:54:13.595523 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.595505 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91060771927fb128069cf4796261c4f865112fecabf516eb0c2a5e78fff8e95b"} err="failed to get container status \"91060771927fb128069cf4796261c4f865112fecabf516eb0c2a5e78fff8e95b\": rpc error: code = NotFound desc = could not find container \"91060771927fb128069cf4796261c4f865112fecabf516eb0c2a5e78fff8e95b\": container with ID starting with 91060771927fb128069cf4796261c4f865112fecabf516eb0c2a5e78fff8e95b not found: ID does not exist" Apr 16 23:54:13.614796 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.614774 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-cluster-tls-config\") pod \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " Apr 16 23:54:13.614871 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.614801 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy\") pod \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " Apr 16 23:54:13.614871 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.614820 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-alertmanager-trusted-ca-bundle\") pod \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " Apr 16 23:54:13.614871 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.614844 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-config-out\") pod \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " Apr 16 23:54:13.614871 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.614865 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-metrics-client-ca\") pod \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " Apr 16 23:54:13.615107 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.614890 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw6zc\" (UniqueName: \"kubernetes.io/projected/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-kube-api-access-hw6zc\") pod \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " Apr 16 23:54:13.615107 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.614922 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-web-config\") pod \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " Apr 16 23:54:13.615107 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.614950 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy-web\") pod \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " Apr 16 23:54:13.615107 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.614988 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-config-volume\") pod \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " Apr 16 23:54:13.615107 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.615037 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-alertmanager-main-db\") pod \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " Apr 16 23:54:13.615107 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.615072 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-tls-assets\") pod \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " Apr 16 23:54:13.615403 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.615113 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-main-tls\") pod \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " Apr 16 23:54:13.615403 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.615141 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy-metric\") pod \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\" (UID: \"adb0350f-8ad2-441c-b9ae-f4ed5e0f1503\") " Apr 16 23:54:13.615403 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.615238 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" (UID: "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:54:13.615403 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.615242 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" (UID: "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:54:13.615632 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.615438 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:13.615632 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.615457 2578 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-metrics-client-ca\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:13.615834 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.615798 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" (UID: "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:54:13.617659 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.617636 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" (UID: "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:13.618100 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.617961 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" (UID: "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:54:13.618100 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.618064 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-kube-api-access-hw6zc" (OuterVolumeSpecName: "kube-api-access-hw6zc") pod "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" (UID: "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503"). InnerVolumeSpecName "kube-api-access-hw6zc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:54:13.618375 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.618342 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-config-volume" (OuterVolumeSpecName: "config-volume") pod "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" (UID: "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:13.618818 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.618792 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-config-out" (OuterVolumeSpecName: "config-out") pod "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" (UID: "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:54:13.619120 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.619089 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" (UID: "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:13.619410 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.619395 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" (UID: "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:13.619773 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.619749 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" (UID: "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:13.622470 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.622447 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" (UID: "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503"). 
InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:13.628631 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.628608 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-web-config" (OuterVolumeSpecName: "web-config") pod "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" (UID: "adb0350f-8ad2-441c-b9ae-f4ed5e0f1503"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:13.715946 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.715918 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-tls-assets\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:13.715946 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.715941 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-main-tls\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:13.716094 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.715953 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:13.716094 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.715962 2578 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-cluster-tls-config\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:13.716094 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.715972 2578 reconciler_common.go:299] "Volume detached for 
volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:13.716094 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.715983 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-config-out\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:13.716094 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.715991 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hw6zc\" (UniqueName: \"kubernetes.io/projected/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-kube-api-access-hw6zc\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:13.716094 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.715999 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-web-config\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:13.716094 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.716008 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:13.716094 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.716016 2578 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-config-volume\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:13.716094 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.716024 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503-alertmanager-main-db\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:13.866919 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.866802 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 23:54:13.870757 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.870735 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 23:54:13.898239 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898218 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 23:54:13.898479 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898466 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="alertmanager" Apr 16 23:54:13.898523 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898481 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="alertmanager" Apr 16 23:54:13.898523 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898495 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="config-reloader" Apr 16 23:54:13.898523 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898500 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="config-reloader" Apr 16 23:54:13.898523 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898510 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="init-config-reloader" Apr 16 23:54:13.898523 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898515 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="init-config-reloader"
Apr 16 23:54:13.898523 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898521 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="kube-rbac-proxy-web"
Apr 16 23:54:13.898715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898527 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="kube-rbac-proxy-web"
Apr 16 23:54:13.898715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898533 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da8d9d31-0e6c-47ee-829f-b6cad151a3bf" containerName="console"
Apr 16 23:54:13.898715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898552 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="da8d9d31-0e6c-47ee-829f-b6cad151a3bf" containerName="console"
Apr 16 23:54:13.898715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898566 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="kube-rbac-proxy"
Apr 16 23:54:13.898715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898572 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="kube-rbac-proxy"
Apr 16 23:54:13.898715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898579 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="prom-label-proxy"
Apr 16 23:54:13.898715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898584 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="prom-label-proxy"
Apr 16 23:54:13.898715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898589 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="kube-rbac-proxy-metric"
Apr 16 23:54:13.898715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898594 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="kube-rbac-proxy-metric"
Apr 16 23:54:13.898715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898636 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="config-reloader"
Apr 16 23:54:13.898715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898643 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="prom-label-proxy"
Apr 16 23:54:13.898715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898651 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="kube-rbac-proxy-web"
Apr 16 23:54:13.898715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898658 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="kube-rbac-proxy"
Apr 16 23:54:13.898715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898664 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="alertmanager"
Apr 16 23:54:13.898715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898670 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="da8d9d31-0e6c-47ee-829f-b6cad151a3bf" containerName="console"
Apr 16 23:54:13.898715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.898676 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" containerName="kube-rbac-proxy-metric"
Apr 16 23:54:13.924856 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.924837 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 23:54:13.924943 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.924935 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:13.927647 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.927486 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 23:54:13.927647 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.927492 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 23:54:13.927647 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.927588 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 23:54:13.927647 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.927493 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 23:54:13.927647 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.927638 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 23:54:13.927936 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.927906 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-8s72g\""
Apr 16 23:54:13.927995 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.927941 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 23:54:13.928045 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.928008 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 23:54:13.928099 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.928076 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 23:54:13.932914 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:13.932894 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 23:54:14.017644 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.017621 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.017780 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.017650 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.017780 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.017673 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.017780 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.017696 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-config-out\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.017780 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.017748 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.017931 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.017805 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-web-config\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.017931 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.017839 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-config-volume\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.017931 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.017857 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.018028 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.017932 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.018028 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.017955 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.018028 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.017971 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.018028 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.017987 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.018028 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.018008 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b694\" (UniqueName: \"kubernetes.io/projected/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-kube-api-access-6b694\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.118513 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.118456 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.118513 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.118490 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.118513 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.118510 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.118764 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.118557 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.118764 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.118584 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6b694\" (UniqueName: \"kubernetes.io/projected/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-kube-api-access-6b694\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.118764 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.118623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.118764 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.118652 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.118980 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.118957 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.119042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.119020 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-config-out\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.119112 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.119049 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.119302 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.119281 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.119377 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.119050 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.119431 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.119368 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-web-config\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.119431 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.119411 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-config-volume\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.119563 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.119436 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.119563 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.119490 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.121585 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.121557 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-config-out\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.121585 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.121560 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.121753 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.121683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.121753 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.121728 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.121882 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.121863 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.122005 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.121985 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.122176 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.122159 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-web-config\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.122340 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.122325 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.123335 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.123320 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-config-volume\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.126046 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.126029 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b694\" (UniqueName: \"kubernetes.io/projected/eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e-kube-api-access-6b694\") pod \"alertmanager-main-0\" (UID: \"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.234398 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.234375 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 23:54:14.359860 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.359803 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 23:54:14.362194 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:54:14.362162 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb2d28f9_8f9b_4833_b6b1_ff0e0e34458e.slice/crio-5e363b17daff86796393249069408de1d26bd39d0733143853219e186b67aa5f WatchSource:0}: Error finding container 5e363b17daff86796393249069408de1d26bd39d0733143853219e186b67aa5f: Status 404 returned error can't find the container with id 5e363b17daff86796393249069408de1d26bd39d0733143853219e186b67aa5f
Apr 16 23:54:14.556895 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.556863 2578 generic.go:358] "Generic (PLEG): container finished" podID="eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e" containerID="df7b5af7e1c312cb6f6c5c23fa5eebe54c007101ead05a91f422f3e08a7d57ea" exitCode=0
Apr 16 23:54:14.557174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.556953 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e","Type":"ContainerDied","Data":"df7b5af7e1c312cb6f6c5c23fa5eebe54c007101ead05a91f422f3e08a7d57ea"}
Apr 16 23:54:14.557174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:14.556979 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e","Type":"ContainerStarted","Data":"5e363b17daff86796393249069408de1d26bd39d0733143853219e186b67aa5f"}
Apr 16 23:54:15.086073 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.086043 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5bcbddfb6c-6ffgl"]
Apr 16 23:54:15.120501 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.120475 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bcbddfb6c-6ffgl"]
Apr 16 23:54:15.120660 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.120616 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.229141 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.229106 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-serving-cert\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.229141 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.229147 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-config\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.229268 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.229230 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-trusted-ca-bundle\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.229359 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.229343 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-service-ca\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.229390 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.229368 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-oauth-serving-cert\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.229422 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.229385 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ln8n\" (UniqueName: \"kubernetes.io/projected/2168ed9f-ae27-4544-ae23-ce14d7d6f640-kube-api-access-6ln8n\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.229457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.229423 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-oauth-config\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.330053 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.330025 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-service-ca\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.330053 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.330056 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-oauth-serving-cert\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.330244 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.330074 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ln8n\" (UniqueName: \"kubernetes.io/projected/2168ed9f-ae27-4544-ae23-ce14d7d6f640-kube-api-access-6ln8n\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.330244 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.330099 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-oauth-config\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.330244 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.330155 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-serving-cert\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.330244 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.330180 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-config\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.330244 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.330223 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-trusted-ca-bundle\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.330800 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.330763 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-service-ca\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.330921 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.330866 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-oauth-serving-cert\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.330921 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.330903 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-config\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.331234 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.331212 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-trusted-ca-bundle\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.332615 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.332594 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-serving-cert\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.332704 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.332665 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-oauth-config\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.336766 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.336745 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ln8n\" (UniqueName: \"kubernetes.io/projected/2168ed9f-ae27-4544-ae23-ce14d7d6f640-kube-api-access-6ln8n\") pod \"console-5bcbddfb6c-6ffgl\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") " pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.428958 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.428932 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:15.559460 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.559435 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bcbddfb6c-6ffgl"]
Apr 16 23:54:15.561987 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:54:15.561962 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2168ed9f_ae27_4544_ae23_ce14d7d6f640.slice/crio-ff8f2d16e5954a14492c40259c616460d4aea8451983937cc1fd8aef03e1938e WatchSource:0}: Error finding container ff8f2d16e5954a14492c40259c616460d4aea8451983937cc1fd8aef03e1938e: Status 404 returned error can't find the container with id ff8f2d16e5954a14492c40259c616460d4aea8451983937cc1fd8aef03e1938e
Apr 16 23:54:15.562855 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.562830 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e","Type":"ContainerStarted","Data":"d07102f68f018c5edc2f61806179288f3fb2d08f11e4197a5bbd30ec8b2878bb"}
Apr 16 23:54:15.562911 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.562864 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e","Type":"ContainerStarted","Data":"c795c21f3d40b89723ecaf704e335c0af922e618c6a1a556512a0c6eab3e2445"}
Apr 16 23:54:15.562911 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.562878 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e","Type":"ContainerStarted","Data":"3af30db4b66abfbe28d982b1e93ff7ba5cb4c721daedbc3e7cdc248ee9e62dd0"}
Apr 16 23:54:15.562911 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.562893 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e","Type":"ContainerStarted","Data":"81e786d206b6dc42c587755900bed6e306c830299b20ff8002f9a3d510f6a10b"}
Apr 16 23:54:15.562911 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.562907 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e","Type":"ContainerStarted","Data":"2e62a0e550f5ec5651f4f13cded1dff5bd58b03ee62a276bb595c461cff1500a"}
Apr 16 23:54:15.563042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.562918 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e","Type":"ContainerStarted","Data":"3ed0fd36bfd8ca8d0e2c9863511ae46e78817b3a4e483df912e79dc2cc173372"}
Apr 16 23:54:15.586981 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.586944 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.586930909 podStartE2EDuration="2.586930909s" podCreationTimestamp="2026-04-16 23:54:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:54:15.584725804 +0000 UTC m=+236.359750690" watchObservedRunningTime="2026-04-16 23:54:15.586930909 +0000 UTC m=+236.361955840"
Apr 16 23:54:15.819312 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:15.819280 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adb0350f-8ad2-441c-b9ae-f4ed5e0f1503" path="/var/lib/kubelet/pods/adb0350f-8ad2-441c-b9ae-f4ed5e0f1503/volumes"
Apr 16 23:54:16.567508 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:16.567475 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bcbddfb6c-6ffgl" event={"ID":"2168ed9f-ae27-4544-ae23-ce14d7d6f640","Type":"ContainerStarted","Data":"022f43ce3cd6298ad8e626afaf6f8e96952a9cf75dd9869b3c604e3d7a72e10f"}
Apr 16 23:54:16.567508 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:16.567508 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bcbddfb6c-6ffgl" event={"ID":"2168ed9f-ae27-4544-ae23-ce14d7d6f640","Type":"ContainerStarted","Data":"ff8f2d16e5954a14492c40259c616460d4aea8451983937cc1fd8aef03e1938e"}
Apr 16 23:54:16.583859 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:16.583810 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5bcbddfb6c-6ffgl" podStartSLOduration=1.5837971130000001 podStartE2EDuration="1.583797113s" podCreationTimestamp="2026-04-16 23:54:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:54:16.582643517 +0000 UTC m=+237.357668401" watchObservedRunningTime="2026-04-16 23:54:16.583797113 +0000 UTC m=+237.358821999"
Apr 16 23:54:25.429314 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:25.429264 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:25.429314 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:25.429313 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:25.434079 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:25.434054 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:25.596258 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:25.596229 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:54:25.642502 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:25.642466
2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f8bb65b6b-s8dcp"] Apr 16 23:54:31.552285 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:31.552199 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs\") pod \"network-metrics-daemon-4sczf\" (UID: \"bac2109e-d2f6-42aa-94c6-73a79a2012f0\") " pod="openshift-multus/network-metrics-daemon-4sczf" Apr 16 23:54:31.554351 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:31.554331 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bac2109e-d2f6-42aa-94c6-73a79a2012f0-metrics-certs\") pod \"network-metrics-daemon-4sczf\" (UID: \"bac2109e-d2f6-42aa-94c6-73a79a2012f0\") " pod="openshift-multus/network-metrics-daemon-4sczf" Apr 16 23:54:31.616706 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:31.616679 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xjj7v\"" Apr 16 23:54:31.624619 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:31.624602 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sczf" Apr 16 23:54:31.738130 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:31.738094 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4sczf"] Apr 16 23:54:31.740956 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:54:31.740929 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbac2109e_d2f6_42aa_94c6_73a79a2012f0.slice/crio-a92993bd1ef7c669b0faab6fffc7f5e2eabad53e432a76da07e9ed02dcd240a8 WatchSource:0}: Error finding container a92993bd1ef7c669b0faab6fffc7f5e2eabad53e432a76da07e9ed02dcd240a8: Status 404 returned error can't find the container with id a92993bd1ef7c669b0faab6fffc7f5e2eabad53e432a76da07e9ed02dcd240a8 Apr 16 23:54:32.616809 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:32.616770 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4sczf" event={"ID":"bac2109e-d2f6-42aa-94c6-73a79a2012f0","Type":"ContainerStarted","Data":"a92993bd1ef7c669b0faab6fffc7f5e2eabad53e432a76da07e9ed02dcd240a8"} Apr 16 23:54:33.628791 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:33.628752 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4sczf" event={"ID":"bac2109e-d2f6-42aa-94c6-73a79a2012f0","Type":"ContainerStarted","Data":"7c7cfe8f7e547864fc35b849170bde43396330d2fea9b7c35de30effe74a3e05"} Apr 16 23:54:33.628791 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:33.628793 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4sczf" event={"ID":"bac2109e-d2f6-42aa-94c6-73a79a2012f0","Type":"ContainerStarted","Data":"9239b1789b6f34d385aa9a89befe4ffd37c1df6c4bbc3d2f2ef25e97c8508345"} Apr 16 23:54:33.643792 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:33.643746 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-4sczf" podStartSLOduration=253.793589194 podStartE2EDuration="4m14.643733728s" podCreationTimestamp="2026-04-16 23:50:19 +0000 UTC" firstStartedPulling="2026-04-16 23:54:31.742758227 +0000 UTC m=+252.517783090" lastFinishedPulling="2026-04-16 23:54:32.592902759 +0000 UTC m=+253.367927624" observedRunningTime="2026-04-16 23:54:33.642359696 +0000 UTC m=+254.417384582" watchObservedRunningTime="2026-04-16 23:54:33.643733728 +0000 UTC m=+254.418758614" Apr 16 23:54:50.663141 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:50.663100 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6f8bb65b6b-s8dcp" podUID="8c715ee4-b5f2-476e-9215-7d90e925a5a7" containerName="console" containerID="cri-o://5bd1f1129fc0e1977af6fb065abdcd5505868ebf2783d4c2b5e0a0d426a7d1e3" gracePeriod=15 Apr 16 23:54:50.898942 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:50.898921 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f8bb65b6b-s8dcp_8c715ee4-b5f2-476e-9215-7d90e925a5a7/console/0.log" Apr 16 23:54:50.899055 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:50.898980 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:54:50.987199 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:50.987126 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqtl9\" (UniqueName: \"kubernetes.io/projected/8c715ee4-b5f2-476e-9215-7d90e925a5a7-kube-api-access-cqtl9\") pod \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " Apr 16 23:54:50.987199 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:50.987163 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-oauth-config\") pod \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " Apr 16 23:54:50.987392 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:50.987207 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-service-ca\") pod \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " Apr 16 23:54:50.987392 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:50.987229 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-trusted-ca-bundle\") pod \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " Apr 16 23:54:50.987392 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:50.987259 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-oauth-serving-cert\") pod \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " Apr 16 23:54:50.987392 
ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:50.987292 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-serving-cert\") pod \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " Apr 16 23:54:50.987392 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:50.987325 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-config\") pod \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\" (UID: \"8c715ee4-b5f2-476e-9215-7d90e925a5a7\") " Apr 16 23:54:50.987854 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:50.987786 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-config" (OuterVolumeSpecName: "console-config") pod "8c715ee4-b5f2-476e-9215-7d90e925a5a7" (UID: "8c715ee4-b5f2-476e-9215-7d90e925a5a7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:54:50.987946 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:50.987829 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-service-ca" (OuterVolumeSpecName: "service-ca") pod "8c715ee4-b5f2-476e-9215-7d90e925a5a7" (UID: "8c715ee4-b5f2-476e-9215-7d90e925a5a7"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:54:50.988007 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:50.987950 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8c715ee4-b5f2-476e-9215-7d90e925a5a7" (UID: "8c715ee4-b5f2-476e-9215-7d90e925a5a7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:54:50.988136 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:50.988113 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8c715ee4-b5f2-476e-9215-7d90e925a5a7" (UID: "8c715ee4-b5f2-476e-9215-7d90e925a5a7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:54:50.989358 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:50.989332 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c715ee4-b5f2-476e-9215-7d90e925a5a7-kube-api-access-cqtl9" (OuterVolumeSpecName: "kube-api-access-cqtl9") pod "8c715ee4-b5f2-476e-9215-7d90e925a5a7" (UID: "8c715ee4-b5f2-476e-9215-7d90e925a5a7"). InnerVolumeSpecName "kube-api-access-cqtl9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:54:50.989500 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:50.989476 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8c715ee4-b5f2-476e-9215-7d90e925a5a7" (UID: "8c715ee4-b5f2-476e-9215-7d90e925a5a7"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:50.989575 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:50.989506 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8c715ee4-b5f2-476e-9215-7d90e925a5a7" (UID: "8c715ee4-b5f2-476e-9215-7d90e925a5a7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:51.088047 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.088025 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-service-ca\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:51.088047 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.088048 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-trusted-ca-bundle\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:51.088186 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.088058 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-oauth-serving-cert\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:51.088186 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.088068 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-serving-cert\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:51.088186 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.088078 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-config\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:51.088186 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.088087 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cqtl9\" (UniqueName: \"kubernetes.io/projected/8c715ee4-b5f2-476e-9215-7d90e925a5a7-kube-api-access-cqtl9\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:51.088186 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.088096 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c715ee4-b5f2-476e-9215-7d90e925a5a7-console-oauth-config\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:54:51.683413 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.683387 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f8bb65b6b-s8dcp_8c715ee4-b5f2-476e-9215-7d90e925a5a7/console/0.log" Apr 16 23:54:51.683789 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.683424 2578 generic.go:358] "Generic (PLEG): container finished" podID="8c715ee4-b5f2-476e-9215-7d90e925a5a7" containerID="5bd1f1129fc0e1977af6fb065abdcd5505868ebf2783d4c2b5e0a0d426a7d1e3" exitCode=2 Apr 16 23:54:51.683789 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.683463 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f8bb65b6b-s8dcp" event={"ID":"8c715ee4-b5f2-476e-9215-7d90e925a5a7","Type":"ContainerDied","Data":"5bd1f1129fc0e1977af6fb065abdcd5505868ebf2783d4c2b5e0a0d426a7d1e3"} Apr 16 23:54:51.683789 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.683485 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f8bb65b6b-s8dcp" event={"ID":"8c715ee4-b5f2-476e-9215-7d90e925a5a7","Type":"ContainerDied","Data":"1e087635c74dfa3bfe74d6258b06b3a7991da4e2232507c6b0caae9f66ff64a6"} Apr 16 23:54:51.683789 
ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.683484 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f8bb65b6b-s8dcp" Apr 16 23:54:51.683789 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.683497 2578 scope.go:117] "RemoveContainer" containerID="5bd1f1129fc0e1977af6fb065abdcd5505868ebf2783d4c2b5e0a0d426a7d1e3" Apr 16 23:54:51.691610 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.691595 2578 scope.go:117] "RemoveContainer" containerID="5bd1f1129fc0e1977af6fb065abdcd5505868ebf2783d4c2b5e0a0d426a7d1e3" Apr 16 23:54:51.691839 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:54:51.691817 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bd1f1129fc0e1977af6fb065abdcd5505868ebf2783d4c2b5e0a0d426a7d1e3\": container with ID starting with 5bd1f1129fc0e1977af6fb065abdcd5505868ebf2783d4c2b5e0a0d426a7d1e3 not found: ID does not exist" containerID="5bd1f1129fc0e1977af6fb065abdcd5505868ebf2783d4c2b5e0a0d426a7d1e3" Apr 16 23:54:51.691903 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.691846 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bd1f1129fc0e1977af6fb065abdcd5505868ebf2783d4c2b5e0a0d426a7d1e3"} err="failed to get container status \"5bd1f1129fc0e1977af6fb065abdcd5505868ebf2783d4c2b5e0a0d426a7d1e3\": rpc error: code = NotFound desc = could not find container \"5bd1f1129fc0e1977af6fb065abdcd5505868ebf2783d4c2b5e0a0d426a7d1e3\": container with ID starting with 5bd1f1129fc0e1977af6fb065abdcd5505868ebf2783d4c2b5e0a0d426a7d1e3 not found: ID does not exist" Apr 16 23:54:51.704110 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.704084 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f8bb65b6b-s8dcp"] Apr 16 23:54:51.707410 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.707393 2578 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-6f8bb65b6b-s8dcp"] Apr 16 23:54:51.817240 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:54:51.817215 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c715ee4-b5f2-476e-9215-7d90e925a5a7" path="/var/lib/kubelet/pods/8c715ee4-b5f2-476e-9215-7d90e925a5a7/volumes" Apr 16 23:55:08.206764 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.206723 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf"] Apr 16 23:55:08.207220 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.207148 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c715ee4-b5f2-476e-9215-7d90e925a5a7" containerName="console" Apr 16 23:55:08.207220 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.207166 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c715ee4-b5f2-476e-9215-7d90e925a5a7" containerName="console" Apr 16 23:55:08.207338 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.207261 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8c715ee4-b5f2-476e-9215-7d90e925a5a7" containerName="console" Apr 16 23:55:08.211720 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.211697 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" Apr 16 23:55:08.214241 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.214222 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wp595\"" Apr 16 23:55:08.214331 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.214257 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 23:55:08.215138 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.215124 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 23:55:08.217587 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.217568 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf"] Apr 16 23:55:08.306950 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.306918 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf\" (UID: \"04e57d9d-7637-496d-8dca-b0ac77ce6ca7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" Apr 16 23:55:08.307064 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.306952 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65q8x\" (UniqueName: \"kubernetes.io/projected/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-kube-api-access-65q8x\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf\" (UID: \"04e57d9d-7637-496d-8dca-b0ac77ce6ca7\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" Apr 16 23:55:08.307064 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.307006 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf\" (UID: \"04e57d9d-7637-496d-8dca-b0ac77ce6ca7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" Apr 16 23:55:08.407667 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.407641 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf\" (UID: \"04e57d9d-7637-496d-8dca-b0ac77ce6ca7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" Apr 16 23:55:08.407797 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.407784 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf\" (UID: \"04e57d9d-7637-496d-8dca-b0ac77ce6ca7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" Apr 16 23:55:08.407840 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.407807 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65q8x\" (UniqueName: \"kubernetes.io/projected/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-kube-api-access-65q8x\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf\" (UID: \"04e57d9d-7637-496d-8dca-b0ac77ce6ca7\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" Apr 16 23:55:08.408002 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.407984 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf\" (UID: \"04e57d9d-7637-496d-8dca-b0ac77ce6ca7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" Apr 16 23:55:08.408129 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.408110 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf\" (UID: \"04e57d9d-7637-496d-8dca-b0ac77ce6ca7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" Apr 16 23:55:08.417881 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.417855 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65q8x\" (UniqueName: \"kubernetes.io/projected/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-kube-api-access-65q8x\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf\" (UID: \"04e57d9d-7637-496d-8dca-b0ac77ce6ca7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" Apr 16 23:55:08.521996 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.521975 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" Apr 16 23:55:08.636146 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.636123 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf"] Apr 16 23:55:08.638253 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:55:08.638221 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04e57d9d_7637_496d_8dca_b0ac77ce6ca7.slice/crio-e4d8711539c35644b853730655c6d10a2ff8dae3518ec69449017454c5133d26 WatchSource:0}: Error finding container e4d8711539c35644b853730655c6d10a2ff8dae3518ec69449017454c5133d26: Status 404 returned error can't find the container with id e4d8711539c35644b853730655c6d10a2ff8dae3518ec69449017454c5133d26 Apr 16 23:55:08.731812 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:08.731781 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" event={"ID":"04e57d9d-7637-496d-8dca-b0ac77ce6ca7","Type":"ContainerStarted","Data":"e4d8711539c35644b853730655c6d10a2ff8dae3518ec69449017454c5133d26"} Apr 16 23:55:14.750032 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:14.749992 2578 generic.go:358] "Generic (PLEG): container finished" podID="04e57d9d-7637-496d-8dca-b0ac77ce6ca7" containerID="bec1fb9a655bb6be072f071d837929f819ef45813d523bf19730d46e83d1fec7" exitCode=0 Apr 16 23:55:14.750431 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:14.750076 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" event={"ID":"04e57d9d-7637-496d-8dca-b0ac77ce6ca7","Type":"ContainerDied","Data":"bec1fb9a655bb6be072f071d837929f819ef45813d523bf19730d46e83d1fec7"} Apr 16 23:55:17.760977 ip-10-0-134-103 kubenswrapper[2578]: 
I0416 23:55:17.760941 2578 generic.go:358] "Generic (PLEG): container finished" podID="04e57d9d-7637-496d-8dca-b0ac77ce6ca7" containerID="d71b1b607295adfb17e89897b490494821632f7b8747ccdc3b1be3dee7f2763f" exitCode=0 Apr 16 23:55:17.761363 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:17.761036 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" event={"ID":"04e57d9d-7637-496d-8dca-b0ac77ce6ca7","Type":"ContainerDied","Data":"d71b1b607295adfb17e89897b490494821632f7b8747ccdc3b1be3dee7f2763f"} Apr 16 23:55:19.739785 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:19.739750 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log" Apr 16 23:55:19.740248 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:19.739988 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log" Apr 16 23:55:19.747375 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:19.747356 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 23:55:24.782335 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:24.782302 2578 generic.go:358] "Generic (PLEG): container finished" podID="04e57d9d-7637-496d-8dca-b0ac77ce6ca7" containerID="d48f2d4a496522a498b44069d1967a870743c104c2b08a13e85882ed8d068263" exitCode=0 Apr 16 23:55:24.782335 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:24.782339 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" event={"ID":"04e57d9d-7637-496d-8dca-b0ac77ce6ca7","Type":"ContainerDied","Data":"d48f2d4a496522a498b44069d1967a870743c104c2b08a13e85882ed8d068263"} Apr 16 23:55:25.906587 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:25.906565 2578 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" Apr 16 23:55:26.055699 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:26.055620 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65q8x\" (UniqueName: \"kubernetes.io/projected/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-kube-api-access-65q8x\") pod \"04e57d9d-7637-496d-8dca-b0ac77ce6ca7\" (UID: \"04e57d9d-7637-496d-8dca-b0ac77ce6ca7\") " Apr 16 23:55:26.055699 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:26.055654 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-bundle\") pod \"04e57d9d-7637-496d-8dca-b0ac77ce6ca7\" (UID: \"04e57d9d-7637-496d-8dca-b0ac77ce6ca7\") " Apr 16 23:55:26.055913 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:26.055713 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-util\") pod \"04e57d9d-7637-496d-8dca-b0ac77ce6ca7\" (UID: \"04e57d9d-7637-496d-8dca-b0ac77ce6ca7\") " Apr 16 23:55:26.056261 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:26.056235 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-bundle" (OuterVolumeSpecName: "bundle") pod "04e57d9d-7637-496d-8dca-b0ac77ce6ca7" (UID: "04e57d9d-7637-496d-8dca-b0ac77ce6ca7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:55:26.057717 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:26.057684 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-kube-api-access-65q8x" (OuterVolumeSpecName: "kube-api-access-65q8x") pod "04e57d9d-7637-496d-8dca-b0ac77ce6ca7" (UID: "04e57d9d-7637-496d-8dca-b0ac77ce6ca7"). InnerVolumeSpecName "kube-api-access-65q8x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:55:26.059498 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:26.059477 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-util" (OuterVolumeSpecName: "util") pod "04e57d9d-7637-496d-8dca-b0ac77ce6ca7" (UID: "04e57d9d-7637-496d-8dca-b0ac77ce6ca7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:55:26.156792 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:26.156757 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-65q8x\" (UniqueName: \"kubernetes.io/projected/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-kube-api-access-65q8x\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:55:26.156792 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:26.156789 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-bundle\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:55:26.156792 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:26.156799 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04e57d9d-7637-496d-8dca-b0ac77ce6ca7-util\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:55:26.789902 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:26.789869 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" event={"ID":"04e57d9d-7637-496d-8dca-b0ac77ce6ca7","Type":"ContainerDied","Data":"e4d8711539c35644b853730655c6d10a2ff8dae3518ec69449017454c5133d26"} Apr 16 23:55:26.789902 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:26.789890 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m4knf" Apr 16 23:55:26.789902 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:26.789905 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4d8711539c35644b853730655c6d10a2ff8dae3518ec69449017454c5133d26" Apr 16 23:55:30.504485 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.504451 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-kd5x5"] Apr 16 23:55:30.504846 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.504812 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04e57d9d-7637-496d-8dca-b0ac77ce6ca7" containerName="pull" Apr 16 23:55:30.504846 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.504824 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e57d9d-7637-496d-8dca-b0ac77ce6ca7" containerName="pull" Apr 16 23:55:30.504846 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.504837 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04e57d9d-7637-496d-8dca-b0ac77ce6ca7" containerName="util" Apr 16 23:55:30.504846 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.504843 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e57d9d-7637-496d-8dca-b0ac77ce6ca7" containerName="util" Apr 16 23:55:30.504996 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.504870 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="04e57d9d-7637-496d-8dca-b0ac77ce6ca7" containerName="extract" Apr 16 23:55:30.504996 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.504876 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e57d9d-7637-496d-8dca-b0ac77ce6ca7" containerName="extract" Apr 16 23:55:30.504996 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.504921 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="04e57d9d-7637-496d-8dca-b0ac77ce6ca7" containerName="extract" Apr 16 23:55:30.514637 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.514615 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-kd5x5" Apr 16 23:55:30.517649 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.517617 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-qfgsw\"" Apr 16 23:55:30.517760 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.517664 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 16 23:55:30.517860 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.517822 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 16 23:55:30.523387 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.523293 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-kd5x5"] Apr 16 23:55:30.689249 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.689221 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/42ab0e22-d533-4007-8d6d-5138d12e8d07-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-kd5x5\" (UID: 
\"42ab0e22-d533-4007-8d6d-5138d12e8d07\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-kd5x5" Apr 16 23:55:30.689389 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.689254 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8zhn\" (UniqueName: \"kubernetes.io/projected/42ab0e22-d533-4007-8d6d-5138d12e8d07-kube-api-access-g8zhn\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-kd5x5\" (UID: \"42ab0e22-d533-4007-8d6d-5138d12e8d07\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-kd5x5" Apr 16 23:55:30.790331 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.790272 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/42ab0e22-d533-4007-8d6d-5138d12e8d07-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-kd5x5\" (UID: \"42ab0e22-d533-4007-8d6d-5138d12e8d07\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-kd5x5" Apr 16 23:55:30.790331 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.790307 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8zhn\" (UniqueName: \"kubernetes.io/projected/42ab0e22-d533-4007-8d6d-5138d12e8d07-kube-api-access-g8zhn\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-kd5x5\" (UID: \"42ab0e22-d533-4007-8d6d-5138d12e8d07\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-kd5x5" Apr 16 23:55:30.790730 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.790713 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/42ab0e22-d533-4007-8d6d-5138d12e8d07-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-kd5x5\" (UID: \"42ab0e22-d533-4007-8d6d-5138d12e8d07\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-kd5x5" Apr 16 23:55:30.800652 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.800629 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8zhn\" (UniqueName: \"kubernetes.io/projected/42ab0e22-d533-4007-8d6d-5138d12e8d07-kube-api-access-g8zhn\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-kd5x5\" (UID: \"42ab0e22-d533-4007-8d6d-5138d12e8d07\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-kd5x5" Apr 16 23:55:30.826632 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.826611 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-kd5x5" Apr 16 23:55:30.952876 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.952843 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-kd5x5"] Apr 16 23:55:30.956343 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:55:30.956313 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42ab0e22_d533_4007_8d6d_5138d12e8d07.slice/crio-5301a7b1168b24aee35d39f7c7eec57824a3c9a5afede80d2fdc7558939fa76d WatchSource:0}: Error finding container 5301a7b1168b24aee35d39f7c7eec57824a3c9a5afede80d2fdc7558939fa76d: Status 404 returned error can't find the container with id 5301a7b1168b24aee35d39f7c7eec57824a3c9a5afede80d2fdc7558939fa76d Apr 16 23:55:30.959311 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:30.959295 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:55:31.805712 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:31.805673 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-kd5x5" event={"ID":"42ab0e22-d533-4007-8d6d-5138d12e8d07","Type":"ContainerStarted","Data":"5301a7b1168b24aee35d39f7c7eec57824a3c9a5afede80d2fdc7558939fa76d"} Apr 16 23:55:34.816247 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:34.816216 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-kd5x5" event={"ID":"42ab0e22-d533-4007-8d6d-5138d12e8d07","Type":"ContainerStarted","Data":"080486a5f554ea85e9d75cbf9b2d3be1bc9fb5d2b82255502749dfa1693c65f7"} Apr 16 23:55:34.834869 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:34.834829 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-kd5x5" podStartSLOduration=1.9953697830000001 podStartE2EDuration="4.834815275s" podCreationTimestamp="2026-04-16 23:55:30 +0000 UTC" firstStartedPulling="2026-04-16 23:55:30.959442819 +0000 UTC m=+311.734467683" lastFinishedPulling="2026-04-16 23:55:33.79888831 +0000 UTC m=+314.573913175" observedRunningTime="2026-04-16 23:55:34.833105252 +0000 UTC m=+315.608130148" watchObservedRunningTime="2026-04-16 23:55:34.834815275 +0000 UTC m=+315.609840161" Apr 16 23:55:36.357593 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.357560 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc"] Apr 16 23:55:36.361017 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.361000 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" Apr 16 23:55:36.363580 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.363527 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 23:55:36.363716 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.363627 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 23:55:36.363716 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.363638 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wp595\"" Apr 16 23:55:36.368660 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.368639 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc"] Apr 16 23:55:36.537810 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.537774 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/842d5c0a-b711-4e56-b40a-e8d59265ed12-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc\" (UID: \"842d5c0a-b711-4e56-b40a-e8d59265ed12\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" Apr 16 23:55:36.537939 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.537821 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cptfw\" (UniqueName: \"kubernetes.io/projected/842d5c0a-b711-4e56-b40a-e8d59265ed12-kube-api-access-cptfw\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc\" (UID: \"842d5c0a-b711-4e56-b40a-e8d59265ed12\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" Apr 16 23:55:36.537939 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.537906 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/842d5c0a-b711-4e56-b40a-e8d59265ed12-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc\" (UID: \"842d5c0a-b711-4e56-b40a-e8d59265ed12\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" Apr 16 23:55:36.638393 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.638328 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/842d5c0a-b711-4e56-b40a-e8d59265ed12-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc\" (UID: \"842d5c0a-b711-4e56-b40a-e8d59265ed12\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" Apr 16 23:55:36.638393 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.638363 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cptfw\" (UniqueName: \"kubernetes.io/projected/842d5c0a-b711-4e56-b40a-e8d59265ed12-kube-api-access-cptfw\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc\" (UID: \"842d5c0a-b711-4e56-b40a-e8d59265ed12\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" Apr 16 23:55:36.638562 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.638398 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/842d5c0a-b711-4e56-b40a-e8d59265ed12-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc\" (UID: \"842d5c0a-b711-4e56-b40a-e8d59265ed12\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" Apr 16 23:55:36.638684 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.638664 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/842d5c0a-b711-4e56-b40a-e8d59265ed12-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc\" (UID: \"842d5c0a-b711-4e56-b40a-e8d59265ed12\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" Apr 16 23:55:36.638731 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.638701 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/842d5c0a-b711-4e56-b40a-e8d59265ed12-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc\" (UID: \"842d5c0a-b711-4e56-b40a-e8d59265ed12\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" Apr 16 23:55:36.654034 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.654003 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cptfw\" (UniqueName: \"kubernetes.io/projected/842d5c0a-b711-4e56-b40a-e8d59265ed12-kube-api-access-cptfw\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc\" (UID: \"842d5c0a-b711-4e56-b40a-e8d59265ed12\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" Apr 16 23:55:36.670835 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.670807 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" Apr 16 23:55:36.788506 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.788481 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc"] Apr 16 23:55:36.790630 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:55:36.790600 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod842d5c0a_b711_4e56_b40a_e8d59265ed12.slice/crio-5c49bcfcd4f4dded6b88a3313d7a3e165752168ee7e4e94ad9dc03460da7fcc9 WatchSource:0}: Error finding container 5c49bcfcd4f4dded6b88a3313d7a3e165752168ee7e4e94ad9dc03460da7fcc9: Status 404 returned error can't find the container with id 5c49bcfcd4f4dded6b88a3313d7a3e165752168ee7e4e94ad9dc03460da7fcc9 Apr 16 23:55:36.822411 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:36.822388 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" event={"ID":"842d5c0a-b711-4e56-b40a-e8d59265ed12","Type":"ContainerStarted","Data":"5c49bcfcd4f4dded6b88a3313d7a3e165752168ee7e4e94ad9dc03460da7fcc9"} Apr 16 23:55:37.826148 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:37.826115 2578 generic.go:358] "Generic (PLEG): container finished" podID="842d5c0a-b711-4e56-b40a-e8d59265ed12" containerID="139532e01bbfe921d230071532061a3d8b69dbd58f491fe8d5e526c7e6b81c41" exitCode=0 Apr 16 23:55:37.826516 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:37.826207 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" event={"ID":"842d5c0a-b711-4e56-b40a-e8d59265ed12","Type":"ContainerDied","Data":"139532e01bbfe921d230071532061a3d8b69dbd58f491fe8d5e526c7e6b81c41"} Apr 16 23:55:38.957294 ip-10-0-134-103 kubenswrapper[2578]: 
I0416 23:55:38.957266 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-7kbh2"] Apr 16 23:55:38.960243 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:38.960226 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-7kbh2" Apr 16 23:55:38.962639 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:38.962616 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 23:55:38.963875 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:38.963848 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 23:55:38.963990 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:38.963905 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-gwpck\"" Apr 16 23:55:38.966361 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:38.966324 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-7kbh2"] Apr 16 23:55:39.056818 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:39.056782 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l59zh\" (UniqueName: \"kubernetes.io/projected/2edbd4d4-81a2-4c07-93b4-c2b53d61995d-kube-api-access-l59zh\") pod \"cert-manager-webhook-597b96b99b-7kbh2\" (UID: \"2edbd4d4-81a2-4c07-93b4-c2b53d61995d\") " pod="cert-manager/cert-manager-webhook-597b96b99b-7kbh2" Apr 16 23:55:39.056987 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:39.056930 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2edbd4d4-81a2-4c07-93b4-c2b53d61995d-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-7kbh2\" (UID: 
\"2edbd4d4-81a2-4c07-93b4-c2b53d61995d\") " pod="cert-manager/cert-manager-webhook-597b96b99b-7kbh2" Apr 16 23:55:39.158285 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:39.158243 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2edbd4d4-81a2-4c07-93b4-c2b53d61995d-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-7kbh2\" (UID: \"2edbd4d4-81a2-4c07-93b4-c2b53d61995d\") " pod="cert-manager/cert-manager-webhook-597b96b99b-7kbh2" Apr 16 23:55:39.158448 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:39.158329 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l59zh\" (UniqueName: \"kubernetes.io/projected/2edbd4d4-81a2-4c07-93b4-c2b53d61995d-kube-api-access-l59zh\") pod \"cert-manager-webhook-597b96b99b-7kbh2\" (UID: \"2edbd4d4-81a2-4c07-93b4-c2b53d61995d\") " pod="cert-manager/cert-manager-webhook-597b96b99b-7kbh2" Apr 16 23:55:39.166418 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:39.166389 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2edbd4d4-81a2-4c07-93b4-c2b53d61995d-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-7kbh2\" (UID: \"2edbd4d4-81a2-4c07-93b4-c2b53d61995d\") " pod="cert-manager/cert-manager-webhook-597b96b99b-7kbh2" Apr 16 23:55:39.166641 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:39.166620 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l59zh\" (UniqueName: \"kubernetes.io/projected/2edbd4d4-81a2-4c07-93b4-c2b53d61995d-kube-api-access-l59zh\") pod \"cert-manager-webhook-597b96b99b-7kbh2\" (UID: \"2edbd4d4-81a2-4c07-93b4-c2b53d61995d\") " pod="cert-manager/cert-manager-webhook-597b96b99b-7kbh2" Apr 16 23:55:39.277822 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:39.277792 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-7kbh2" Apr 16 23:55:39.783730 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:39.783706 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-7kbh2"] Apr 16 23:55:39.788775 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:55:39.788745 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2edbd4d4_81a2_4c07_93b4_c2b53d61995d.slice/crio-fa9d936d0cbfd3ba10eca9ef14f8d652768f4a8020f69e5efd4d0132161446cd WatchSource:0}: Error finding container fa9d936d0cbfd3ba10eca9ef14f8d652768f4a8020f69e5efd4d0132161446cd: Status 404 returned error can't find the container with id fa9d936d0cbfd3ba10eca9ef14f8d652768f4a8020f69e5efd4d0132161446cd Apr 16 23:55:39.834321 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:39.834279 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" event={"ID":"842d5c0a-b711-4e56-b40a-e8d59265ed12","Type":"ContainerStarted","Data":"a09255a896a7308ea69b4d2e0f0e9898d418de5bcfee4758ed1722855ef1e11e"} Apr 16 23:55:39.835553 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:39.835505 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-7kbh2" event={"ID":"2edbd4d4-81a2-4c07-93b4-c2b53d61995d","Type":"ContainerStarted","Data":"fa9d936d0cbfd3ba10eca9ef14f8d652768f4a8020f69e5efd4d0132161446cd"} Apr 16 23:55:40.840711 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:40.840671 2578 generic.go:358] "Generic (PLEG): container finished" podID="842d5c0a-b711-4e56-b40a-e8d59265ed12" containerID="a09255a896a7308ea69b4d2e0f0e9898d418de5bcfee4758ed1722855ef1e11e" exitCode=0 Apr 16 23:55:40.841104 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:40.840726 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" event={"ID":"842d5c0a-b711-4e56-b40a-e8d59265ed12","Type":"ContainerDied","Data":"a09255a896a7308ea69b4d2e0f0e9898d418de5bcfee4758ed1722855ef1e11e"} Apr 16 23:55:41.845831 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:41.845794 2578 generic.go:358] "Generic (PLEG): container finished" podID="842d5c0a-b711-4e56-b40a-e8d59265ed12" containerID="42ca00e8154b0fd80fd59773133cb722363d99770ba9107d1e609582f7bc57fc" exitCode=0 Apr 16 23:55:41.846258 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:41.845858 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" event={"ID":"842d5c0a-b711-4e56-b40a-e8d59265ed12","Type":"ContainerDied","Data":"42ca00e8154b0fd80fd59773133cb722363d99770ba9107d1e609582f7bc57fc"} Apr 16 23:55:42.977408 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:42.977383 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" Apr 16 23:55:43.089371 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:43.089339 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/842d5c0a-b711-4e56-b40a-e8d59265ed12-util\") pod \"842d5c0a-b711-4e56-b40a-e8d59265ed12\" (UID: \"842d5c0a-b711-4e56-b40a-e8d59265ed12\") " Apr 16 23:55:43.089507 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:43.089381 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/842d5c0a-b711-4e56-b40a-e8d59265ed12-bundle\") pod \"842d5c0a-b711-4e56-b40a-e8d59265ed12\" (UID: \"842d5c0a-b711-4e56-b40a-e8d59265ed12\") " Apr 16 23:55:43.089507 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:43.089441 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cptfw\" (UniqueName: \"kubernetes.io/projected/842d5c0a-b711-4e56-b40a-e8d59265ed12-kube-api-access-cptfw\") pod \"842d5c0a-b711-4e56-b40a-e8d59265ed12\" (UID: \"842d5c0a-b711-4e56-b40a-e8d59265ed12\") " Apr 16 23:55:43.089811 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:43.089781 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/842d5c0a-b711-4e56-b40a-e8d59265ed12-bundle" (OuterVolumeSpecName: "bundle") pod "842d5c0a-b711-4e56-b40a-e8d59265ed12" (UID: "842d5c0a-b711-4e56-b40a-e8d59265ed12"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:55:43.091461 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:43.091431 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/842d5c0a-b711-4e56-b40a-e8d59265ed12-kube-api-access-cptfw" (OuterVolumeSpecName: "kube-api-access-cptfw") pod "842d5c0a-b711-4e56-b40a-e8d59265ed12" (UID: "842d5c0a-b711-4e56-b40a-e8d59265ed12"). InnerVolumeSpecName "kube-api-access-cptfw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:55:43.093691 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:43.093648 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/842d5c0a-b711-4e56-b40a-e8d59265ed12-util" (OuterVolumeSpecName: "util") pod "842d5c0a-b711-4e56-b40a-e8d59265ed12" (UID: "842d5c0a-b711-4e56-b40a-e8d59265ed12"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:55:43.190630 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:43.190575 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cptfw\" (UniqueName: \"kubernetes.io/projected/842d5c0a-b711-4e56-b40a-e8d59265ed12-kube-api-access-cptfw\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:55:43.190630 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:43.190596 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/842d5c0a-b711-4e56-b40a-e8d59265ed12-util\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:55:43.190630 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:43.190606 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/842d5c0a-b711-4e56-b40a-e8d59265ed12-bundle\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:55:43.853414 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:43.853382 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" event={"ID":"842d5c0a-b711-4e56-b40a-e8d59265ed12","Type":"ContainerDied","Data":"5c49bcfcd4f4dded6b88a3313d7a3e165752168ee7e4e94ad9dc03460da7fcc9"} Apr 16 23:55:43.853414 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:43.853416 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c49bcfcd4f4dded6b88a3313d7a3e165752168ee7e4e94ad9dc03460da7fcc9" Apr 16 23:55:43.853656 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:43.853423 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f99hwc" Apr 16 23:55:43.854849 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:43.854822 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-7kbh2" event={"ID":"2edbd4d4-81a2-4c07-93b4-c2b53d61995d","Type":"ContainerStarted","Data":"1b59a5edbdf1b420636b6d86fc574ec7023808e011ff7e7bbd4fe309e49c8abb"} Apr 16 23:55:43.854960 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:43.854881 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-7kbh2" Apr 16 23:55:43.875309 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:43.870645 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-7kbh2" podStartSLOduration=2.8115653849999998 podStartE2EDuration="5.87063009s" podCreationTimestamp="2026-04-16 23:55:38 +0000 UTC" firstStartedPulling="2026-04-16 23:55:39.790721461 +0000 UTC m=+320.565746325" lastFinishedPulling="2026-04-16 23:55:42.849786151 +0000 UTC m=+323.624811030" observedRunningTime="2026-04-16 23:55:43.870398727 +0000 UTC m=+324.645423615" watchObservedRunningTime="2026-04-16 23:55:43.87063009 +0000 UTC m=+324.645654977" Apr 16 
23:55:49.859782 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:49.859749 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-7kbh2"
Apr 16 23:55:55.796428 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.796393 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc"]
Apr 16 23:55:55.796831 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.796716 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="842d5c0a-b711-4e56-b40a-e8d59265ed12" containerName="extract"
Apr 16 23:55:55.796831 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.796729 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="842d5c0a-b711-4e56-b40a-e8d59265ed12" containerName="extract"
Apr 16 23:55:55.796831 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.796743 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="842d5c0a-b711-4e56-b40a-e8d59265ed12" containerName="pull"
Apr 16 23:55:55.796831 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.796749 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="842d5c0a-b711-4e56-b40a-e8d59265ed12" containerName="pull"
Apr 16 23:55:55.796831 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.796765 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="842d5c0a-b711-4e56-b40a-e8d59265ed12" containerName="util"
Apr 16 23:55:55.796831 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.796770 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="842d5c0a-b711-4e56-b40a-e8d59265ed12" containerName="util"
Apr 16 23:55:55.796831 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.796825 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="842d5c0a-b711-4e56-b40a-e8d59265ed12" containerName="extract"
Apr 16 23:55:55.801482 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.801466 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc"
Apr 16 23:55:55.805314 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.805286 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 23:55:55.805529 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.805480 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wp595\""
Apr 16 23:55:55.805761 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.805650 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 23:55:55.806380 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.806359 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc"]
Apr 16 23:55:55.880554 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.880515 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/795cd15a-38bd-487b-bc2f-fa63af3671f6-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc\" (UID: \"795cd15a-38bd-487b-bc2f-fa63af3671f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc"
Apr 16 23:55:55.880661 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.880583 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/795cd15a-38bd-487b-bc2f-fa63af3671f6-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc\" (UID: \"795cd15a-38bd-487b-bc2f-fa63af3671f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc"
Apr 16 23:55:55.880661 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.880617 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vvfj\" (UniqueName: \"kubernetes.io/projected/795cd15a-38bd-487b-bc2f-fa63af3671f6-kube-api-access-4vvfj\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc\" (UID: \"795cd15a-38bd-487b-bc2f-fa63af3671f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc"
Apr 16 23:55:55.982044 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.982017 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/795cd15a-38bd-487b-bc2f-fa63af3671f6-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc\" (UID: \"795cd15a-38bd-487b-bc2f-fa63af3671f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc"
Apr 16 23:55:55.982156 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.982071 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/795cd15a-38bd-487b-bc2f-fa63af3671f6-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc\" (UID: \"795cd15a-38bd-487b-bc2f-fa63af3671f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc"
Apr 16 23:55:55.982156 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.982112 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vvfj\" (UniqueName: \"kubernetes.io/projected/795cd15a-38bd-487b-bc2f-fa63af3671f6-kube-api-access-4vvfj\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc\" (UID: \"795cd15a-38bd-487b-bc2f-fa63af3671f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc"
Apr 16 23:55:55.982400 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.982381 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/795cd15a-38bd-487b-bc2f-fa63af3671f6-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc\" (UID: \"795cd15a-38bd-487b-bc2f-fa63af3671f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc"
Apr 16 23:55:55.982472 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.982450 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/795cd15a-38bd-487b-bc2f-fa63af3671f6-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc\" (UID: \"795cd15a-38bd-487b-bc2f-fa63af3671f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc"
Apr 16 23:55:55.989250 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:55.989221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vvfj\" (UniqueName: \"kubernetes.io/projected/795cd15a-38bd-487b-bc2f-fa63af3671f6-kube-api-access-4vvfj\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc\" (UID: \"795cd15a-38bd-487b-bc2f-fa63af3671f6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc"
Apr 16 23:55:56.111049 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:56.110988 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc"
Apr 16 23:55:56.228732 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:56.228680 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc"]
Apr 16 23:55:56.230795 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:55:56.230761 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod795cd15a_38bd_487b_bc2f_fa63af3671f6.slice/crio-2d7f77b03f76d842ac8f162bca7f066804592b453b10ec585c18b7426cf4dc25 WatchSource:0}: Error finding container 2d7f77b03f76d842ac8f162bca7f066804592b453b10ec585c18b7426cf4dc25: Status 404 returned error can't find the container with id 2d7f77b03f76d842ac8f162bca7f066804592b453b10ec585c18b7426cf4dc25
Apr 16 23:55:56.891972 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:56.891938 2578 generic.go:358] "Generic (PLEG): container finished" podID="795cd15a-38bd-487b-bc2f-fa63af3671f6" containerID="79476280fab5e3a1291a13c56eaeccc5ba71f59c07c6141ac42f19805230b845" exitCode=0
Apr 16 23:55:56.892398 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:56.892024 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc" event={"ID":"795cd15a-38bd-487b-bc2f-fa63af3671f6","Type":"ContainerDied","Data":"79476280fab5e3a1291a13c56eaeccc5ba71f59c07c6141ac42f19805230b845"}
Apr 16 23:55:56.892398 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:56.892062 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc" event={"ID":"795cd15a-38bd-487b-bc2f-fa63af3671f6","Type":"ContainerStarted","Data":"2d7f77b03f76d842ac8f162bca7f066804592b453b10ec585c18b7426cf4dc25"}
Apr 16 23:55:57.896289 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:57.896211 2578 generic.go:358] "Generic (PLEG): container finished" podID="795cd15a-38bd-487b-bc2f-fa63af3671f6" containerID="d956d88c5f41c4d57211a95aaff3a4914a17602f0301381376411e291fae29fb" exitCode=0
Apr 16 23:55:57.896682 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:57.896297 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc" event={"ID":"795cd15a-38bd-487b-bc2f-fa63af3671f6","Type":"ContainerDied","Data":"d956d88c5f41c4d57211a95aaff3a4914a17602f0301381376411e291fae29fb"}
Apr 16 23:55:58.901441 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:58.901411 2578 generic.go:358] "Generic (PLEG): container finished" podID="795cd15a-38bd-487b-bc2f-fa63af3671f6" containerID="8f23b1b832c55e9f68cdcb26fcaee3db69307d395c6dda6a5acd18af09f17d3f" exitCode=0
Apr 16 23:55:58.901810 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:55:58.901481 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc" event={"ID":"795cd15a-38bd-487b-bc2f-fa63af3671f6","Type":"ContainerDied","Data":"8f23b1b832c55e9f68cdcb26fcaee3db69307d395c6dda6a5acd18af09f17d3f"}
Apr 16 23:56:00.029862 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:00.029831 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc"
Apr 16 23:56:00.117114 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:00.117087 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/795cd15a-38bd-487b-bc2f-fa63af3671f6-bundle\") pod \"795cd15a-38bd-487b-bc2f-fa63af3671f6\" (UID: \"795cd15a-38bd-487b-bc2f-fa63af3671f6\") "
Apr 16 23:56:00.117293 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:00.117146 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/795cd15a-38bd-487b-bc2f-fa63af3671f6-util\") pod \"795cd15a-38bd-487b-bc2f-fa63af3671f6\" (UID: \"795cd15a-38bd-487b-bc2f-fa63af3671f6\") "
Apr 16 23:56:00.117293 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:00.117190 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vvfj\" (UniqueName: \"kubernetes.io/projected/795cd15a-38bd-487b-bc2f-fa63af3671f6-kube-api-access-4vvfj\") pod \"795cd15a-38bd-487b-bc2f-fa63af3671f6\" (UID: \"795cd15a-38bd-487b-bc2f-fa63af3671f6\") "
Apr 16 23:56:00.117841 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:00.117817 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/795cd15a-38bd-487b-bc2f-fa63af3671f6-bundle" (OuterVolumeSpecName: "bundle") pod "795cd15a-38bd-487b-bc2f-fa63af3671f6" (UID: "795cd15a-38bd-487b-bc2f-fa63af3671f6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 23:56:00.119199 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:00.119175 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/795cd15a-38bd-487b-bc2f-fa63af3671f6-kube-api-access-4vvfj" (OuterVolumeSpecName: "kube-api-access-4vvfj") pod "795cd15a-38bd-487b-bc2f-fa63af3671f6" (UID: "795cd15a-38bd-487b-bc2f-fa63af3671f6"). InnerVolumeSpecName "kube-api-access-4vvfj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:56:00.122308 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:00.122266 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/795cd15a-38bd-487b-bc2f-fa63af3671f6-util" (OuterVolumeSpecName: "util") pod "795cd15a-38bd-487b-bc2f-fa63af3671f6" (UID: "795cd15a-38bd-487b-bc2f-fa63af3671f6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 23:56:00.218217 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:00.218141 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/795cd15a-38bd-487b-bc2f-fa63af3671f6-bundle\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:56:00.218217 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:00.218169 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/795cd15a-38bd-487b-bc2f-fa63af3671f6-util\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:56:00.218217 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:00.218180 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4vvfj\" (UniqueName: \"kubernetes.io/projected/795cd15a-38bd-487b-bc2f-fa63af3671f6-kube-api-access-4vvfj\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:56:00.908377 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:00.908341 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc" event={"ID":"795cd15a-38bd-487b-bc2f-fa63af3671f6","Type":"ContainerDied","Data":"2d7f77b03f76d842ac8f162bca7f066804592b453b10ec585c18b7426cf4dc25"}
Apr 16 23:56:00.908377 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:00.908381 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d7f77b03f76d842ac8f162bca7f066804592b453b10ec585c18b7426cf4dc25"
Apr 16 23:56:00.908597 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:00.908357 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596kcc"
Apr 16 23:56:12.496328 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.496295 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg"]
Apr 16 23:56:12.496735 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.496605 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="795cd15a-38bd-487b-bc2f-fa63af3671f6" containerName="pull"
Apr 16 23:56:12.496735 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.496616 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="795cd15a-38bd-487b-bc2f-fa63af3671f6" containerName="pull"
Apr 16 23:56:12.496735 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.496626 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="795cd15a-38bd-487b-bc2f-fa63af3671f6" containerName="util"
Apr 16 23:56:12.496735 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.496632 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="795cd15a-38bd-487b-bc2f-fa63af3671f6" containerName="util"
Apr 16 23:56:12.496735 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.496641 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="795cd15a-38bd-487b-bc2f-fa63af3671f6" containerName="extract"
Apr 16 23:56:12.496735 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.496648 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="795cd15a-38bd-487b-bc2f-fa63af3671f6" containerName="extract"
Apr 16 23:56:12.496735 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.496696 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="795cd15a-38bd-487b-bc2f-fa63af3671f6" containerName="extract"
Apr 16 23:56:12.503717 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.503698 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg"
Apr 16 23:56:12.507585 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.507561 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 16 23:56:12.507790 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.507774 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 16 23:56:12.507861 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.507847 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 16 23:56:12.507975 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.507959 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-nnn2n\""
Apr 16 23:56:12.508035 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.508005 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 16 23:56:12.515157 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.515138 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg"]
Apr 16 23:56:12.608231 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.608205 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e9fe26de-678e-4de8-902d-2994a78acd38-apiservice-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-s6vgg\" (UID: \"e9fe26de-678e-4de8-902d-2994a78acd38\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg"
Apr 16 23:56:12.608325 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.608258 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7bdd\" (UniqueName: \"kubernetes.io/projected/e9fe26de-678e-4de8-902d-2994a78acd38-kube-api-access-j7bdd\") pod \"opendatahub-operator-controller-manager-bf54d8685-s6vgg\" (UID: \"e9fe26de-678e-4de8-902d-2994a78acd38\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg"
Apr 16 23:56:12.608325 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.608317 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e9fe26de-678e-4de8-902d-2994a78acd38-webhook-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-s6vgg\" (UID: \"e9fe26de-678e-4de8-902d-2994a78acd38\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg"
Apr 16 23:56:12.709226 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.709202 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7bdd\" (UniqueName: \"kubernetes.io/projected/e9fe26de-678e-4de8-902d-2994a78acd38-kube-api-access-j7bdd\") pod \"opendatahub-operator-controller-manager-bf54d8685-s6vgg\" (UID: \"e9fe26de-678e-4de8-902d-2994a78acd38\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg"
Apr 16 23:56:12.709320 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.709244 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e9fe26de-678e-4de8-902d-2994a78acd38-webhook-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-s6vgg\" (UID: \"e9fe26de-678e-4de8-902d-2994a78acd38\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg"
Apr 16 23:56:12.709385 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.709366 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e9fe26de-678e-4de8-902d-2994a78acd38-apiservice-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-s6vgg\" (UID: \"e9fe26de-678e-4de8-902d-2994a78acd38\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg"
Apr 16 23:56:12.711733 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.711708 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e9fe26de-678e-4de8-902d-2994a78acd38-apiservice-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-s6vgg\" (UID: \"e9fe26de-678e-4de8-902d-2994a78acd38\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg"
Apr 16 23:56:12.711816 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.711736 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e9fe26de-678e-4de8-902d-2994a78acd38-webhook-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-s6vgg\" (UID: \"e9fe26de-678e-4de8-902d-2994a78acd38\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg"
Apr 16 23:56:12.717341 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.717323 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7bdd\" (UniqueName: \"kubernetes.io/projected/e9fe26de-678e-4de8-902d-2994a78acd38-kube-api-access-j7bdd\") pod \"opendatahub-operator-controller-manager-bf54d8685-s6vgg\" (UID: \"e9fe26de-678e-4de8-902d-2994a78acd38\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg"
Apr 16 23:56:12.814190 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.814164 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg"
Apr 16 23:56:12.830807 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.830783 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt"]
Apr 16 23:56:12.836222 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.836199 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt"
Apr 16 23:56:12.838774 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.838752 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 23:56:12.838942 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.838922 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 23:56:12.839032 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.838928 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wp595\""
Apr 16 23:56:12.844616 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.844597 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt"]
Apr 16 23:56:12.911123 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.911090 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3241deaa-e6ad-45c7-88f7-b8aea8c06563-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt\" (UID: \"3241deaa-e6ad-45c7-88f7-b8aea8c06563\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt"
Apr 16 23:56:12.911255 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.911142 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3241deaa-e6ad-45c7-88f7-b8aea8c06563-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt\" (UID: \"3241deaa-e6ad-45c7-88f7-b8aea8c06563\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt"
Apr 16 23:56:12.911255 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.911163 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz5n8\" (UniqueName: \"kubernetes.io/projected/3241deaa-e6ad-45c7-88f7-b8aea8c06563-kube-api-access-gz5n8\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt\" (UID: \"3241deaa-e6ad-45c7-88f7-b8aea8c06563\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt"
Apr 16 23:56:12.941323 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:12.941298 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg"]
Apr 16 23:56:13.012277 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:13.012244 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3241deaa-e6ad-45c7-88f7-b8aea8c06563-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt\" (UID: \"3241deaa-e6ad-45c7-88f7-b8aea8c06563\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt"
Apr 16 23:56:13.012386 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:13.012282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3241deaa-e6ad-45c7-88f7-b8aea8c06563-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt\" (UID: \"3241deaa-e6ad-45c7-88f7-b8aea8c06563\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt"
Apr 16 23:56:13.012386 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:13.012301 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gz5n8\" (UniqueName: \"kubernetes.io/projected/3241deaa-e6ad-45c7-88f7-b8aea8c06563-kube-api-access-gz5n8\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt\" (UID: \"3241deaa-e6ad-45c7-88f7-b8aea8c06563\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt"
Apr 16 23:56:13.012707 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:13.012683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3241deaa-e6ad-45c7-88f7-b8aea8c06563-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt\" (UID: \"3241deaa-e6ad-45c7-88f7-b8aea8c06563\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt"
Apr 16 23:56:13.012749 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:13.012694 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3241deaa-e6ad-45c7-88f7-b8aea8c06563-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt\" (UID: \"3241deaa-e6ad-45c7-88f7-b8aea8c06563\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt"
Apr 16 23:56:13.019455 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:13.019433 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz5n8\" (UniqueName: \"kubernetes.io/projected/3241deaa-e6ad-45c7-88f7-b8aea8c06563-kube-api-access-gz5n8\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt\" (UID: \"3241deaa-e6ad-45c7-88f7-b8aea8c06563\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt"
Apr 16 23:56:13.147256 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:13.147199 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt"
Apr 16 23:56:13.265797 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:13.265768 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt"]
Apr 16 23:56:13.267883 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:56:13.267851 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3241deaa_e6ad_45c7_88f7_b8aea8c06563.slice/crio-56917ff5d59dbaba210923013ad94ce64a98f3ce6912382145b1da17881113a5 WatchSource:0}: Error finding container 56917ff5d59dbaba210923013ad94ce64a98f3ce6912382145b1da17881113a5: Status 404 returned error can't find the container with id 56917ff5d59dbaba210923013ad94ce64a98f3ce6912382145b1da17881113a5
Apr 16 23:56:13.959636 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:13.959598 2578 generic.go:358] "Generic (PLEG): container finished" podID="3241deaa-e6ad-45c7-88f7-b8aea8c06563" containerID="eeb63c90767f0bfd3f6d78c8ed1cfb69c829937a9c32cdb3bb2ddf29163ec2b3" exitCode=0
Apr 16 23:56:13.960098 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:13.959694 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt" event={"ID":"3241deaa-e6ad-45c7-88f7-b8aea8c06563","Type":"ContainerDied","Data":"eeb63c90767f0bfd3f6d78c8ed1cfb69c829937a9c32cdb3bb2ddf29163ec2b3"}
Apr 16 23:56:13.960098 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:13.959724 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt" event={"ID":"3241deaa-e6ad-45c7-88f7-b8aea8c06563","Type":"ContainerStarted","Data":"56917ff5d59dbaba210923013ad94ce64a98f3ce6912382145b1da17881113a5"}
Apr 16 23:56:13.961583 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:13.961551 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg" event={"ID":"e9fe26de-678e-4de8-902d-2994a78acd38","Type":"ContainerStarted","Data":"3846bda27d1a61fb87abb49c528a6d37d34d52ae298185e55602f2ef13a3f5d2"}
Apr 16 23:56:15.971031 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:15.970955 2578 generic.go:358] "Generic (PLEG): container finished" podID="3241deaa-e6ad-45c7-88f7-b8aea8c06563" containerID="46bdff88603b6292e317ad33a5f51ca45aee87fd6c5e69bd00b87fd91bb27e68" exitCode=0
Apr 16 23:56:15.971031 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:15.971027 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt" event={"ID":"3241deaa-e6ad-45c7-88f7-b8aea8c06563","Type":"ContainerDied","Data":"46bdff88603b6292e317ad33a5f51ca45aee87fd6c5e69bd00b87fd91bb27e68"}
Apr 16 23:56:15.972796 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:15.972713 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg" event={"ID":"e9fe26de-678e-4de8-902d-2994a78acd38","Type":"ContainerStarted","Data":"7d7ba283d73206a6ccc48a5e65ee1f15af11e82bd12b0245430931cff1fec418"}
Apr 16 23:56:15.972881 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:15.972824 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg"
Apr 16 23:56:16.010651 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:16.010606 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg" podStartSLOduration=1.637288001 podStartE2EDuration="4.010593314s" podCreationTimestamp="2026-04-16 23:56:12 +0000 UTC" firstStartedPulling="2026-04-16 23:56:12.953894326 +0000 UTC m=+353.728919202" lastFinishedPulling="2026-04-16 23:56:15.327199647 +0000 UTC m=+356.102224515" observedRunningTime="2026-04-16 23:56:16.008757023 +0000 UTC m=+356.783781921" watchObservedRunningTime="2026-04-16 23:56:16.010593314 +0000 UTC m=+356.785618200"
Apr 16 23:56:16.977841 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:16.977809 2578 generic.go:358] "Generic (PLEG): container finished" podID="3241deaa-e6ad-45c7-88f7-b8aea8c06563" containerID="03bbe27e291c97de70343d3b90534f701fce94bd9c7d2ba56cd06185abf2741f" exitCode=0
Apr 16 23:56:16.978183 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:16.977895 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt" event={"ID":"3241deaa-e6ad-45c7-88f7-b8aea8c06563","Type":"ContainerDied","Data":"03bbe27e291c97de70343d3b90534f701fce94bd9c7d2ba56cd06185abf2741f"}
Apr 16 23:56:18.100820 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:18.100796 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt"
Apr 16 23:56:18.158985 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:18.158959 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3241deaa-e6ad-45c7-88f7-b8aea8c06563-bundle\") pod \"3241deaa-e6ad-45c7-88f7-b8aea8c06563\" (UID: \"3241deaa-e6ad-45c7-88f7-b8aea8c06563\") "
Apr 16 23:56:18.159122 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:18.159010 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz5n8\" (UniqueName: \"kubernetes.io/projected/3241deaa-e6ad-45c7-88f7-b8aea8c06563-kube-api-access-gz5n8\") pod \"3241deaa-e6ad-45c7-88f7-b8aea8c06563\" (UID: \"3241deaa-e6ad-45c7-88f7-b8aea8c06563\") "
Apr 16 23:56:18.159122 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:18.159061 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3241deaa-e6ad-45c7-88f7-b8aea8c06563-util\") pod \"3241deaa-e6ad-45c7-88f7-b8aea8c06563\" (UID: \"3241deaa-e6ad-45c7-88f7-b8aea8c06563\") "
Apr 16 23:56:18.159731 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:18.159707 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3241deaa-e6ad-45c7-88f7-b8aea8c06563-bundle" (OuterVolumeSpecName: "bundle") pod "3241deaa-e6ad-45c7-88f7-b8aea8c06563" (UID: "3241deaa-e6ad-45c7-88f7-b8aea8c06563"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 23:56:18.161043 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:18.161025 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3241deaa-e6ad-45c7-88f7-b8aea8c06563-kube-api-access-gz5n8" (OuterVolumeSpecName: "kube-api-access-gz5n8") pod "3241deaa-e6ad-45c7-88f7-b8aea8c06563" (UID: "3241deaa-e6ad-45c7-88f7-b8aea8c06563"). InnerVolumeSpecName "kube-api-access-gz5n8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:56:18.164452 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:18.164416 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3241deaa-e6ad-45c7-88f7-b8aea8c06563-util" (OuterVolumeSpecName: "util") pod "3241deaa-e6ad-45c7-88f7-b8aea8c06563" (UID: "3241deaa-e6ad-45c7-88f7-b8aea8c06563"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 23:56:18.259793 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:18.259772 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3241deaa-e6ad-45c7-88f7-b8aea8c06563-util\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:56:18.259793 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:18.259794 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3241deaa-e6ad-45c7-88f7-b8aea8c06563-bundle\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:56:18.259906 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:18.259804 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gz5n8\" (UniqueName: \"kubernetes.io/projected/3241deaa-e6ad-45c7-88f7-b8aea8c06563-kube-api-access-gz5n8\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:56:18.985437 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:18.985409 2578 util.go:48] "No ready sandbox
for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt" Apr 16 23:56:18.985437 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:18.985419 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9hh5gt" event={"ID":"3241deaa-e6ad-45c7-88f7-b8aea8c06563","Type":"ContainerDied","Data":"56917ff5d59dbaba210923013ad94ce64a98f3ce6912382145b1da17881113a5"} Apr 16 23:56:18.985672 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:18.985459 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56917ff5d59dbaba210923013ad94ce64a98f3ce6912382145b1da17881113a5" Apr 16 23:56:26.980613 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:26.980583 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-s6vgg" Apr 16 23:56:29.155681 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.155621 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc"] Apr 16 23:56:29.156040 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.155978 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3241deaa-e6ad-45c7-88f7-b8aea8c06563" containerName="util" Apr 16 23:56:29.156040 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.155988 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3241deaa-e6ad-45c7-88f7-b8aea8c06563" containerName="util" Apr 16 23:56:29.156040 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.156013 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3241deaa-e6ad-45c7-88f7-b8aea8c06563" containerName="pull" Apr 16 23:56:29.156040 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.156018 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3241deaa-e6ad-45c7-88f7-b8aea8c06563" containerName="pull" Apr 16 23:56:29.156040 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.156026 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3241deaa-e6ad-45c7-88f7-b8aea8c06563" containerName="extract" Apr 16 23:56:29.156040 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.156031 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3241deaa-e6ad-45c7-88f7-b8aea8c06563" containerName="extract" Apr 16 23:56:29.156224 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.156076 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3241deaa-e6ad-45c7-88f7-b8aea8c06563" containerName="extract" Apr 16 23:56:29.160327 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.160303 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" Apr 16 23:56:29.163387 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.163365 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 23:56:29.163499 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.163365 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 23:56:29.164675 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.164655 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-wl95d\"" Apr 16 23:56:29.164832 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.164662 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 23:56:29.164982 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.164713 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 23:56:29.165091 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.164827 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 23:56:29.172700 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.172668 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc"] Apr 16 23:56:29.238884 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.238856 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27662c6d-0432-45b9-82a9-1c1c2dbd3fa5-cert\") pod \"lws-controller-manager-6b6988ccb7-wtdpc\" (UID: \"27662c6d-0432-45b9-82a9-1c1c2dbd3fa5\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" Apr 16 23:56:29.239037 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.238909 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4cgj\" (UniqueName: \"kubernetes.io/projected/27662c6d-0432-45b9-82a9-1c1c2dbd3fa5-kube-api-access-p4cgj\") pod \"lws-controller-manager-6b6988ccb7-wtdpc\" (UID: \"27662c6d-0432-45b9-82a9-1c1c2dbd3fa5\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" Apr 16 23:56:29.239037 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.238929 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/27662c6d-0432-45b9-82a9-1c1c2dbd3fa5-manager-config\") pod \"lws-controller-manager-6b6988ccb7-wtdpc\" (UID: \"27662c6d-0432-45b9-82a9-1c1c2dbd3fa5\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" Apr 16 23:56:29.239037 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.238953 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/27662c6d-0432-45b9-82a9-1c1c2dbd3fa5-metrics-cert\") pod \"lws-controller-manager-6b6988ccb7-wtdpc\" (UID: \"27662c6d-0432-45b9-82a9-1c1c2dbd3fa5\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" Apr 16 23:56:29.339980 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.339941 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4cgj\" (UniqueName: \"kubernetes.io/projected/27662c6d-0432-45b9-82a9-1c1c2dbd3fa5-kube-api-access-p4cgj\") pod \"lws-controller-manager-6b6988ccb7-wtdpc\" (UID: \"27662c6d-0432-45b9-82a9-1c1c2dbd3fa5\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" Apr 16 23:56:29.339980 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.339989 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/27662c6d-0432-45b9-82a9-1c1c2dbd3fa5-manager-config\") pod \"lws-controller-manager-6b6988ccb7-wtdpc\" (UID: \"27662c6d-0432-45b9-82a9-1c1c2dbd3fa5\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" Apr 16 23:56:29.340218 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.340033 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/27662c6d-0432-45b9-82a9-1c1c2dbd3fa5-metrics-cert\") pod \"lws-controller-manager-6b6988ccb7-wtdpc\" (UID: \"27662c6d-0432-45b9-82a9-1c1c2dbd3fa5\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" Apr 16 23:56:29.340218 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.340116 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27662c6d-0432-45b9-82a9-1c1c2dbd3fa5-cert\") pod \"lws-controller-manager-6b6988ccb7-wtdpc\" (UID: 
\"27662c6d-0432-45b9-82a9-1c1c2dbd3fa5\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" Apr 16 23:56:29.340792 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.340762 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/27662c6d-0432-45b9-82a9-1c1c2dbd3fa5-manager-config\") pod \"lws-controller-manager-6b6988ccb7-wtdpc\" (UID: \"27662c6d-0432-45b9-82a9-1c1c2dbd3fa5\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" Apr 16 23:56:29.342720 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.342688 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27662c6d-0432-45b9-82a9-1c1c2dbd3fa5-cert\") pod \"lws-controller-manager-6b6988ccb7-wtdpc\" (UID: \"27662c6d-0432-45b9-82a9-1c1c2dbd3fa5\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" Apr 16 23:56:29.342812 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.342725 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/27662c6d-0432-45b9-82a9-1c1c2dbd3fa5-metrics-cert\") pod \"lws-controller-manager-6b6988ccb7-wtdpc\" (UID: \"27662c6d-0432-45b9-82a9-1c1c2dbd3fa5\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" Apr 16 23:56:29.358080 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.358058 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4cgj\" (UniqueName: \"kubernetes.io/projected/27662c6d-0432-45b9-82a9-1c1c2dbd3fa5-kube-api-access-p4cgj\") pod \"lws-controller-manager-6b6988ccb7-wtdpc\" (UID: \"27662c6d-0432-45b9-82a9-1c1c2dbd3fa5\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" Apr 16 23:56:29.472284 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.472197 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" Apr 16 23:56:29.594199 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:29.594176 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc"] Apr 16 23:56:29.596303 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:56:29.596271 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27662c6d_0432_45b9_82a9_1c1c2dbd3fa5.slice/crio-247af355b238cd9b1fd85320022cd5736f56e67b136d8608c9c09a3fd5d30b5d WatchSource:0}: Error finding container 247af355b238cd9b1fd85320022cd5736f56e67b136d8608c9c09a3fd5d30b5d: Status 404 returned error can't find the container with id 247af355b238cd9b1fd85320022cd5736f56e67b136d8608c9c09a3fd5d30b5d Apr 16 23:56:30.026710 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:30.026673 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" event={"ID":"27662c6d-0432-45b9-82a9-1c1c2dbd3fa5","Type":"ContainerStarted","Data":"247af355b238cd9b1fd85320022cd5736f56e67b136d8608c9c09a3fd5d30b5d"} Apr 16 23:56:32.035716 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:32.035671 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" event={"ID":"27662c6d-0432-45b9-82a9-1c1c2dbd3fa5","Type":"ContainerStarted","Data":"8f625d9060374dab862dc850b11ae0a0f1ed5dd7ed421b9635bb4a5b0edb411a"} Apr 16 23:56:32.036141 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:32.035728 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" Apr 16 23:56:32.050779 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:32.050731 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" podStartSLOduration=1.6209599460000002 podStartE2EDuration="3.050717503s" podCreationTimestamp="2026-04-16 23:56:29 +0000 UTC" firstStartedPulling="2026-04-16 23:56:29.598015397 +0000 UTC m=+370.373040262" lastFinishedPulling="2026-04-16 23:56:31.02777294 +0000 UTC m=+371.802797819" observedRunningTime="2026-04-16 23:56:32.049699107 +0000 UTC m=+372.824723993" watchObservedRunningTime="2026-04-16 23:56:32.050717503 +0000 UTC m=+372.825742388" Apr 16 23:56:41.050027 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:41.049993 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr"] Apr 16 23:56:41.053672 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:41.053654 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" Apr 16 23:56:41.056592 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:41.056565 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wp595\"" Apr 16 23:56:41.056782 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:41.056762 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 23:56:41.057663 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:41.057640 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 23:56:41.065173 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:41.065151 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr"] Apr 16 23:56:41.137875 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:41.137850 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr\" (UID: \"b49e1042-fbb4-4c9e-893d-4ccdb76423d4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" Apr 16 23:56:41.138002 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:41.137908 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzk9v\" (UniqueName: \"kubernetes.io/projected/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-kube-api-access-pzk9v\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr\" (UID: \"b49e1042-fbb4-4c9e-893d-4ccdb76423d4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" Apr 16 23:56:41.138047 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:41.137996 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr\" (UID: \"b49e1042-fbb4-4c9e-893d-4ccdb76423d4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" Apr 16 23:56:41.238352 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:41.238322 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr\" (UID: \"b49e1042-fbb4-4c9e-893d-4ccdb76423d4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" Apr 16 23:56:41.238476 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:41.238371 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr\" (UID: \"b49e1042-fbb4-4c9e-893d-4ccdb76423d4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" Apr 16 23:56:41.238476 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:41.238425 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzk9v\" (UniqueName: \"kubernetes.io/projected/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-kube-api-access-pzk9v\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr\" (UID: \"b49e1042-fbb4-4c9e-893d-4ccdb76423d4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" Apr 16 23:56:41.238730 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:41.238711 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr\" (UID: \"b49e1042-fbb4-4c9e-893d-4ccdb76423d4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" Apr 16 23:56:41.238772 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:41.238738 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr\" (UID: \"b49e1042-fbb4-4c9e-893d-4ccdb76423d4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" Apr 16 23:56:41.246155 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:41.246123 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzk9v\" (UniqueName: 
\"kubernetes.io/projected/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-kube-api-access-pzk9v\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr\" (UID: \"b49e1042-fbb4-4c9e-893d-4ccdb76423d4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" Apr 16 23:56:41.363386 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:41.363330 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" Apr 16 23:56:41.690206 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:41.690124 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr"] Apr 16 23:56:41.694658 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:56:41.694631 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb49e1042_fbb4_4c9e_893d_4ccdb76423d4.slice/crio-782c18c0f0deaa9fd2ebb75c92079118cd4a002b5447a107032768b450728bfb WatchSource:0}: Error finding container 782c18c0f0deaa9fd2ebb75c92079118cd4a002b5447a107032768b450728bfb: Status 404 returned error can't find the container with id 782c18c0f0deaa9fd2ebb75c92079118cd4a002b5447a107032768b450728bfb Apr 16 23:56:42.070271 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:42.070234 2578 generic.go:358] "Generic (PLEG): container finished" podID="b49e1042-fbb4-4c9e-893d-4ccdb76423d4" containerID="d4ae31da68e9f8acd62f767e64145bcd49546cd5d5997be7d4c5049e4a81d36d" exitCode=0 Apr 16 23:56:42.070710 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:42.070328 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" event={"ID":"b49e1042-fbb4-4c9e-893d-4ccdb76423d4","Type":"ContainerDied","Data":"d4ae31da68e9f8acd62f767e64145bcd49546cd5d5997be7d4c5049e4a81d36d"} Apr 16 
23:56:42.070710 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:42.070361 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" event={"ID":"b49e1042-fbb4-4c9e-893d-4ccdb76423d4","Type":"ContainerStarted","Data":"782c18c0f0deaa9fd2ebb75c92079118cd4a002b5447a107032768b450728bfb"} Apr 16 23:56:43.041480 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:43.041453 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-wtdpc" Apr 16 23:56:43.076002 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:43.075970 2578 generic.go:358] "Generic (PLEG): container finished" podID="b49e1042-fbb4-4c9e-893d-4ccdb76423d4" containerID="f4037a2db46ce43c93c2f63743b3684097295628cd08a80a668d046e79a9a9de" exitCode=0 Apr 16 23:56:43.076366 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:43.076019 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" event={"ID":"b49e1042-fbb4-4c9e-893d-4ccdb76423d4","Type":"ContainerDied","Data":"f4037a2db46ce43c93c2f63743b3684097295628cd08a80a668d046e79a9a9de"} Apr 16 23:56:44.080788 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:44.080746 2578 generic.go:358] "Generic (PLEG): container finished" podID="b49e1042-fbb4-4c9e-893d-4ccdb76423d4" containerID="bcc30e588ceb9e285484d54d95b829e1bcf53dfe7ddd8d6d2e22c9f43297428f" exitCode=0 Apr 16 23:56:44.081225 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:44.080823 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" event={"ID":"b49e1042-fbb4-4c9e-893d-4ccdb76423d4","Type":"ContainerDied","Data":"bcc30e588ceb9e285484d54d95b829e1bcf53dfe7ddd8d6d2e22c9f43297428f"} Apr 16 23:56:45.207804 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:45.207780 2578 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" Apr 16 23:56:45.371440 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:45.371353 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzk9v\" (UniqueName: \"kubernetes.io/projected/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-kube-api-access-pzk9v\") pod \"b49e1042-fbb4-4c9e-893d-4ccdb76423d4\" (UID: \"b49e1042-fbb4-4c9e-893d-4ccdb76423d4\") " Apr 16 23:56:45.371440 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:45.371394 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-util\") pod \"b49e1042-fbb4-4c9e-893d-4ccdb76423d4\" (UID: \"b49e1042-fbb4-4c9e-893d-4ccdb76423d4\") " Apr 16 23:56:45.371440 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:45.371424 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-bundle\") pod \"b49e1042-fbb4-4c9e-893d-4ccdb76423d4\" (UID: \"b49e1042-fbb4-4c9e-893d-4ccdb76423d4\") " Apr 16 23:56:45.372272 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:45.372238 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-bundle" (OuterVolumeSpecName: "bundle") pod "b49e1042-fbb4-4c9e-893d-4ccdb76423d4" (UID: "b49e1042-fbb4-4c9e-893d-4ccdb76423d4"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:56:45.373312 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:45.373290 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-kube-api-access-pzk9v" (OuterVolumeSpecName: "kube-api-access-pzk9v") pod "b49e1042-fbb4-4c9e-893d-4ccdb76423d4" (UID: "b49e1042-fbb4-4c9e-893d-4ccdb76423d4"). InnerVolumeSpecName "kube-api-access-pzk9v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:56:45.378926 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:45.378902 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-util" (OuterVolumeSpecName: "util") pod "b49e1042-fbb4-4c9e-893d-4ccdb76423d4" (UID: "b49e1042-fbb4-4c9e-893d-4ccdb76423d4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:56:45.471966 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:45.471940 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-bundle\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:56:45.471966 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:45.471963 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pzk9v\" (UniqueName: \"kubernetes.io/projected/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-kube-api-access-pzk9v\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:56:45.472106 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:45.471974 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b49e1042-fbb4-4c9e-893d-4ccdb76423d4-util\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:56:46.089243 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:46.089162 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" event={"ID":"b49e1042-fbb4-4c9e-893d-4ccdb76423d4","Type":"ContainerDied","Data":"782c18c0f0deaa9fd2ebb75c92079118cd4a002b5447a107032768b450728bfb"} Apr 16 23:56:46.089243 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:46.089198 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="782c18c0f0deaa9fd2ebb75c92079118cd4a002b5447a107032768b450728bfb" Apr 16 23:56:46.089243 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:46.089218 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835rkjmr" Apr 16 23:56:54.674855 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.674818 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8"] Apr 16 23:56:54.675313 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.675290 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b49e1042-fbb4-4c9e-893d-4ccdb76423d4" containerName="pull" Apr 16 23:56:54.675313 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.675309 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49e1042-fbb4-4c9e-893d-4ccdb76423d4" containerName="pull" Apr 16 23:56:54.675443 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.675331 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b49e1042-fbb4-4c9e-893d-4ccdb76423d4" containerName="util" Apr 16 23:56:54.675443 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.675341 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49e1042-fbb4-4c9e-893d-4ccdb76423d4" containerName="util" Apr 16 23:56:54.675443 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.675354 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b49e1042-fbb4-4c9e-893d-4ccdb76423d4" containerName="extract" Apr 16 23:56:54.675443 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.675363 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49e1042-fbb4-4c9e-893d-4ccdb76423d4" containerName="extract" Apr 16 23:56:54.675668 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.675460 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b49e1042-fbb4-4c9e-893d-4ccdb76423d4" containerName="extract" Apr 16 23:56:54.685164 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.685129 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" Apr 16 23:56:54.687990 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.687966 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 23:56:54.688098 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.688029 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wp595\"" Apr 16 23:56:54.688145 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.688103 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8"] Apr 16 23:56:54.689002 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.688972 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 23:56:54.842178 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.842147 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c2a5825-7538-43e4-a21f-42321d1e34b0-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8\" (UID: 
\"2c2a5825-7538-43e4-a21f-42321d1e34b0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" Apr 16 23:56:54.842178 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.842177 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c2a5825-7538-43e4-a21f-42321d1e34b0-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8\" (UID: \"2c2a5825-7538-43e4-a21f-42321d1e34b0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" Apr 16 23:56:54.842410 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.842252 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvk5z\" (UniqueName: \"kubernetes.io/projected/2c2a5825-7538-43e4-a21f-42321d1e34b0-kube-api-access-gvk5z\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8\" (UID: \"2c2a5825-7538-43e4-a21f-42321d1e34b0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" Apr 16 23:56:54.942933 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.942847 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvk5z\" (UniqueName: \"kubernetes.io/projected/2c2a5825-7538-43e4-a21f-42321d1e34b0-kube-api-access-gvk5z\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8\" (UID: \"2c2a5825-7538-43e4-a21f-42321d1e34b0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" Apr 16 23:56:54.942933 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.942925 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c2a5825-7538-43e4-a21f-42321d1e34b0-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8\" (UID: 
\"2c2a5825-7538-43e4-a21f-42321d1e34b0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" Apr 16 23:56:54.943145 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.942958 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c2a5825-7538-43e4-a21f-42321d1e34b0-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8\" (UID: \"2c2a5825-7538-43e4-a21f-42321d1e34b0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" Apr 16 23:56:54.943309 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.943289 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c2a5825-7538-43e4-a21f-42321d1e34b0-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8\" (UID: \"2c2a5825-7538-43e4-a21f-42321d1e34b0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" Apr 16 23:56:54.943405 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.943381 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c2a5825-7538-43e4-a21f-42321d1e34b0-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8\" (UID: \"2c2a5825-7538-43e4-a21f-42321d1e34b0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" Apr 16 23:56:54.954915 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.954894 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvk5z\" (UniqueName: \"kubernetes.io/projected/2c2a5825-7538-43e4-a21f-42321d1e34b0-kube-api-access-gvk5z\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8\" (UID: \"2c2a5825-7538-43e4-a21f-42321d1e34b0\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" Apr 16 23:56:54.995499 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:54.995472 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" Apr 16 23:56:55.133571 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:55.133495 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8"] Apr 16 23:56:55.136128 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:56:55.136100 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c2a5825_7538_43e4_a21f_42321d1e34b0.slice/crio-1c89f76fe27c3b22177af190ee2c9a50031550303136b1997555ada47078332b WatchSource:0}: Error finding container 1c89f76fe27c3b22177af190ee2c9a50031550303136b1997555ada47078332b: Status 404 returned error can't find the container with id 1c89f76fe27c3b22177af190ee2c9a50031550303136b1997555ada47078332b Apr 16 23:56:56.126606 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:56.126576 2578 generic.go:358] "Generic (PLEG): container finished" podID="2c2a5825-7538-43e4-a21f-42321d1e34b0" containerID="06746b3b2e12230d27300657506eddd5760868db62057c2055182f63617932fb" exitCode=0 Apr 16 23:56:56.127029 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:56.126664 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" event={"ID":"2c2a5825-7538-43e4-a21f-42321d1e34b0","Type":"ContainerDied","Data":"06746b3b2e12230d27300657506eddd5760868db62057c2055182f63617932fb"} Apr 16 23:56:56.127029 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:56.126708 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" event={"ID":"2c2a5825-7538-43e4-a21f-42321d1e34b0","Type":"ContainerStarted","Data":"1c89f76fe27c3b22177af190ee2c9a50031550303136b1997555ada47078332b"} Apr 16 23:56:57.132616 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:57.132510 2578 generic.go:358] "Generic (PLEG): container finished" podID="2c2a5825-7538-43e4-a21f-42321d1e34b0" containerID="39fcd011ecf59fe7002a2a5033288c2e4a7a96d878978b21debb12723b011311" exitCode=0 Apr 16 23:56:57.132616 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:57.132590 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" event={"ID":"2c2a5825-7538-43e4-a21f-42321d1e34b0","Type":"ContainerDied","Data":"39fcd011ecf59fe7002a2a5033288c2e4a7a96d878978b21debb12723b011311"} Apr 16 23:56:58.137597 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:58.137562 2578 generic.go:358] "Generic (PLEG): container finished" podID="2c2a5825-7538-43e4-a21f-42321d1e34b0" containerID="4c78c3a9b9293cd420fdf64524e40bae84963c713a04ce743c34f0a970a09950" exitCode=0 Apr 16 23:56:58.138044 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:58.137640 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" event={"ID":"2c2a5825-7538-43e4-a21f-42321d1e34b0","Type":"ContainerDied","Data":"4c78c3a9b9293cd420fdf64524e40bae84963c713a04ce743c34f0a970a09950"} Apr 16 23:56:59.255533 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:59.255508 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" Apr 16 23:56:59.379861 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:59.379829 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c2a5825-7538-43e4-a21f-42321d1e34b0-bundle\") pod \"2c2a5825-7538-43e4-a21f-42321d1e34b0\" (UID: \"2c2a5825-7538-43e4-a21f-42321d1e34b0\") " Apr 16 23:56:59.380023 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:59.379887 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c2a5825-7538-43e4-a21f-42321d1e34b0-util\") pod \"2c2a5825-7538-43e4-a21f-42321d1e34b0\" (UID: \"2c2a5825-7538-43e4-a21f-42321d1e34b0\") " Apr 16 23:56:59.380023 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:59.379918 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvk5z\" (UniqueName: \"kubernetes.io/projected/2c2a5825-7538-43e4-a21f-42321d1e34b0-kube-api-access-gvk5z\") pod \"2c2a5825-7538-43e4-a21f-42321d1e34b0\" (UID: \"2c2a5825-7538-43e4-a21f-42321d1e34b0\") " Apr 16 23:56:59.380737 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:59.380702 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c2a5825-7538-43e4-a21f-42321d1e34b0-bundle" (OuterVolumeSpecName: "bundle") pod "2c2a5825-7538-43e4-a21f-42321d1e34b0" (UID: "2c2a5825-7538-43e4-a21f-42321d1e34b0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:56:59.381979 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:59.381956 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2a5825-7538-43e4-a21f-42321d1e34b0-kube-api-access-gvk5z" (OuterVolumeSpecName: "kube-api-access-gvk5z") pod "2c2a5825-7538-43e4-a21f-42321d1e34b0" (UID: "2c2a5825-7538-43e4-a21f-42321d1e34b0"). InnerVolumeSpecName "kube-api-access-gvk5z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:56:59.385188 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:59.385156 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c2a5825-7538-43e4-a21f-42321d1e34b0-util" (OuterVolumeSpecName: "util") pod "2c2a5825-7538-43e4-a21f-42321d1e34b0" (UID: "2c2a5825-7538-43e4-a21f-42321d1e34b0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:56:59.486044 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:59.485967 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c2a5825-7538-43e4-a21f-42321d1e34b0-bundle\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:56:59.486044 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:59.485996 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c2a5825-7538-43e4-a21f-42321d1e34b0-util\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:56:59.486044 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:56:59.486010 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gvk5z\" (UniqueName: \"kubernetes.io/projected/2c2a5825-7538-43e4-a21f-42321d1e34b0-kube-api-access-gvk5z\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:57:00.146427 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:00.146340 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" event={"ID":"2c2a5825-7538-43e4-a21f-42321d1e34b0","Type":"ContainerDied","Data":"1c89f76fe27c3b22177af190ee2c9a50031550303136b1997555ada47078332b"} Apr 16 23:57:00.146427 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:00.146372 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hh7r8" Apr 16 23:57:00.146633 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:00.146377 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c89f76fe27c3b22177af190ee2c9a50031550303136b1997555ada47078332b" Apr 16 23:57:17.319934 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.319899 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl"] Apr 16 23:57:17.320890 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.320866 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c2a5825-7538-43e4-a21f-42321d1e34b0" containerName="pull" Apr 16 23:57:17.320890 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.320889 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2a5825-7538-43e4-a21f-42321d1e34b0" containerName="pull" Apr 16 23:57:17.321053 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.320921 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c2a5825-7538-43e4-a21f-42321d1e34b0" containerName="util" Apr 16 23:57:17.321053 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.320929 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2a5825-7538-43e4-a21f-42321d1e34b0" containerName="util" Apr 16 23:57:17.321053 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.320938 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="2c2a5825-7538-43e4-a21f-42321d1e34b0" containerName="extract" Apr 16 23:57:17.321053 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.320946 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2a5825-7538-43e4-a21f-42321d1e34b0" containerName="extract" Apr 16 23:57:17.321053 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.321033 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c2a5825-7538-43e4-a21f-42321d1e34b0" containerName="extract" Apr 16 23:57:17.325123 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.325103 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.327749 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.327725 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 23:57:17.327863 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.327764 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-mdlrt\"" Apr 16 23:57:17.335640 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.335617 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl"] Apr 16 23:57:17.420843 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.420816 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbb6n\" (UniqueName: \"kubernetes.io/projected/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-kube-api-access-vbb6n\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.420993 ip-10-0-134-103 kubenswrapper[2578]: 
I0416 23:57:17.420848 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.420993 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.420875 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.420993 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.420893 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.420993 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.420917 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.420993 ip-10-0-134-103 
kubenswrapper[2578]: I0416 23:57:17.420983 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.421195 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.421009 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.421195 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.421081 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.421195 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.421098 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 
23:57:17.521520 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.521480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.521520 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.521512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.521769 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.521568 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbb6n\" (UniqueName: \"kubernetes.io/projected/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-kube-api-access-vbb6n\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.521769 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.521596 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.521769 ip-10-0-134-103 kubenswrapper[2578]: 
I0416 23:57:17.521624 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.521769 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.521646 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.521769 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.521671 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.521769 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.521711 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.522076 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.521830 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.522331 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.522308 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.522497 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.522432 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.522497 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.522476 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.522658 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.522492 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.522658 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.522509 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.524340 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.524320 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.524692 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.524669 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.542374 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.542350 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-istio-token\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.542506 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.542488 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbb6n\" (UniqueName: \"kubernetes.io/projected/fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565-kube-api-access-vbb6n\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl\" (UID: \"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.636727 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.636659 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:17.758635 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:17.758605 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl"] Apr 16 23:57:17.761098 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:57:17.761069 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfef97fb8_ba0f_4dbc_acf3_42c5bcdf9565.slice/crio-a423fa1f7e09b9c1a2cfaed0e60da90e159b8e16b7ac08bcd124596ae6f65db2 WatchSource:0}: Error finding container a423fa1f7e09b9c1a2cfaed0e60da90e159b8e16b7ac08bcd124596ae6f65db2: Status 404 returned error can't find the container with id a423fa1f7e09b9c1a2cfaed0e60da90e159b8e16b7ac08bcd124596ae6f65db2 Apr 16 23:57:18.210702 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:18.210668 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" 
event={"ID":"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565","Type":"ContainerStarted","Data":"a423fa1f7e09b9c1a2cfaed0e60da90e159b8e16b7ac08bcd124596ae6f65db2"} Apr 16 23:57:20.185858 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:20.185810 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 23:57:20.186137 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:20.185893 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 23:57:20.186137 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:20.185921 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 23:57:21.222989 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:21.222946 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" event={"ID":"fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565","Type":"ContainerStarted","Data":"efd5a3bf5a80e2f5e3659dc590323cdadcf0b9bf0b82807f17c3c70888a52170"} Apr 16 23:57:21.242527 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:21.242481 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" podStartSLOduration=1.8198716350000002 podStartE2EDuration="4.242469007s" podCreationTimestamp="2026-04-16 23:57:17 +0000 UTC" firstStartedPulling="2026-04-16 23:57:17.762924765 +0000 UTC m=+418.537949629" lastFinishedPulling="2026-04-16 23:57:20.18552212 +0000 UTC m=+420.960547001" observedRunningTime="2026-04-16 23:57:21.240246923 +0000 UTC m=+422.015271809" watchObservedRunningTime="2026-04-16 23:57:21.242469007 +0000 UTC 
m=+422.017493893" Apr 16 23:57:21.637427 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:21.637397 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:21.641929 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:21.641908 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:22.227177 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:22.227136 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:22.228123 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:22.228105 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl" Apr 16 23:57:29.956234 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:29.956162 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x289p"] Apr 16 23:57:29.959635 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:29.959615 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-x289p" Apr 16 23:57:29.962071 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:29.962047 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 23:57:29.962185 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:29.962163 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-swfxf\"" Apr 16 23:57:29.962299 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:29.962282 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 23:57:29.965856 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:29.965834 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x289p"] Apr 16 23:57:30.019260 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:30.019225 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flkm4\" (UniqueName: \"kubernetes.io/projected/165f271c-6a3d-4322-b35e-92c4aa6f4de6-kube-api-access-flkm4\") pod \"kuadrant-operator-catalog-x289p\" (UID: \"165f271c-6a3d-4322-b35e-92c4aa6f4de6\") " pod="kuadrant-system/kuadrant-operator-catalog-x289p" Apr 16 23:57:30.119858 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:30.119826 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flkm4\" (UniqueName: \"kubernetes.io/projected/165f271c-6a3d-4322-b35e-92c4aa6f4de6-kube-api-access-flkm4\") pod \"kuadrant-operator-catalog-x289p\" (UID: \"165f271c-6a3d-4322-b35e-92c4aa6f4de6\") " pod="kuadrant-system/kuadrant-operator-catalog-x289p" Apr 16 23:57:30.127523 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:30.127493 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flkm4\" (UniqueName: 
\"kubernetes.io/projected/165f271c-6a3d-4322-b35e-92c4aa6f4de6-kube-api-access-flkm4\") pod \"kuadrant-operator-catalog-x289p\" (UID: \"165f271c-6a3d-4322-b35e-92c4aa6f4de6\") " pod="kuadrant-system/kuadrant-operator-catalog-x289p" Apr 16 23:57:30.270989 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:30.270962 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-x289p" Apr 16 23:57:30.328274 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:30.327273 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x289p"] Apr 16 23:57:30.411811 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:30.411789 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x289p"] Apr 16 23:57:30.413572 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:57:30.413518 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod165f271c_6a3d_4322_b35e_92c4aa6f4de6.slice/crio-eb19d755f30f893ee70ec30f5248027a45085a4a00e7426e84469f52ddc39d7f WatchSource:0}: Error finding container eb19d755f30f893ee70ec30f5248027a45085a4a00e7426e84469f52ddc39d7f: Status 404 returned error can't find the container with id eb19d755f30f893ee70ec30f5248027a45085a4a00e7426e84469f52ddc39d7f Apr 16 23:57:30.527873 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:30.527807 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kkrrx"] Apr 16 23:57:30.532247 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:30.532227 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-kkrrx" Apr 16 23:57:30.538062 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:30.538043 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kkrrx"] Apr 16 23:57:30.631180 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:30.631148 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhfb8\" (UniqueName: \"kubernetes.io/projected/0bd81371-b60d-4829-a5c0-c9284e76ace1-kube-api-access-dhfb8\") pod \"kuadrant-operator-catalog-kkrrx\" (UID: \"0bd81371-b60d-4829-a5c0-c9284e76ace1\") " pod="kuadrant-system/kuadrant-operator-catalog-kkrrx" Apr 16 23:57:30.731659 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:30.731635 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhfb8\" (UniqueName: \"kubernetes.io/projected/0bd81371-b60d-4829-a5c0-c9284e76ace1-kube-api-access-dhfb8\") pod \"kuadrant-operator-catalog-kkrrx\" (UID: \"0bd81371-b60d-4829-a5c0-c9284e76ace1\") " pod="kuadrant-system/kuadrant-operator-catalog-kkrrx" Apr 16 23:57:30.739277 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:30.739256 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhfb8\" (UniqueName: \"kubernetes.io/projected/0bd81371-b60d-4829-a5c0-c9284e76ace1-kube-api-access-dhfb8\") pod \"kuadrant-operator-catalog-kkrrx\" (UID: \"0bd81371-b60d-4829-a5c0-c9284e76ace1\") " pod="kuadrant-system/kuadrant-operator-catalog-kkrrx" Apr 16 23:57:30.843463 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:30.843407 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-kkrrx" Apr 16 23:57:30.963155 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:30.963122 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kkrrx"] Apr 16 23:57:30.965120 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:57:30.965091 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bd81371_b60d_4829_a5c0_c9284e76ace1.slice/crio-14373fcaeec74b020fb45cfd3e9e3a8d6b7bcdf953990c2ca7338492dbb0fbd4 WatchSource:0}: Error finding container 14373fcaeec74b020fb45cfd3e9e3a8d6b7bcdf953990c2ca7338492dbb0fbd4: Status 404 returned error can't find the container with id 14373fcaeec74b020fb45cfd3e9e3a8d6b7bcdf953990c2ca7338492dbb0fbd4 Apr 16 23:57:31.258397 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:31.258365 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-kkrrx" event={"ID":"0bd81371-b60d-4829-a5c0-c9284e76ace1","Type":"ContainerStarted","Data":"14373fcaeec74b020fb45cfd3e9e3a8d6b7bcdf953990c2ca7338492dbb0fbd4"} Apr 16 23:57:31.259489 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:31.259458 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-x289p" event={"ID":"165f271c-6a3d-4322-b35e-92c4aa6f4de6","Type":"ContainerStarted","Data":"eb19d755f30f893ee70ec30f5248027a45085a4a00e7426e84469f52ddc39d7f"} Apr 16 23:57:33.268330 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:33.268292 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-kkrrx" event={"ID":"0bd81371-b60d-4829-a5c0-c9284e76ace1","Type":"ContainerStarted","Data":"4db1a6796e9aa7a2280727c8e3c6da13a5aee911c2fc9a1d7ff69b81620de376"} Apr 16 23:57:33.269632 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:33.269603 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-catalog-x289p" event={"ID":"165f271c-6a3d-4322-b35e-92c4aa6f4de6","Type":"ContainerStarted","Data":"01e259716911044e9da7abc48544db5af924f87e5639b5e7e41b3b81b8026dc6"} Apr 16 23:57:33.269756 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:33.269665 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-x289p" podUID="165f271c-6a3d-4322-b35e-92c4aa6f4de6" containerName="registry-server" containerID="cri-o://01e259716911044e9da7abc48544db5af924f87e5639b5e7e41b3b81b8026dc6" gracePeriod=2 Apr 16 23:57:33.283326 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:33.283285 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-kkrrx" podStartSLOduration=1.69509189 podStartE2EDuration="3.283274594s" podCreationTimestamp="2026-04-16 23:57:30 +0000 UTC" firstStartedPulling="2026-04-16 23:57:30.966416315 +0000 UTC m=+431.741441179" lastFinishedPulling="2026-04-16 23:57:32.554599019 +0000 UTC m=+433.329623883" observedRunningTime="2026-04-16 23:57:33.28135446 +0000 UTC m=+434.056379347" watchObservedRunningTime="2026-04-16 23:57:33.283274594 +0000 UTC m=+434.058299508" Apr 16 23:57:33.294371 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:33.294333 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-x289p" podStartSLOduration=2.157323164 podStartE2EDuration="4.294321365s" podCreationTimestamp="2026-04-16 23:57:29 +0000 UTC" firstStartedPulling="2026-04-16 23:57:30.414815855 +0000 UTC m=+431.189840719" lastFinishedPulling="2026-04-16 23:57:32.551814051 +0000 UTC m=+433.326838920" observedRunningTime="2026-04-16 23:57:33.293901272 +0000 UTC m=+434.068926152" watchObservedRunningTime="2026-04-16 23:57:33.294321365 +0000 UTC m=+434.069346251" Apr 16 23:57:33.510239 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:33.510216 2578 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-x289p" Apr 16 23:57:33.558528 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:33.558464 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flkm4\" (UniqueName: \"kubernetes.io/projected/165f271c-6a3d-4322-b35e-92c4aa6f4de6-kube-api-access-flkm4\") pod \"165f271c-6a3d-4322-b35e-92c4aa6f4de6\" (UID: \"165f271c-6a3d-4322-b35e-92c4aa6f4de6\") " Apr 16 23:57:33.560517 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:33.560498 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/165f271c-6a3d-4322-b35e-92c4aa6f4de6-kube-api-access-flkm4" (OuterVolumeSpecName: "kube-api-access-flkm4") pod "165f271c-6a3d-4322-b35e-92c4aa6f4de6" (UID: "165f271c-6a3d-4322-b35e-92c4aa6f4de6"). InnerVolumeSpecName "kube-api-access-flkm4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:57:33.660043 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:33.660012 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-flkm4\" (UniqueName: \"kubernetes.io/projected/165f271c-6a3d-4322-b35e-92c4aa6f4de6-kube-api-access-flkm4\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:57:34.274127 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:34.274090 2578 generic.go:358] "Generic (PLEG): container finished" podID="165f271c-6a3d-4322-b35e-92c4aa6f4de6" containerID="01e259716911044e9da7abc48544db5af924f87e5639b5e7e41b3b81b8026dc6" exitCode=0 Apr 16 23:57:34.274578 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:34.274136 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-x289p" event={"ID":"165f271c-6a3d-4322-b35e-92c4aa6f4de6","Type":"ContainerDied","Data":"01e259716911044e9da7abc48544db5af924f87e5639b5e7e41b3b81b8026dc6"} Apr 16 23:57:34.274578 ip-10-0-134-103 kubenswrapper[2578]: I0416 
23:57:34.274152 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-x289p" Apr 16 23:57:34.274578 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:34.274170 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-x289p" event={"ID":"165f271c-6a3d-4322-b35e-92c4aa6f4de6","Type":"ContainerDied","Data":"eb19d755f30f893ee70ec30f5248027a45085a4a00e7426e84469f52ddc39d7f"} Apr 16 23:57:34.274578 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:34.274193 2578 scope.go:117] "RemoveContainer" containerID="01e259716911044e9da7abc48544db5af924f87e5639b5e7e41b3b81b8026dc6" Apr 16 23:57:34.283163 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:34.283145 2578 scope.go:117] "RemoveContainer" containerID="01e259716911044e9da7abc48544db5af924f87e5639b5e7e41b3b81b8026dc6" Apr 16 23:57:34.283398 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:57:34.283378 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01e259716911044e9da7abc48544db5af924f87e5639b5e7e41b3b81b8026dc6\": container with ID starting with 01e259716911044e9da7abc48544db5af924f87e5639b5e7e41b3b81b8026dc6 not found: ID does not exist" containerID="01e259716911044e9da7abc48544db5af924f87e5639b5e7e41b3b81b8026dc6" Apr 16 23:57:34.283444 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:34.283406 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01e259716911044e9da7abc48544db5af924f87e5639b5e7e41b3b81b8026dc6"} err="failed to get container status \"01e259716911044e9da7abc48544db5af924f87e5639b5e7e41b3b81b8026dc6\": rpc error: code = NotFound desc = could not find container \"01e259716911044e9da7abc48544db5af924f87e5639b5e7e41b3b81b8026dc6\": container with ID starting with 01e259716911044e9da7abc48544db5af924f87e5639b5e7e41b3b81b8026dc6 not found: ID does not exist" Apr 16 
23:57:34.290789 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:34.290765 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x289p"] Apr 16 23:57:34.293236 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:34.293215 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x289p"] Apr 16 23:57:35.818106 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:35.818073 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="165f271c-6a3d-4322-b35e-92c4aa6f4de6" path="/var/lib/kubelet/pods/165f271c-6a3d-4322-b35e-92c4aa6f4de6/volumes" Apr 16 23:57:40.843966 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:40.843921 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-kkrrx" Apr 16 23:57:40.843966 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:40.843976 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-kkrrx" Apr 16 23:57:40.865142 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:40.865119 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-kkrrx" Apr 16 23:57:41.321316 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:41.321288 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-kkrrx" Apr 16 23:57:45.107354 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.107320 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p"] Apr 16 23:57:45.107718 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.107708 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="165f271c-6a3d-4322-b35e-92c4aa6f4de6" containerName="registry-server" Apr 16 23:57:45.107759 ip-10-0-134-103 
kubenswrapper[2578]: I0416 23:57:45.107721 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="165f271c-6a3d-4322-b35e-92c4aa6f4de6" containerName="registry-server" Apr 16 23:57:45.107797 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.107787 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="165f271c-6a3d-4322-b35e-92c4aa6f4de6" containerName="registry-server" Apr 16 23:57:45.112139 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.112121 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" Apr 16 23:57:45.114601 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.114576 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-skvbv\"" Apr 16 23:57:45.118382 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.118359 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p"] Apr 16 23:57:45.251418 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.251386 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlkvt\" (UniqueName: \"kubernetes.io/projected/982e434d-93b4-4c88-bb51-fe62c1113690-kube-api-access-mlkvt\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p\" (UID: \"982e434d-93b4-4c88-bb51-fe62c1113690\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" Apr 16 23:57:45.251593 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.251453 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/982e434d-93b4-4c88-bb51-fe62c1113690-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p\" (UID: \"982e434d-93b4-4c88-bb51-fe62c1113690\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" Apr 16 23:57:45.251593 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.251478 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/982e434d-93b4-4c88-bb51-fe62c1113690-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p\" (UID: \"982e434d-93b4-4c88-bb51-fe62c1113690\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" Apr 16 23:57:45.351988 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.351956 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/982e434d-93b4-4c88-bb51-fe62c1113690-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p\" (UID: \"982e434d-93b4-4c88-bb51-fe62c1113690\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" Apr 16 23:57:45.351988 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.351990 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/982e434d-93b4-4c88-bb51-fe62c1113690-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p\" (UID: \"982e434d-93b4-4c88-bb51-fe62c1113690\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" Apr 16 23:57:45.352293 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.352081 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlkvt\" (UniqueName: \"kubernetes.io/projected/982e434d-93b4-4c88-bb51-fe62c1113690-kube-api-access-mlkvt\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p\" (UID: \"982e434d-93b4-4c88-bb51-fe62c1113690\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" 
Apr 16 23:57:45.352364 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.352326 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/982e434d-93b4-4c88-bb51-fe62c1113690-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p\" (UID: \"982e434d-93b4-4c88-bb51-fe62c1113690\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" Apr 16 23:57:45.352414 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.352391 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/982e434d-93b4-4c88-bb51-fe62c1113690-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p\" (UID: \"982e434d-93b4-4c88-bb51-fe62c1113690\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" Apr 16 23:57:45.359587 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.359520 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlkvt\" (UniqueName: \"kubernetes.io/projected/982e434d-93b4-4c88-bb51-fe62c1113690-kube-api-access-mlkvt\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p\" (UID: \"982e434d-93b4-4c88-bb51-fe62c1113690\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" Apr 16 23:57:45.421826 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.421801 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" Apr 16 23:57:45.537725 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.537698 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p"] Apr 16 23:57:45.539685 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:57:45.539654 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod982e434d_93b4_4c88_bb51_fe62c1113690.slice/crio-d50cc6609afb08411739f182d3f8e433d6bac547eae5d7da11969257223c8479 WatchSource:0}: Error finding container d50cc6609afb08411739f182d3f8e433d6bac547eae5d7da11969257223c8479: Status 404 returned error can't find the container with id d50cc6609afb08411739f182d3f8e433d6bac547eae5d7da11969257223c8479 Apr 16 23:57:45.908563 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.908463 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm"] Apr 16 23:57:45.911941 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.911926 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" Apr 16 23:57:45.919270 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:45.919247 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm"] Apr 16 23:57:46.058197 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.058166 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe0bb47e-951d-4aff-8f6d-836c69c46190-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm\" (UID: \"fe0bb47e-951d-4aff-8f6d-836c69c46190\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" Apr 16 23:57:46.058333 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.058214 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe0bb47e-951d-4aff-8f6d-836c69c46190-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm\" (UID: \"fe0bb47e-951d-4aff-8f6d-836c69c46190\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" Apr 16 23:57:46.058333 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.058305 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtxvz\" (UniqueName: \"kubernetes.io/projected/fe0bb47e-951d-4aff-8f6d-836c69c46190-kube-api-access-dtxvz\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm\" (UID: \"fe0bb47e-951d-4aff-8f6d-836c69c46190\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" Apr 16 23:57:46.159067 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.159006 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dtxvz\" (UniqueName: \"kubernetes.io/projected/fe0bb47e-951d-4aff-8f6d-836c69c46190-kube-api-access-dtxvz\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm\" (UID: \"fe0bb47e-951d-4aff-8f6d-836c69c46190\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" Apr 16 23:57:46.159067 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.159062 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe0bb47e-951d-4aff-8f6d-836c69c46190-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm\" (UID: \"fe0bb47e-951d-4aff-8f6d-836c69c46190\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" Apr 16 23:57:46.159408 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.159097 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe0bb47e-951d-4aff-8f6d-836c69c46190-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm\" (UID: \"fe0bb47e-951d-4aff-8f6d-836c69c46190\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" Apr 16 23:57:46.159442 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.159403 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe0bb47e-951d-4aff-8f6d-836c69c46190-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm\" (UID: \"fe0bb47e-951d-4aff-8f6d-836c69c46190\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" Apr 16 23:57:46.159478 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.159437 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe0bb47e-951d-4aff-8f6d-836c69c46190-bundle\") pod 
\"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm\" (UID: \"fe0bb47e-951d-4aff-8f6d-836c69c46190\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" Apr 16 23:57:46.166687 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.166669 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtxvz\" (UniqueName: \"kubernetes.io/projected/fe0bb47e-951d-4aff-8f6d-836c69c46190-kube-api-access-dtxvz\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm\" (UID: \"fe0bb47e-951d-4aff-8f6d-836c69c46190\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" Apr 16 23:57:46.221839 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.221821 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" Apr 16 23:57:46.319093 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.319024 2578 generic.go:358] "Generic (PLEG): container finished" podID="982e434d-93b4-4c88-bb51-fe62c1113690" containerID="44bdae438bffd3b7261732a81dbf14f5e909ac40d8ff3b6b303d8a355fbf0e7a" exitCode=0 Apr 16 23:57:46.319093 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.319066 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" event={"ID":"982e434d-93b4-4c88-bb51-fe62c1113690","Type":"ContainerDied","Data":"44bdae438bffd3b7261732a81dbf14f5e909ac40d8ff3b6b303d8a355fbf0e7a"} Apr 16 23:57:46.319301 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.319108 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" event={"ID":"982e434d-93b4-4c88-bb51-fe62c1113690","Type":"ContainerStarted","Data":"d50cc6609afb08411739f182d3f8e433d6bac547eae5d7da11969257223c8479"} Apr 16 23:57:46.545214 ip-10-0-134-103 
kubenswrapper[2578]: I0416 23:57:46.545187 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm"] Apr 16 23:57:46.547081 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:57:46.547056 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe0bb47e_951d_4aff_8f6d_836c69c46190.slice/crio-348bc7231de6dc84c9fe0a690e512c4c68997982edb8d9c018ce18768f741f29 WatchSource:0}: Error finding container 348bc7231de6dc84c9fe0a690e512c4c68997982edb8d9c018ce18768f741f29: Status 404 returned error can't find the container with id 348bc7231de6dc84c9fe0a690e512c4c68997982edb8d9c018ce18768f741f29 Apr 16 23:57:46.706139 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.706112 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk"] Apr 16 23:57:46.709446 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.709432 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" Apr 16 23:57:46.717663 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.717641 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk"] Apr 16 23:57:46.864346 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.864280 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d8b191d-1c87-4735-9354-a90a59a1b45b-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk\" (UID: \"6d8b191d-1c87-4735-9354-a90a59a1b45b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" Apr 16 23:57:46.864346 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.864321 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7jjj\" (UniqueName: \"kubernetes.io/projected/6d8b191d-1c87-4735-9354-a90a59a1b45b-kube-api-access-l7jjj\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk\" (UID: \"6d8b191d-1c87-4735-9354-a90a59a1b45b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" Apr 16 23:57:46.864523 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.864412 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d8b191d-1c87-4735-9354-a90a59a1b45b-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk\" (UID: \"6d8b191d-1c87-4735-9354-a90a59a1b45b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" Apr 16 23:57:46.964820 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.964792 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/6d8b191d-1c87-4735-9354-a90a59a1b45b-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk\" (UID: \"6d8b191d-1c87-4735-9354-a90a59a1b45b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" Apr 16 23:57:46.964962 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.964869 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d8b191d-1c87-4735-9354-a90a59a1b45b-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk\" (UID: \"6d8b191d-1c87-4735-9354-a90a59a1b45b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" Apr 16 23:57:46.964962 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.964912 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7jjj\" (UniqueName: \"kubernetes.io/projected/6d8b191d-1c87-4735-9354-a90a59a1b45b-kube-api-access-l7jjj\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk\" (UID: \"6d8b191d-1c87-4735-9354-a90a59a1b45b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" Apr 16 23:57:46.965248 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.965229 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d8b191d-1c87-4735-9354-a90a59a1b45b-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk\" (UID: \"6d8b191d-1c87-4735-9354-a90a59a1b45b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" Apr 16 23:57:46.965302 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.965280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d8b191d-1c87-4735-9354-a90a59a1b45b-util\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk\" (UID: \"6d8b191d-1c87-4735-9354-a90a59a1b45b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" Apr 16 23:57:46.973360 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:46.973335 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7jjj\" (UniqueName: \"kubernetes.io/projected/6d8b191d-1c87-4735-9354-a90a59a1b45b-kube-api-access-l7jjj\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk\" (UID: \"6d8b191d-1c87-4735-9354-a90a59a1b45b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" Apr 16 23:57:47.019167 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.019143 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" Apr 16 23:57:47.104217 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.104117 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g"] Apr 16 23:57:47.109378 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.109361 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" Apr 16 23:57:47.115733 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.115659 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g"] Apr 16 23:57:47.170150 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.170126 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk"] Apr 16 23:57:47.200199 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:57:47.200174 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d8b191d_1c87_4735_9354_a90a59a1b45b.slice/crio-19591fbf2fffcfacb08d98c8243c2c0e535b743da9ead773be9a35e982b9dae9 WatchSource:0}: Error finding container 19591fbf2fffcfacb08d98c8243c2c0e535b743da9ead773be9a35e982b9dae9: Status 404 returned error can't find the container with id 19591fbf2fffcfacb08d98c8243c2c0e535b743da9ead773be9a35e982b9dae9 Apr 16 23:57:47.267917 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.267895 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fda81de8-7dbc-4bdb-8784-9999433285d0-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g\" (UID: \"fda81de8-7dbc-4bdb-8784-9999433285d0\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" Apr 16 23:57:47.268011 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.267930 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fda81de8-7dbc-4bdb-8784-9999433285d0-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g\" (UID: \"fda81de8-7dbc-4bdb-8784-9999433285d0\") 
" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" Apr 16 23:57:47.268011 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.267952 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8g4t\" (UniqueName: \"kubernetes.io/projected/fda81de8-7dbc-4bdb-8784-9999433285d0-kube-api-access-q8g4t\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g\" (UID: \"fda81de8-7dbc-4bdb-8784-9999433285d0\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" Apr 16 23:57:47.323484 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.323459 2578 generic.go:358] "Generic (PLEG): container finished" podID="6d8b191d-1c87-4735-9354-a90a59a1b45b" containerID="8f64447dd1776e3898a34d23d2ec6601aa298fe988b56eaf7989b4762385e3d7" exitCode=0 Apr 16 23:57:47.323585 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.323532 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" event={"ID":"6d8b191d-1c87-4735-9354-a90a59a1b45b","Type":"ContainerDied","Data":"8f64447dd1776e3898a34d23d2ec6601aa298fe988b56eaf7989b4762385e3d7"} Apr 16 23:57:47.323585 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.323577 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" event={"ID":"6d8b191d-1c87-4735-9354-a90a59a1b45b","Type":"ContainerStarted","Data":"19591fbf2fffcfacb08d98c8243c2c0e535b743da9ead773be9a35e982b9dae9"} Apr 16 23:57:47.325067 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.325047 2578 generic.go:358] "Generic (PLEG): container finished" podID="982e434d-93b4-4c88-bb51-fe62c1113690" containerID="54a86ec58694d8036cec3fcdb2a9be74e85abac8516ceea7b75a65d19bf36084" exitCode=0 Apr 16 23:57:47.325170 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.325127 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" event={"ID":"982e434d-93b4-4c88-bb51-fe62c1113690","Type":"ContainerDied","Data":"54a86ec58694d8036cec3fcdb2a9be74e85abac8516ceea7b75a65d19bf36084"} Apr 16 23:57:47.326601 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.326580 2578 generic.go:358] "Generic (PLEG): container finished" podID="fe0bb47e-951d-4aff-8f6d-836c69c46190" containerID="74ba3a4f5099b64404d7809435bf3a29cbdb85c896bbdfdbe00ec9613ab6f42a" exitCode=0 Apr 16 23:57:47.326689 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.326630 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" event={"ID":"fe0bb47e-951d-4aff-8f6d-836c69c46190","Type":"ContainerDied","Data":"74ba3a4f5099b64404d7809435bf3a29cbdb85c896bbdfdbe00ec9613ab6f42a"} Apr 16 23:57:47.326689 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.326652 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" event={"ID":"fe0bb47e-951d-4aff-8f6d-836c69c46190","Type":"ContainerStarted","Data":"348bc7231de6dc84c9fe0a690e512c4c68997982edb8d9c018ce18768f741f29"} Apr 16 23:57:47.368703 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.368645 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fda81de8-7dbc-4bdb-8784-9999433285d0-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g\" (UID: \"fda81de8-7dbc-4bdb-8784-9999433285d0\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" Apr 16 23:57:47.368703 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.368694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/fda81de8-7dbc-4bdb-8784-9999433285d0-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g\" (UID: \"fda81de8-7dbc-4bdb-8784-9999433285d0\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" Apr 16 23:57:47.368869 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.368714 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8g4t\" (UniqueName: \"kubernetes.io/projected/fda81de8-7dbc-4bdb-8784-9999433285d0-kube-api-access-q8g4t\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g\" (UID: \"fda81de8-7dbc-4bdb-8784-9999433285d0\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" Apr 16 23:57:47.369112 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.369084 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fda81de8-7dbc-4bdb-8784-9999433285d0-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g\" (UID: \"fda81de8-7dbc-4bdb-8784-9999433285d0\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" Apr 16 23:57:47.369112 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.369098 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fda81de8-7dbc-4bdb-8784-9999433285d0-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g\" (UID: \"fda81de8-7dbc-4bdb-8784-9999433285d0\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" Apr 16 23:57:47.377051 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.377029 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8g4t\" (UniqueName: \"kubernetes.io/projected/fda81de8-7dbc-4bdb-8784-9999433285d0-kube-api-access-q8g4t\") pod 
\"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g\" (UID: \"fda81de8-7dbc-4bdb-8784-9999433285d0\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" Apr 16 23:57:47.420956 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.420929 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" Apr 16 23:57:47.541101 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:47.541077 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g"] Apr 16 23:57:47.543341 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:57:47.543315 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfda81de8_7dbc_4bdb_8784_9999433285d0.slice/crio-26917546e5cf76ee0dcad14e9087b11e145fd343b657e1f7dd448f8c987b59f9 WatchSource:0}: Error finding container 26917546e5cf76ee0dcad14e9087b11e145fd343b657e1f7dd448f8c987b59f9: Status 404 returned error can't find the container with id 26917546e5cf76ee0dcad14e9087b11e145fd343b657e1f7dd448f8c987b59f9 Apr 16 23:57:48.332278 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:48.332204 2578 generic.go:358] "Generic (PLEG): container finished" podID="fe0bb47e-951d-4aff-8f6d-836c69c46190" containerID="7afb355ed243dd58e914d43ba528a384ca21c277f8a041eb66013bdb35b9b811" exitCode=0 Apr 16 23:57:48.332645 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:48.332327 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" event={"ID":"fe0bb47e-951d-4aff-8f6d-836c69c46190","Type":"ContainerDied","Data":"7afb355ed243dd58e914d43ba528a384ca21c277f8a041eb66013bdb35b9b811"} Apr 16 23:57:48.333691 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:48.333667 2578 generic.go:358] "Generic (PLEG): 
container finished" podID="fda81de8-7dbc-4bdb-8784-9999433285d0" containerID="15a6dbe0707b241c74386ecb770f5a8256ca1bb1069bf2ccd0dc0008c80fb15f" exitCode=0 Apr 16 23:57:48.333777 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:48.333711 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" event={"ID":"fda81de8-7dbc-4bdb-8784-9999433285d0","Type":"ContainerDied","Data":"15a6dbe0707b241c74386ecb770f5a8256ca1bb1069bf2ccd0dc0008c80fb15f"} Apr 16 23:57:48.333777 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:48.333745 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" event={"ID":"fda81de8-7dbc-4bdb-8784-9999433285d0","Type":"ContainerStarted","Data":"26917546e5cf76ee0dcad14e9087b11e145fd343b657e1f7dd448f8c987b59f9"} Apr 16 23:57:48.336490 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:48.335980 2578 generic.go:358] "Generic (PLEG): container finished" podID="6d8b191d-1c87-4735-9354-a90a59a1b45b" containerID="d9dd87594287d50cb387a5a5d2ecedb549bd0f141c94f3228282c98d187ea844" exitCode=0 Apr 16 23:57:48.336490 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:48.336061 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" event={"ID":"6d8b191d-1c87-4735-9354-a90a59a1b45b","Type":"ContainerDied","Data":"d9dd87594287d50cb387a5a5d2ecedb549bd0f141c94f3228282c98d187ea844"} Apr 16 23:57:48.340682 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:48.340653 2578 generic.go:358] "Generic (PLEG): container finished" podID="982e434d-93b4-4c88-bb51-fe62c1113690" containerID="bc7a77d57a5ff148f0ad45b70a29bc3310dbb4f93533e9f788578d871ae3414b" exitCode=0 Apr 16 23:57:48.340772 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:48.340711 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" event={"ID":"982e434d-93b4-4c88-bb51-fe62c1113690","Type":"ContainerDied","Data":"bc7a77d57a5ff148f0ad45b70a29bc3310dbb4f93533e9f788578d871ae3414b"} Apr 16 23:57:49.346072 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:49.345987 2578 generic.go:358] "Generic (PLEG): container finished" podID="fda81de8-7dbc-4bdb-8784-9999433285d0" containerID="5099682d9dc9c95b6a3d9a5c965ef97c96fb09de27a0a4abfb143702e4649d90" exitCode=0 Apr 16 23:57:49.346432 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:49.346066 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" event={"ID":"fda81de8-7dbc-4bdb-8784-9999433285d0","Type":"ContainerDied","Data":"5099682d9dc9c95b6a3d9a5c965ef97c96fb09de27a0a4abfb143702e4649d90"} Apr 16 23:57:49.348142 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:49.348118 2578 generic.go:358] "Generic (PLEG): container finished" podID="6d8b191d-1c87-4735-9354-a90a59a1b45b" containerID="2d5bdc9fed8fa2b38343e6c058800d8a626e68788a5ea20a923fad9b302c3d49" exitCode=0 Apr 16 23:57:49.348230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:49.348199 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" event={"ID":"6d8b191d-1c87-4735-9354-a90a59a1b45b","Type":"ContainerDied","Data":"2d5bdc9fed8fa2b38343e6c058800d8a626e68788a5ea20a923fad9b302c3d49"} Apr 16 23:57:49.350107 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:49.350087 2578 generic.go:358] "Generic (PLEG): container finished" podID="fe0bb47e-951d-4aff-8f6d-836c69c46190" containerID="ea295b16cfcdace56b88ea4cc8d74f7ab7f62cf61de39deeeb284e127dd35ffa" exitCode=0 Apr 16 23:57:49.350181 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:49.350161 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" event={"ID":"fe0bb47e-951d-4aff-8f6d-836c69c46190","Type":"ContainerDied","Data":"ea295b16cfcdace56b88ea4cc8d74f7ab7f62cf61de39deeeb284e127dd35ffa"} Apr 16 23:57:49.474489 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:49.474466 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" Apr 16 23:57:49.586354 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:49.586313 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/982e434d-93b4-4c88-bb51-fe62c1113690-bundle\") pod \"982e434d-93b4-4c88-bb51-fe62c1113690\" (UID: \"982e434d-93b4-4c88-bb51-fe62c1113690\") " Apr 16 23:57:49.586458 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:49.586383 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/982e434d-93b4-4c88-bb51-fe62c1113690-util\") pod \"982e434d-93b4-4c88-bb51-fe62c1113690\" (UID: \"982e434d-93b4-4c88-bb51-fe62c1113690\") " Apr 16 23:57:49.586458 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:49.586429 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlkvt\" (UniqueName: \"kubernetes.io/projected/982e434d-93b4-4c88-bb51-fe62c1113690-kube-api-access-mlkvt\") pod \"982e434d-93b4-4c88-bb51-fe62c1113690\" (UID: \"982e434d-93b4-4c88-bb51-fe62c1113690\") " Apr 16 23:57:49.586743 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:49.586722 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/982e434d-93b4-4c88-bb51-fe62c1113690-bundle" (OuterVolumeSpecName: "bundle") pod "982e434d-93b4-4c88-bb51-fe62c1113690" (UID: "982e434d-93b4-4c88-bb51-fe62c1113690"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:57:49.588388 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:49.588360 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/982e434d-93b4-4c88-bb51-fe62c1113690-kube-api-access-mlkvt" (OuterVolumeSpecName: "kube-api-access-mlkvt") pod "982e434d-93b4-4c88-bb51-fe62c1113690" (UID: "982e434d-93b4-4c88-bb51-fe62c1113690"). InnerVolumeSpecName "kube-api-access-mlkvt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:57:49.591250 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:49.591224 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/982e434d-93b4-4c88-bb51-fe62c1113690-util" (OuterVolumeSpecName: "util") pod "982e434d-93b4-4c88-bb51-fe62c1113690" (UID: "982e434d-93b4-4c88-bb51-fe62c1113690"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:57:49.687925 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:49.687870 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/982e434d-93b4-4c88-bb51-fe62c1113690-bundle\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:57:49.687925 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:49.687892 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/982e434d-93b4-4c88-bb51-fe62c1113690-util\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:57:49.687925 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:49.687902 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mlkvt\" (UniqueName: \"kubernetes.io/projected/982e434d-93b4-4c88-bb51-fe62c1113690-kube-api-access-mlkvt\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:57:50.355369 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.355329 2578 generic.go:358] "Generic 
(PLEG): container finished" podID="fda81de8-7dbc-4bdb-8784-9999433285d0" containerID="a82ca9b23a7aa3a6fca937707434c3d2a1e101b624a9bf3c6391433f21c31e50" exitCode=0 Apr 16 23:57:50.355781 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.355418 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" event={"ID":"fda81de8-7dbc-4bdb-8784-9999433285d0","Type":"ContainerDied","Data":"a82ca9b23a7aa3a6fca937707434c3d2a1e101b624a9bf3c6391433f21c31e50"} Apr 16 23:57:50.357115 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.357092 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" Apr 16 23:57:50.357255 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.357093 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p" event={"ID":"982e434d-93b4-4c88-bb51-fe62c1113690","Type":"ContainerDied","Data":"d50cc6609afb08411739f182d3f8e433d6bac547eae5d7da11969257223c8479"} Apr 16 23:57:50.357255 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.357196 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d50cc6609afb08411739f182d3f8e433d6bac547eae5d7da11969257223c8479" Apr 16 23:57:50.503676 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.503655 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" Apr 16 23:57:50.506772 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.506753 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" Apr 16 23:57:50.595761 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.595732 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtxvz\" (UniqueName: \"kubernetes.io/projected/fe0bb47e-951d-4aff-8f6d-836c69c46190-kube-api-access-dtxvz\") pod \"fe0bb47e-951d-4aff-8f6d-836c69c46190\" (UID: \"fe0bb47e-951d-4aff-8f6d-836c69c46190\") " Apr 16 23:57:50.595910 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.595785 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe0bb47e-951d-4aff-8f6d-836c69c46190-util\") pod \"fe0bb47e-951d-4aff-8f6d-836c69c46190\" (UID: \"fe0bb47e-951d-4aff-8f6d-836c69c46190\") " Apr 16 23:57:50.595910 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.595806 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d8b191d-1c87-4735-9354-a90a59a1b45b-util\") pod \"6d8b191d-1c87-4735-9354-a90a59a1b45b\" (UID: \"6d8b191d-1c87-4735-9354-a90a59a1b45b\") " Apr 16 23:57:50.595910 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.595829 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe0bb47e-951d-4aff-8f6d-836c69c46190-bundle\") pod \"fe0bb47e-951d-4aff-8f6d-836c69c46190\" (UID: \"fe0bb47e-951d-4aff-8f6d-836c69c46190\") " Apr 16 23:57:50.595910 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.595847 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7jjj\" (UniqueName: \"kubernetes.io/projected/6d8b191d-1c87-4735-9354-a90a59a1b45b-kube-api-access-l7jjj\") pod \"6d8b191d-1c87-4735-9354-a90a59a1b45b\" (UID: \"6d8b191d-1c87-4735-9354-a90a59a1b45b\") " Apr 16 23:57:50.595910 ip-10-0-134-103 
kubenswrapper[2578]: I0416 23:57:50.595879 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d8b191d-1c87-4735-9354-a90a59a1b45b-bundle\") pod \"6d8b191d-1c87-4735-9354-a90a59a1b45b\" (UID: \"6d8b191d-1c87-4735-9354-a90a59a1b45b\") " Apr 16 23:57:50.596531 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.596502 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d8b191d-1c87-4735-9354-a90a59a1b45b-bundle" (OuterVolumeSpecName: "bundle") pod "6d8b191d-1c87-4735-9354-a90a59a1b45b" (UID: "6d8b191d-1c87-4735-9354-a90a59a1b45b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:57:50.596645 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.596522 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe0bb47e-951d-4aff-8f6d-836c69c46190-bundle" (OuterVolumeSpecName: "bundle") pod "fe0bb47e-951d-4aff-8f6d-836c69c46190" (UID: "fe0bb47e-951d-4aff-8f6d-836c69c46190"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:57:50.597958 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.597933 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0bb47e-951d-4aff-8f6d-836c69c46190-kube-api-access-dtxvz" (OuterVolumeSpecName: "kube-api-access-dtxvz") pod "fe0bb47e-951d-4aff-8f6d-836c69c46190" (UID: "fe0bb47e-951d-4aff-8f6d-836c69c46190"). InnerVolumeSpecName "kube-api-access-dtxvz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:57:50.598040 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.597973 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d8b191d-1c87-4735-9354-a90a59a1b45b-kube-api-access-l7jjj" (OuterVolumeSpecName: "kube-api-access-l7jjj") pod "6d8b191d-1c87-4735-9354-a90a59a1b45b" (UID: "6d8b191d-1c87-4735-9354-a90a59a1b45b"). InnerVolumeSpecName "kube-api-access-l7jjj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:57:50.601428 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.601408 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe0bb47e-951d-4aff-8f6d-836c69c46190-util" (OuterVolumeSpecName: "util") pod "fe0bb47e-951d-4aff-8f6d-836c69c46190" (UID: "fe0bb47e-951d-4aff-8f6d-836c69c46190"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:57:50.601507 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.601474 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d8b191d-1c87-4735-9354-a90a59a1b45b-util" (OuterVolumeSpecName: "util") pod "6d8b191d-1c87-4735-9354-a90a59a1b45b" (UID: "6d8b191d-1c87-4735-9354-a90a59a1b45b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:57:50.696901 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.696852 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dtxvz\" (UniqueName: \"kubernetes.io/projected/fe0bb47e-951d-4aff-8f6d-836c69c46190-kube-api-access-dtxvz\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:57:50.696901 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.696874 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe0bb47e-951d-4aff-8f6d-836c69c46190-util\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:57:50.696901 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.696885 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d8b191d-1c87-4735-9354-a90a59a1b45b-util\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:57:50.696901 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.696893 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe0bb47e-951d-4aff-8f6d-836c69c46190-bundle\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:57:50.696901 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.696901 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l7jjj\" (UniqueName: \"kubernetes.io/projected/6d8b191d-1c87-4735-9354-a90a59a1b45b-kube-api-access-l7jjj\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:57:50.697092 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:50.696911 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d8b191d-1c87-4735-9354-a90a59a1b45b-bundle\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:57:51.362864 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:51.362836 2578 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" Apr 16 23:57:51.363311 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:51.362842 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk" event={"ID":"6d8b191d-1c87-4735-9354-a90a59a1b45b","Type":"ContainerDied","Data":"19591fbf2fffcfacb08d98c8243c2c0e535b743da9ead773be9a35e982b9dae9"} Apr 16 23:57:51.363311 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:51.362948 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19591fbf2fffcfacb08d98c8243c2c0e535b743da9ead773be9a35e982b9dae9" Apr 16 23:57:51.364551 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:51.364517 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" Apr 16 23:57:51.364675 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:51.364565 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm" event={"ID":"fe0bb47e-951d-4aff-8f6d-836c69c46190","Type":"ContainerDied","Data":"348bc7231de6dc84c9fe0a690e512c4c68997982edb8d9c018ce18768f741f29"} Apr 16 23:57:51.364675 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:51.364594 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="348bc7231de6dc84c9fe0a690e512c4c68997982edb8d9c018ce18768f741f29" Apr 16 23:57:51.474747 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:51.474727 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" Apr 16 23:57:51.604168 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:51.604140 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fda81de8-7dbc-4bdb-8784-9999433285d0-util\") pod \"fda81de8-7dbc-4bdb-8784-9999433285d0\" (UID: \"fda81de8-7dbc-4bdb-8784-9999433285d0\") " Apr 16 23:57:51.604325 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:51.604176 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8g4t\" (UniqueName: \"kubernetes.io/projected/fda81de8-7dbc-4bdb-8784-9999433285d0-kube-api-access-q8g4t\") pod \"fda81de8-7dbc-4bdb-8784-9999433285d0\" (UID: \"fda81de8-7dbc-4bdb-8784-9999433285d0\") " Apr 16 23:57:51.604325 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:51.604229 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fda81de8-7dbc-4bdb-8784-9999433285d0-bundle\") pod \"fda81de8-7dbc-4bdb-8784-9999433285d0\" (UID: \"fda81de8-7dbc-4bdb-8784-9999433285d0\") " Apr 16 23:57:51.604716 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:51.604690 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fda81de8-7dbc-4bdb-8784-9999433285d0-bundle" (OuterVolumeSpecName: "bundle") pod "fda81de8-7dbc-4bdb-8784-9999433285d0" (UID: "fda81de8-7dbc-4bdb-8784-9999433285d0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:57:51.606335 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:51.606306 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda81de8-7dbc-4bdb-8784-9999433285d0-kube-api-access-q8g4t" (OuterVolumeSpecName: "kube-api-access-q8g4t") pod "fda81de8-7dbc-4bdb-8784-9999433285d0" (UID: "fda81de8-7dbc-4bdb-8784-9999433285d0"). InnerVolumeSpecName "kube-api-access-q8g4t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:57:51.609446 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:51.609426 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fda81de8-7dbc-4bdb-8784-9999433285d0-util" (OuterVolumeSpecName: "util") pod "fda81de8-7dbc-4bdb-8784-9999433285d0" (UID: "fda81de8-7dbc-4bdb-8784-9999433285d0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:57:51.705090 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:51.705040 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fda81de8-7dbc-4bdb-8784-9999433285d0-util\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:57:51.705090 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:51.705059 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q8g4t\" (UniqueName: \"kubernetes.io/projected/fda81de8-7dbc-4bdb-8784-9999433285d0-kube-api-access-q8g4t\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:57:51.705090 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:51.705070 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fda81de8-7dbc-4bdb-8784-9999433285d0-bundle\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:57:52.369588 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:52.369560 2578 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" Apr 16 23:57:52.369915 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:52.369555 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g" event={"ID":"fda81de8-7dbc-4bdb-8784-9999433285d0","Type":"ContainerDied","Data":"26917546e5cf76ee0dcad14e9087b11e145fd343b657e1f7dd448f8c987b59f9"} Apr 16 23:57:52.369915 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:57:52.369664 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26917546e5cf76ee0dcad14e9087b11e145fd343b657e1f7dd448f8c987b59f9" Apr 16 23:58:01.698949 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.698918 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-x699g"] Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699300 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe0bb47e-951d-4aff-8f6d-836c69c46190" containerName="pull" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699318 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0bb47e-951d-4aff-8f6d-836c69c46190" containerName="pull" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699331 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d8b191d-1c87-4735-9354-a90a59a1b45b" containerName="pull" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699337 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8b191d-1c87-4735-9354-a90a59a1b45b" containerName="pull" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699348 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="982e434d-93b4-4c88-bb51-fe62c1113690" containerName="pull" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699368 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="982e434d-93b4-4c88-bb51-fe62c1113690" containerName="pull" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699375 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d8b191d-1c87-4735-9354-a90a59a1b45b" containerName="util" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699381 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8b191d-1c87-4735-9354-a90a59a1b45b" containerName="util" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699394 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fda81de8-7dbc-4bdb-8784-9999433285d0" containerName="pull" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699401 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda81de8-7dbc-4bdb-8784-9999433285d0" containerName="pull" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699413 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fda81de8-7dbc-4bdb-8784-9999433285d0" containerName="extract" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699418 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda81de8-7dbc-4bdb-8784-9999433285d0" containerName="extract" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699426 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fda81de8-7dbc-4bdb-8784-9999433285d0" containerName="util" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699431 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda81de8-7dbc-4bdb-8784-9999433285d0" containerName="util" Apr 16 
23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699443 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="982e434d-93b4-4c88-bb51-fe62c1113690" containerName="util" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699451 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="982e434d-93b4-4c88-bb51-fe62c1113690" containerName="util" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699457 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe0bb47e-951d-4aff-8f6d-836c69c46190" containerName="util" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699462 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0bb47e-951d-4aff-8f6d-836c69c46190" containerName="util" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699474 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe0bb47e-951d-4aff-8f6d-836c69c46190" containerName="extract" Apr 16 23:58:01.699473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699482 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0bb47e-951d-4aff-8f6d-836c69c46190" containerName="extract" Apr 16 23:58:01.700076 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699490 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d8b191d-1c87-4735-9354-a90a59a1b45b" containerName="extract" Apr 16 23:58:01.700076 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699495 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8b191d-1c87-4735-9354-a90a59a1b45b" containerName="extract" Apr 16 23:58:01.700076 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699502 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="982e434d-93b4-4c88-bb51-fe62c1113690" containerName="extract" Apr 16 23:58:01.700076 ip-10-0-134-103 
kubenswrapper[2578]: I0416 23:58:01.699507 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="982e434d-93b4-4c88-bb51-fe62c1113690" containerName="extract" Apr 16 23:58:01.700076 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699582 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="982e434d-93b4-4c88-bb51-fe62c1113690" containerName="extract" Apr 16 23:58:01.700076 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699592 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe0bb47e-951d-4aff-8f6d-836c69c46190" containerName="extract" Apr 16 23:58:01.700076 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699598 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fda81de8-7dbc-4bdb-8784-9999433285d0" containerName="extract" Apr 16 23:58:01.700076 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.699604 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6d8b191d-1c87-4735-9354-a90a59a1b45b" containerName="extract" Apr 16 23:58:01.702737 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.702721 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-x699g" Apr 16 23:58:01.707460 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.707440 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-t6wv9\"" Apr 16 23:58:01.707801 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.707783 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 23:58:01.713363 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.713341 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-x699g"] Apr 16 23:58:01.888061 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.888029 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl6kx\" (UniqueName: \"kubernetes.io/projected/d5e3eebd-0062-436a-b898-d09992976a88-kube-api-access-gl6kx\") pod \"dns-operator-controller-manager-648d5c98bc-x699g\" (UID: \"d5e3eebd-0062-436a-b898-d09992976a88\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-x699g" Apr 16 23:58:01.989243 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.989174 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl6kx\" (UniqueName: \"kubernetes.io/projected/d5e3eebd-0062-436a-b898-d09992976a88-kube-api-access-gl6kx\") pod \"dns-operator-controller-manager-648d5c98bc-x699g\" (UID: \"d5e3eebd-0062-436a-b898-d09992976a88\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-x699g" Apr 16 23:58:01.998990 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:01.998962 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl6kx\" (UniqueName: \"kubernetes.io/projected/d5e3eebd-0062-436a-b898-d09992976a88-kube-api-access-gl6kx\") pod 
\"dns-operator-controller-manager-648d5c98bc-x699g\" (UID: \"d5e3eebd-0062-436a-b898-d09992976a88\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-x699g" Apr 16 23:58:02.013407 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:02.013384 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-x699g" Apr 16 23:58:02.342604 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:02.342576 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-x699g"] Apr 16 23:58:02.344255 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:58:02.344227 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5e3eebd_0062_436a_b898_d09992976a88.slice/crio-44afcc964fe79313f4c648216039973ef75de89fbc461c1c89312d1b96870c48 WatchSource:0}: Error finding container 44afcc964fe79313f4c648216039973ef75de89fbc461c1c89312d1b96870c48: Status 404 returned error can't find the container with id 44afcc964fe79313f4c648216039973ef75de89fbc461c1c89312d1b96870c48 Apr 16 23:58:02.404815 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:02.404787 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-x699g" event={"ID":"d5e3eebd-0062-436a-b898-d09992976a88","Type":"ContainerStarted","Data":"44afcc964fe79313f4c648216039973ef75de89fbc461c1c89312d1b96870c48"} Apr 16 23:58:05.418705 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:05.418669 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-x699g" event={"ID":"d5e3eebd-0062-436a-b898-d09992976a88","Type":"ContainerStarted","Data":"19bf1f34e3e3376d5c3b759d8c4ea2cfb7308793595b8a043e2e845a4a770c30"} Apr 16 23:58:05.419075 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:05.418767 2578 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-x699g" Apr 16 23:58:05.454700 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:05.454652 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-x699g" podStartSLOduration=2.257284761 podStartE2EDuration="4.454638951s" podCreationTimestamp="2026-04-16 23:58:01 +0000 UTC" firstStartedPulling="2026-04-16 23:58:02.346011916 +0000 UTC m=+463.121036780" lastFinishedPulling="2026-04-16 23:58:04.543366106 +0000 UTC m=+465.318390970" observedRunningTime="2026-04-16 23:58:05.452419525 +0000 UTC m=+466.227444411" watchObservedRunningTime="2026-04-16 23:58:05.454638951 +0000 UTC m=+466.229663834" Apr 16 23:58:05.600335 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:05.600305 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh"] Apr 16 23:58:05.603750 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:05.603733 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" Apr 16 23:58:05.606406 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:05.606381 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-6r82n\"" Apr 16 23:58:05.613027 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:05.613006 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh"] Apr 16 23:58:05.719163 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:05.719083 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4tpj\" (UniqueName: \"kubernetes.io/projected/c6f23b98-b4c4-4c27-a23b-3e57169ecf49-kube-api-access-b4tpj\") pod \"limitador-operator-controller-manager-85c4996f8c-6hvnh\" (UID: \"c6f23b98-b4c4-4c27-a23b-3e57169ecf49\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" Apr 16 23:58:05.825737 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:05.821010 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4tpj\" (UniqueName: \"kubernetes.io/projected/c6f23b98-b4c4-4c27-a23b-3e57169ecf49-kube-api-access-b4tpj\") pod \"limitador-operator-controller-manager-85c4996f8c-6hvnh\" (UID: \"c6f23b98-b4c4-4c27-a23b-3e57169ecf49\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" Apr 16 23:58:05.829252 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:05.829219 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4tpj\" (UniqueName: \"kubernetes.io/projected/c6f23b98-b4c4-4c27-a23b-3e57169ecf49-kube-api-access-b4tpj\") pod \"limitador-operator-controller-manager-85c4996f8c-6hvnh\" (UID: \"c6f23b98-b4c4-4c27-a23b-3e57169ecf49\") " 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" Apr 16 23:58:05.914784 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:05.914754 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" Apr 16 23:58:06.034884 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:06.034841 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh"] Apr 16 23:58:06.036731 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:58:06.036702 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6f23b98_b4c4_4c27_a23b_3e57169ecf49.slice/crio-3279963d3f4655ab6de45efa4a06bd042d8c03ec2bd053048f31228dd6b6f127 WatchSource:0}: Error finding container 3279963d3f4655ab6de45efa4a06bd042d8c03ec2bd053048f31228dd6b6f127: Status 404 returned error can't find the container with id 3279963d3f4655ab6de45efa4a06bd042d8c03ec2bd053048f31228dd6b6f127 Apr 16 23:58:06.423484 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:06.423452 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" event={"ID":"c6f23b98-b4c4-4c27-a23b-3e57169ecf49","Type":"ContainerStarted","Data":"3279963d3f4655ab6de45efa4a06bd042d8c03ec2bd053048f31228dd6b6f127"} Apr 16 23:58:08.432026 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:08.431988 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" event={"ID":"c6f23b98-b4c4-4c27-a23b-3e57169ecf49","Type":"ContainerStarted","Data":"83540f16bbdf719d3ba7e3d3e3e55ab80eaeb42efbe852a34f30058bb00baa94"} Apr 16 23:58:08.432392 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:08.432035 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" Apr 16 23:58:08.451247 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:08.451200 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" podStartSLOduration=1.733915927 podStartE2EDuration="3.451187334s" podCreationTimestamp="2026-04-16 23:58:05 +0000 UTC" firstStartedPulling="2026-04-16 23:58:06.039237226 +0000 UTC m=+466.814262090" lastFinishedPulling="2026-04-16 23:58:07.756508612 +0000 UTC m=+468.531533497" observedRunningTime="2026-04-16 23:58:08.448222334 +0000 UTC m=+469.223247219" watchObservedRunningTime="2026-04-16 23:58:08.451187334 +0000 UTC m=+469.226212220" Apr 16 23:58:14.318854 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:14.318823 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq"] Apr 16 23:58:14.322368 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:14.322353 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" Apr 16 23:58:14.325127 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:14.325104 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-rdzkz\"" Apr 16 23:58:14.333782 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:14.333760 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq"] Apr 16 23:58:14.386716 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:14.386694 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3f71d2c1-ac5f-4292-8735-16fb86f3b981-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fjrnq\" (UID: \"3f71d2c1-ac5f-4292-8735-16fb86f3b981\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" Apr 16 23:58:14.386822 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:14.386749 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p5r8\" (UniqueName: \"kubernetes.io/projected/3f71d2c1-ac5f-4292-8735-16fb86f3b981-kube-api-access-5p5r8\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fjrnq\" (UID: \"3f71d2c1-ac5f-4292-8735-16fb86f3b981\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" Apr 16 23:58:14.487517 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:14.487487 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3f71d2c1-ac5f-4292-8735-16fb86f3b981-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fjrnq\" (UID: \"3f71d2c1-ac5f-4292-8735-16fb86f3b981\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" Apr 16 23:58:14.487734 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:14.487596 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p5r8\" (UniqueName: \"kubernetes.io/projected/3f71d2c1-ac5f-4292-8735-16fb86f3b981-kube-api-access-5p5r8\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fjrnq\" (UID: \"3f71d2c1-ac5f-4292-8735-16fb86f3b981\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" Apr 16 23:58:14.487883 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:14.487858 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3f71d2c1-ac5f-4292-8735-16fb86f3b981-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fjrnq\" (UID: \"3f71d2c1-ac5f-4292-8735-16fb86f3b981\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" Apr 16 23:58:14.496001 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:14.495972 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p5r8\" (UniqueName: \"kubernetes.io/projected/3f71d2c1-ac5f-4292-8735-16fb86f3b981-kube-api-access-5p5r8\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fjrnq\" (UID: \"3f71d2c1-ac5f-4292-8735-16fb86f3b981\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" Apr 16 23:58:14.633854 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:14.633790 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" Apr 16 23:58:14.966889 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:14.966861 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq"] Apr 16 23:58:14.967044 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:58:14.967022 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f71d2c1_ac5f_4292_8735_16fb86f3b981.slice/crio-c3397694ae73b6984a809f279e23e644de18f585b92094a41da77362f33f2585 WatchSource:0}: Error finding container c3397694ae73b6984a809f279e23e644de18f585b92094a41da77362f33f2585: Status 404 returned error can't find the container with id c3397694ae73b6984a809f279e23e644de18f585b92094a41da77362f33f2585 Apr 16 23:58:15.459504 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:15.459470 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" event={"ID":"3f71d2c1-ac5f-4292-8735-16fb86f3b981","Type":"ContainerStarted","Data":"c3397694ae73b6984a809f279e23e644de18f585b92094a41da77362f33f2585"} Apr 16 23:58:16.426434 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:16.426405 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-x699g" Apr 16 23:58:19.439294 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:19.439260 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" Apr 16 23:58:20.483719 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:20.483686 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" 
event={"ID":"3f71d2c1-ac5f-4292-8735-16fb86f3b981","Type":"ContainerStarted","Data":"bd43d838bda67cccea65a5369abad70a8b4f8767d1db44f5a0e5d2071a74ea08"} Apr 16 23:58:20.484075 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:20.483790 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" Apr 16 23:58:20.502692 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:20.502650 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" podStartSLOduration=1.638972361 podStartE2EDuration="6.502637795s" podCreationTimestamp="2026-04-16 23:58:14 +0000 UTC" firstStartedPulling="2026-04-16 23:58:14.969389728 +0000 UTC m=+475.744414593" lastFinishedPulling="2026-04-16 23:58:19.833055162 +0000 UTC m=+480.608080027" observedRunningTime="2026-04-16 23:58:20.500227832 +0000 UTC m=+481.275252727" watchObservedRunningTime="2026-04-16 23:58:20.502637795 +0000 UTC m=+481.277662680" Apr 16 23:58:22.572507 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:22.572475 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ndqrc"] Apr 16 23:58:22.576356 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:22.576340 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ndqrc" Apr 16 23:58:22.578913 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:22.578888 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 16 23:58:22.579050 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:22.578936 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-skvbv\"" Apr 16 23:58:22.579050 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:22.578982 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 16 23:58:22.585253 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:22.585232 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ndqrc"] Apr 16 23:58:22.658932 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:22.658894 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr2cl\" (UniqueName: \"kubernetes.io/projected/0674c5b4-04dc-4b5c-9591-ea0023d88508-kube-api-access-jr2cl\") pod \"kuadrant-console-plugin-6cb54b5c86-ndqrc\" (UID: \"0674c5b4-04dc-4b5c-9591-ea0023d88508\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ndqrc" Apr 16 23:58:22.659084 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:22.658957 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0674c5b4-04dc-4b5c-9591-ea0023d88508-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-ndqrc\" (UID: \"0674c5b4-04dc-4b5c-9591-ea0023d88508\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ndqrc" Apr 16 23:58:22.659084 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:22.658983 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0674c5b4-04dc-4b5c-9591-ea0023d88508-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-ndqrc\" (UID: \"0674c5b4-04dc-4b5c-9591-ea0023d88508\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ndqrc" Apr 16 23:58:22.759761 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:22.759734 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0674c5b4-04dc-4b5c-9591-ea0023d88508-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-ndqrc\" (UID: \"0674c5b4-04dc-4b5c-9591-ea0023d88508\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ndqrc" Apr 16 23:58:22.759917 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:22.759774 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0674c5b4-04dc-4b5c-9591-ea0023d88508-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-ndqrc\" (UID: \"0674c5b4-04dc-4b5c-9591-ea0023d88508\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ndqrc" Apr 16 23:58:22.759917 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:22.759805 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jr2cl\" (UniqueName: \"kubernetes.io/projected/0674c5b4-04dc-4b5c-9591-ea0023d88508-kube-api-access-jr2cl\") pod \"kuadrant-console-plugin-6cb54b5c86-ndqrc\" (UID: \"0674c5b4-04dc-4b5c-9591-ea0023d88508\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ndqrc" Apr 16 23:58:22.760375 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:22.760351 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0674c5b4-04dc-4b5c-9591-ea0023d88508-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-ndqrc\" (UID: 
\"0674c5b4-04dc-4b5c-9591-ea0023d88508\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ndqrc" Apr 16 23:58:22.762038 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:22.762014 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0674c5b4-04dc-4b5c-9591-ea0023d88508-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-ndqrc\" (UID: \"0674c5b4-04dc-4b5c-9591-ea0023d88508\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ndqrc" Apr 16 23:58:22.766792 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:22.766773 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr2cl\" (UniqueName: \"kubernetes.io/projected/0674c5b4-04dc-4b5c-9591-ea0023d88508-kube-api-access-jr2cl\") pod \"kuadrant-console-plugin-6cb54b5c86-ndqrc\" (UID: \"0674c5b4-04dc-4b5c-9591-ea0023d88508\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ndqrc" Apr 16 23:58:22.886772 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:22.886712 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ndqrc" Apr 16 23:58:23.003052 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.003022 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ndqrc"] Apr 16 23:58:23.004626 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:58:23.004598 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0674c5b4_04dc_4b5c_9591_ea0023d88508.slice/crio-ed7f4da78ee57d18a6d8b5996733f69775754c7c88d19b657512d41a2cc33c67 WatchSource:0}: Error finding container ed7f4da78ee57d18a6d8b5996733f69775754c7c88d19b657512d41a2cc33c67: Status 404 returned error can't find the container with id ed7f4da78ee57d18a6d8b5996733f69775754c7c88d19b657512d41a2cc33c67 Apr 16 23:58:23.368841 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.368811 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-679b9d47f4-9744s"] Apr 16 23:58:23.373643 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.373622 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-679b9d47f4-9744s" Apr 16 23:58:23.383706 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.383676 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-679b9d47f4-9744s"] Apr 16 23:58:23.464044 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.464012 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cf0b0aac-d126-45b6-b3a3-d222909d94ec-console-oauth-config\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s" Apr 16 23:58:23.464172 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.464047 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf0b0aac-d126-45b6-b3a3-d222909d94ec-trusted-ca-bundle\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s" Apr 16 23:58:23.464172 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.464074 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2xmd\" (UniqueName: \"kubernetes.io/projected/cf0b0aac-d126-45b6-b3a3-d222909d94ec-kube-api-access-c2xmd\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s" Apr 16 23:58:23.464172 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.464124 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cf0b0aac-d126-45b6-b3a3-d222909d94ec-console-config\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s" 
Apr 16 23:58:23.464286 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.464179 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf0b0aac-d126-45b6-b3a3-d222909d94ec-console-serving-cert\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.464286 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.464200 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cf0b0aac-d126-45b6-b3a3-d222909d94ec-oauth-serving-cert\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.464286 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.464234 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cf0b0aac-d126-45b6-b3a3-d222909d94ec-service-ca\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.498021 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.497994 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ndqrc" event={"ID":"0674c5b4-04dc-4b5c-9591-ea0023d88508","Type":"ContainerStarted","Data":"ed7f4da78ee57d18a6d8b5996733f69775754c7c88d19b657512d41a2cc33c67"}
Apr 16 23:58:23.564825 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.564799 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cf0b0aac-d126-45b6-b3a3-d222909d94ec-console-oauth-config\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.564954 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.564830 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf0b0aac-d126-45b6-b3a3-d222909d94ec-trusted-ca-bundle\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.564954 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.564847 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2xmd\" (UniqueName: \"kubernetes.io/projected/cf0b0aac-d126-45b6-b3a3-d222909d94ec-kube-api-access-c2xmd\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.564954 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.564873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cf0b0aac-d126-45b6-b3a3-d222909d94ec-console-config\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.564954 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.564919 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf0b0aac-d126-45b6-b3a3-d222909d94ec-console-serving-cert\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.564954 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.564946 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cf0b0aac-d126-45b6-b3a3-d222909d94ec-oauth-serving-cert\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.565230 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.564975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cf0b0aac-d126-45b6-b3a3-d222909d94ec-service-ca\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.565725 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.565689 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cf0b0aac-d126-45b6-b3a3-d222909d94ec-service-ca\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.565882 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.565851 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cf0b0aac-d126-45b6-b3a3-d222909d94ec-console-config\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.565991 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.565925 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cf0b0aac-d126-45b6-b3a3-d222909d94ec-oauth-serving-cert\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.566245 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.566213 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf0b0aac-d126-45b6-b3a3-d222909d94ec-trusted-ca-bundle\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.567556 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.567518 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cf0b0aac-d126-45b6-b3a3-d222909d94ec-console-oauth-config\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.567720 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.567703 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf0b0aac-d126-45b6-b3a3-d222909d94ec-console-serving-cert\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.573646 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.573625 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2xmd\" (UniqueName: \"kubernetes.io/projected/cf0b0aac-d126-45b6-b3a3-d222909d94ec-kube-api-access-c2xmd\") pod \"console-679b9d47f4-9744s\" (UID: \"cf0b0aac-d126-45b6-b3a3-d222909d94ec\") " pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.683767 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.683689 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:23.832146 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:23.832115 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-679b9d47f4-9744s"]
Apr 16 23:58:23.834029 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:58:23.833995 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf0b0aac_d126_45b6_b3a3_d222909d94ec.slice/crio-be5c3e246a59603df598d198c7168e4c5b1d43ff884aaba03359ae8d4e7f9ad0 WatchSource:0}: Error finding container be5c3e246a59603df598d198c7168e4c5b1d43ff884aaba03359ae8d4e7f9ad0: Status 404 returned error can't find the container with id be5c3e246a59603df598d198c7168e4c5b1d43ff884aaba03359ae8d4e7f9ad0
Apr 16 23:58:24.508091 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:24.508048 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-679b9d47f4-9744s" event={"ID":"cf0b0aac-d126-45b6-b3a3-d222909d94ec","Type":"ContainerStarted","Data":"666737e5f474274e0afc196dd54c2e2b36163320c0bce875f222a1a2e9c9ed57"}
Apr 16 23:58:24.508257 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:24.508103 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-679b9d47f4-9744s" event={"ID":"cf0b0aac-d126-45b6-b3a3-d222909d94ec","Type":"ContainerStarted","Data":"be5c3e246a59603df598d198c7168e4c5b1d43ff884aaba03359ae8d4e7f9ad0"}
Apr 16 23:58:24.534042 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:24.533379 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-679b9d47f4-9744s" podStartSLOduration=1.533360204 podStartE2EDuration="1.533360204s" podCreationTimestamp="2026-04-16 23:58:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:58:24.530316906 +0000 UTC m=+485.305341793" watchObservedRunningTime="2026-04-16 23:58:24.533360204 +0000 UTC m=+485.308385089"
Apr 16 23:58:31.490617 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:31.490587 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq"
Apr 16 23:58:32.452940 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:32.452908 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr"]
Apr 16 23:58:32.469213 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:32.469188 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr"]
Apr 16 23:58:32.469378 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:32.469311 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr"
Apr 16 23:58:32.536335 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:32.536289 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0ee9a1cd-52cf-4d46-b821-2566b71c706f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr\" (UID: \"0ee9a1cd-52cf-4d46-b821-2566b71c706f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr"
Apr 16 23:58:32.536730 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:32.536363 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24x42\" (UniqueName: \"kubernetes.io/projected/0ee9a1cd-52cf-4d46-b821-2566b71c706f-kube-api-access-24x42\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr\" (UID: \"0ee9a1cd-52cf-4d46-b821-2566b71c706f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr"
Apr 16 23:58:32.637050 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:32.637023 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0ee9a1cd-52cf-4d46-b821-2566b71c706f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr\" (UID: \"0ee9a1cd-52cf-4d46-b821-2566b71c706f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr"
Apr 16 23:58:32.637214 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:32.637074 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24x42\" (UniqueName: \"kubernetes.io/projected/0ee9a1cd-52cf-4d46-b821-2566b71c706f-kube-api-access-24x42\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr\" (UID: \"0ee9a1cd-52cf-4d46-b821-2566b71c706f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr"
Apr 16 23:58:32.637504 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:32.637479 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0ee9a1cd-52cf-4d46-b821-2566b71c706f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr\" (UID: \"0ee9a1cd-52cf-4d46-b821-2566b71c706f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr"
Apr 16 23:58:32.646437 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:32.646390 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24x42\" (UniqueName: \"kubernetes.io/projected/0ee9a1cd-52cf-4d46-b821-2566b71c706f-kube-api-access-24x42\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr\" (UID: \"0ee9a1cd-52cf-4d46-b821-2566b71c706f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr"
Apr 16 23:58:32.782357 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:32.782298 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr"
Apr 16 23:58:33.139630 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.139523 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq"]
Apr 16 23:58:33.139896 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.139867 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" podUID="3f71d2c1-ac5f-4292-8735-16fb86f3b981" containerName="manager" containerID="cri-o://bd43d838bda67cccea65a5369abad70a8b4f8767d1db44f5a0e5d2071a74ea08" gracePeriod=2
Apr 16 23:58:33.148905 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.148876 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq"]
Apr 16 23:58:33.156384 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.156315 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8"]
Apr 16 23:58:33.156951 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.156928 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f71d2c1-ac5f-4292-8735-16fb86f3b981" containerName="manager"
Apr 16 23:58:33.156951 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.156951 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f71d2c1-ac5f-4292-8735-16fb86f3b981" containerName="manager"
Apr 16 23:58:33.157116 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.157056 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f71d2c1-ac5f-4292-8735-16fb86f3b981" containerName="manager"
Apr 16 23:58:33.163187 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.163096 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr"]
Apr 16 23:58:33.163187 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.163147 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr"]
Apr 16 23:58:33.163360 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.163278 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8"
Apr 16 23:58:33.165672 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.165639 2578 status_manager.go:895] "Failed to get status for pod" podUID="3f71d2c1-ac5f-4292-8735-16fb86f3b981" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fjrnq\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object"
Apr 16 23:58:33.177662 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.177628 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8"]
Apr 16 23:58:33.181568 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.181363 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh"]
Apr 16 23:58:33.182189 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.182158 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" podUID="c6f23b98-b4c4-4c27-a23b-3e57169ecf49" containerName="manager" containerID="cri-o://83540f16bbdf719d3ba7e3d3e3e55ab80eaeb42efbe852a34f30058bb00baa94" gracePeriod=2
Apr 16 23:58:33.185879 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.185562 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4"]
Apr 16 23:58:33.191117 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.191097 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4"
Apr 16 23:58:33.197839 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.197798 2578 status_manager.go:895] "Failed to get status for pod" podUID="3f71d2c1-ac5f-4292-8735-16fb86f3b981" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fjrnq\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object"
Apr 16 23:58:33.202143 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.200316 2578 status_manager.go:895] "Failed to get status for pod" podUID="c6f23b98-b4c4-4c27-a23b-3e57169ecf49" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" err="pods \"limitador-operator-controller-manager-85c4996f8c-6hvnh\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object"
Apr 16 23:58:33.204740 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.204717 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh"]
Apr 16 23:58:33.211365 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.211341 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4"]
Apr 16 23:58:33.242349 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.242321 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8txq6\" (UniqueName: \"kubernetes.io/projected/4cd76a04-0e97-49e2-a93a-d67694e67aca-kube-api-access-8txq6\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4\" (UID: \"4cd76a04-0e97-49e2-a93a-d67694e67aca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4"
Apr 16 23:58:33.242500 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.242391 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7bf94dd8-ec37-4767-b906-544d082de60f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7ffv8\" (UID: \"7bf94dd8-ec37-4767-b906-544d082de60f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8"
Apr 16 23:58:33.242500 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.242452 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4cd76a04-0e97-49e2-a93a-d67694e67aca-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4\" (UID: \"4cd76a04-0e97-49e2-a93a-d67694e67aca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4"
Apr 16 23:58:33.242500 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.242483 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb8cg\" (UniqueName: \"kubernetes.io/projected/7bf94dd8-ec37-4767-b906-544d082de60f-kube-api-access-xb8cg\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7ffv8\" (UID: \"7bf94dd8-ec37-4767-b906-544d082de60f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8"
Apr 16 23:58:33.343777 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.343745 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7bf94dd8-ec37-4767-b906-544d082de60f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7ffv8\" (UID: \"7bf94dd8-ec37-4767-b906-544d082de60f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8"
Apr 16 23:58:33.343958 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.343801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4cd76a04-0e97-49e2-a93a-d67694e67aca-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4\" (UID: \"4cd76a04-0e97-49e2-a93a-d67694e67aca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4"
Apr 16 23:58:33.343958 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.343843 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xb8cg\" (UniqueName: \"kubernetes.io/projected/7bf94dd8-ec37-4767-b906-544d082de60f-kube-api-access-xb8cg\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7ffv8\" (UID: \"7bf94dd8-ec37-4767-b906-544d082de60f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8"
Apr 16 23:58:33.343958 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.343933 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8txq6\" (UniqueName: \"kubernetes.io/projected/4cd76a04-0e97-49e2-a93a-d67694e67aca-kube-api-access-8txq6\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4\" (UID: \"4cd76a04-0e97-49e2-a93a-d67694e67aca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4"
Apr 16 23:58:33.344163 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.344139 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7bf94dd8-ec37-4767-b906-544d082de60f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7ffv8\" (UID: \"7bf94dd8-ec37-4767-b906-544d082de60f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8"
Apr 16 23:58:33.344224 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.344197 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4cd76a04-0e97-49e2-a93a-d67694e67aca-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4\" (UID: \"4cd76a04-0e97-49e2-a93a-d67694e67aca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4"
Apr 16 23:58:33.354416 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.354354 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb8cg\" (UniqueName: \"kubernetes.io/projected/7bf94dd8-ec37-4767-b906-544d082de60f-kube-api-access-xb8cg\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7ffv8\" (UID: \"7bf94dd8-ec37-4767-b906-544d082de60f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8"
Apr 16 23:58:33.354416 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.354354 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8txq6\" (UniqueName: \"kubernetes.io/projected/4cd76a04-0e97-49e2-a93a-d67694e67aca-kube-api-access-8txq6\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4\" (UID: \"4cd76a04-0e97-49e2-a93a-d67694e67aca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4"
Apr 16 23:58:33.558319 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.558284 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8"
Apr 16 23:58:33.566070 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.566048 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4"
Apr 16 23:58:33.684721 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.684682 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:33.684721 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.684719 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:33.689827 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.689801 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:33.691980 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.691953 2578 status_manager.go:895] "Failed to get status for pod" podUID="3f71d2c1-ac5f-4292-8735-16fb86f3b981" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fjrnq\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object"
Apr 16 23:58:33.716493 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:33.716454 2578 status_manager.go:895] "Failed to get status for pod" podUID="c6f23b98-b4c4-4c27-a23b-3e57169ecf49" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" err="pods \"limitador-operator-controller-manager-85c4996f8c-6hvnh\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object"
Apr 16 23:58:34.561508 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:34.561476 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-679b9d47f4-9744s"
Apr 16 23:58:34.620235 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:34.620196 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5bcbddfb6c-6ffgl"]
Apr 16 23:58:44.522852 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.521874 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq"
Apr 16 23:58:44.524470 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.524415 2578 status_manager.go:895] "Failed to get status for pod" podUID="3f71d2c1-ac5f-4292-8735-16fb86f3b981" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fjrnq\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object"
Apr 16 23:58:44.542200 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.542139 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4"]
Apr 16 23:58:44.542910 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:58:44.542867 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cd76a04_0e97_49e2_a93a_d67694e67aca.slice/crio-be37fa759e13045c88c79724155eb0b05b1c8ca5fb92b667bc607b1e281b0b44 WatchSource:0}: Error finding container be37fa759e13045c88c79724155eb0b05b1c8ca5fb92b667bc607b1e281b0b44: Status 404 returned error can't find the container with id be37fa759e13045c88c79724155eb0b05b1c8ca5fb92b667bc607b1e281b0b44
Apr 16 23:58:44.547156 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.547127 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh"
Apr 16 23:58:44.549199 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.549169 2578 status_manager.go:895] "Failed to get status for pod" podUID="3f71d2c1-ac5f-4292-8735-16fb86f3b981" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fjrnq\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object"
Apr 16 23:58:44.550945 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.550918 2578 status_manager.go:895] "Failed to get status for pod" podUID="c6f23b98-b4c4-4c27-a23b-3e57169ecf49" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" err="pods \"limitador-operator-controller-manager-85c4996f8c-6hvnh\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object"
Apr 16 23:58:44.567456 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.567432 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8"]
Apr 16 23:58:44.569472 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:58:44.569435 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bf94dd8_ec37_4767_b906_544d082de60f.slice/crio-1abb88e2fbdcfb405e25f141c21d13b91ad9c707303ae22fc00ce3f157635598 WatchSource:0}: Error finding container 1abb88e2fbdcfb405e25f141c21d13b91ad9c707303ae22fc00ce3f157635598: Status 404 returned error can't find the container with id 1abb88e2fbdcfb405e25f141c21d13b91ad9c707303ae22fc00ce3f157635598
Apr 16 23:58:44.604848 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.604825 2578 generic.go:358] "Generic (PLEG): container finished" podID="c6f23b98-b4c4-4c27-a23b-3e57169ecf49" containerID="83540f16bbdf719d3ba7e3d3e3e55ab80eaeb42efbe852a34f30058bb00baa94" exitCode=0
Apr 16 23:58:44.604930 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.604879 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh"
Apr 16 23:58:44.604930 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.604896 2578 scope.go:117] "RemoveContainer" containerID="83540f16bbdf719d3ba7e3d3e3e55ab80eaeb42efbe852a34f30058bb00baa94"
Apr 16 23:58:44.606318 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.606290 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8" event={"ID":"7bf94dd8-ec37-4767-b906-544d082de60f","Type":"ContainerStarted","Data":"1abb88e2fbdcfb405e25f141c21d13b91ad9c707303ae22fc00ce3f157635598"}
Apr 16 23:58:44.607069 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.607042 2578 status_manager.go:895] "Failed to get status for pod" podUID="3f71d2c1-ac5f-4292-8735-16fb86f3b981" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fjrnq\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object"
Apr 16 23:58:44.607896 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.607866 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ndqrc" event={"ID":"0674c5b4-04dc-4b5c-9591-ea0023d88508","Type":"ContainerStarted","Data":"de49cab9d8fed1dea6bedaa2e90e4f530ab995cfb0e39df9c06cc9d7d7ed44ae"}
Apr 16 23:58:44.609044 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.609018 2578 status_manager.go:895] "Failed to get status for pod" podUID="c6f23b98-b4c4-4c27-a23b-3e57169ecf49" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" err="pods \"limitador-operator-controller-manager-85c4996f8c-6hvnh\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object"
Apr 16 23:58:44.609261 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.609142 2578 generic.go:358] "Generic (PLEG): container finished" podID="3f71d2c1-ac5f-4292-8735-16fb86f3b981" containerID="bd43d838bda67cccea65a5369abad70a8b4f8767d1db44f5a0e5d2071a74ea08" exitCode=0
Apr 16 23:58:44.609473 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.609439 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq"
Apr 16 23:58:44.610867 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.610844 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4" event={"ID":"4cd76a04-0e97-49e2-a93a-d67694e67aca","Type":"ContainerStarted","Data":"be37fa759e13045c88c79724155eb0b05b1c8ca5fb92b667bc607b1e281b0b44"}
Apr 16 23:58:44.610952 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.610899 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4"
Apr 16 23:58:44.610952 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.610894 2578 status_manager.go:895] "Failed to get status for pod" podUID="c6f23b98-b4c4-4c27-a23b-3e57169ecf49" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" err="pods \"limitador-operator-controller-manager-85c4996f8c-6hvnh\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object"
Apr 16 23:58:44.613396 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.613378 2578 scope.go:117] "RemoveContainer" containerID="83540f16bbdf719d3ba7e3d3e3e55ab80eaeb42efbe852a34f30058bb00baa94"
Apr 16 23:58:44.613660 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:58:44.613631 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83540f16bbdf719d3ba7e3d3e3e55ab80eaeb42efbe852a34f30058bb00baa94\": container with ID starting with 83540f16bbdf719d3ba7e3d3e3e55ab80eaeb42efbe852a34f30058bb00baa94 not found: ID does not exist" containerID="83540f16bbdf719d3ba7e3d3e3e55ab80eaeb42efbe852a34f30058bb00baa94"
Apr 16 23:58:44.613746 ip-10-0-134-103
kubenswrapper[2578]: I0416 23:58:44.613658 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83540f16bbdf719d3ba7e3d3e3e55ab80eaeb42efbe852a34f30058bb00baa94"} err="failed to get container status \"83540f16bbdf719d3ba7e3d3e3e55ab80eaeb42efbe852a34f30058bb00baa94\": rpc error: code = NotFound desc = could not find container \"83540f16bbdf719d3ba7e3d3e3e55ab80eaeb42efbe852a34f30058bb00baa94\": container with ID starting with 83540f16bbdf719d3ba7e3d3e3e55ab80eaeb42efbe852a34f30058bb00baa94 not found: ID does not exist" Apr 16 23:58:44.613746 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.613675 2578 scope.go:117] "RemoveContainer" containerID="bd43d838bda67cccea65a5369abad70a8b4f8767d1db44f5a0e5d2071a74ea08" Apr 16 23:58:44.622469 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.622451 2578 scope.go:117] "RemoveContainer" containerID="bd43d838bda67cccea65a5369abad70a8b4f8767d1db44f5a0e5d2071a74ea08" Apr 16 23:58:44.622747 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:58:44.622729 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd43d838bda67cccea65a5369abad70a8b4f8767d1db44f5a0e5d2071a74ea08\": container with ID starting with bd43d838bda67cccea65a5369abad70a8b4f8767d1db44f5a0e5d2071a74ea08 not found: ID does not exist" containerID="bd43d838bda67cccea65a5369abad70a8b4f8767d1db44f5a0e5d2071a74ea08" Apr 16 23:58:44.622798 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.622752 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd43d838bda67cccea65a5369abad70a8b4f8767d1db44f5a0e5d2071a74ea08"} err="failed to get container status \"bd43d838bda67cccea65a5369abad70a8b4f8767d1db44f5a0e5d2071a74ea08\": rpc error: code = NotFound desc = could not find container \"bd43d838bda67cccea65a5369abad70a8b4f8767d1db44f5a0e5d2071a74ea08\": container with ID starting with 
bd43d838bda67cccea65a5369abad70a8b4f8767d1db44f5a0e5d2071a74ea08 not found: ID does not exist" Apr 16 23:58:44.626372 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.626342 2578 status_manager.go:895] "Failed to get status for pod" podUID="3f71d2c1-ac5f-4292-8735-16fb86f3b981" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fjrnq\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object" Apr 16 23:58:44.627436 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.627399 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ndqrc" podStartSLOduration=1.163153429 podStartE2EDuration="22.627389033s" podCreationTimestamp="2026-04-16 23:58:22 +0000 UTC" firstStartedPulling="2026-04-16 23:58:23.006359551 +0000 UTC m=+483.781384415" lastFinishedPulling="2026-04-16 23:58:44.470595143 +0000 UTC m=+505.245620019" observedRunningTime="2026-04-16 23:58:44.624242893 +0000 UTC m=+505.399267780" watchObservedRunningTime="2026-04-16 23:58:44.627389033 +0000 UTC m=+505.402413919" Apr 16 23:58:44.628317 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.628296 2578 status_manager.go:895] "Failed to get status for pod" podUID="3f71d2c1-ac5f-4292-8735-16fb86f3b981" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fjrnq\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object" Apr 16 23:58:44.630144 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.630120 2578 status_manager.go:895] "Failed 
to get status for pod" podUID="c6f23b98-b4c4-4c27-a23b-3e57169ecf49" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" err="pods \"limitador-operator-controller-manager-85c4996f8c-6hvnh\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object" Apr 16 23:58:44.647478 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.647436 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4" podStartSLOduration=11.647424551 podStartE2EDuration="11.647424551s" podCreationTimestamp="2026-04-16 23:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:58:44.646188312 +0000 UTC m=+505.421213198" watchObservedRunningTime="2026-04-16 23:58:44.647424551 +0000 UTC m=+505.422449436" Apr 16 23:58:44.649792 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.648843 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p5r8\" (UniqueName: \"kubernetes.io/projected/3f71d2c1-ac5f-4292-8735-16fb86f3b981-kube-api-access-5p5r8\") pod \"3f71d2c1-ac5f-4292-8735-16fb86f3b981\" (UID: \"3f71d2c1-ac5f-4292-8735-16fb86f3b981\") " Apr 16 23:58:44.649792 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.648883 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4tpj\" (UniqueName: \"kubernetes.io/projected/c6f23b98-b4c4-4c27-a23b-3e57169ecf49-kube-api-access-b4tpj\") pod \"c6f23b98-b4c4-4c27-a23b-3e57169ecf49\" (UID: \"c6f23b98-b4c4-4c27-a23b-3e57169ecf49\") " Apr 16 23:58:44.649792 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.648942 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3f71d2c1-ac5f-4292-8735-16fb86f3b981-extensions-socket-volume\") pod \"3f71d2c1-ac5f-4292-8735-16fb86f3b981\" (UID: \"3f71d2c1-ac5f-4292-8735-16fb86f3b981\") " Apr 16 23:58:44.650661 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.650629 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f71d2c1-ac5f-4292-8735-16fb86f3b981-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "3f71d2c1-ac5f-4292-8735-16fb86f3b981" (UID: "3f71d2c1-ac5f-4292-8735-16fb86f3b981"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:58:44.652224 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.652197 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f71d2c1-ac5f-4292-8735-16fb86f3b981-kube-api-access-5p5r8" (OuterVolumeSpecName: "kube-api-access-5p5r8") pod "3f71d2c1-ac5f-4292-8735-16fb86f3b981" (UID: "3f71d2c1-ac5f-4292-8735-16fb86f3b981"). InnerVolumeSpecName "kube-api-access-5p5r8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:58:44.652895 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.652871 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f23b98-b4c4-4c27-a23b-3e57169ecf49-kube-api-access-b4tpj" (OuterVolumeSpecName: "kube-api-access-b4tpj") pod "c6f23b98-b4c4-4c27-a23b-3e57169ecf49" (UID: "c6f23b98-b4c4-4c27-a23b-3e57169ecf49"). InnerVolumeSpecName "kube-api-access-b4tpj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:58:44.750602 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.750571 2578 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3f71d2c1-ac5f-4292-8735-16fb86f3b981-extensions-socket-volume\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:58:44.750602 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.750600 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5p5r8\" (UniqueName: \"kubernetes.io/projected/3f71d2c1-ac5f-4292-8735-16fb86f3b981-kube-api-access-5p5r8\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:58:44.750763 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.750611 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b4tpj\" (UniqueName: \"kubernetes.io/projected/c6f23b98-b4c4-4c27-a23b-3e57169ecf49-kube-api-access-b4tpj\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:58:44.918838 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.918797 2578 status_manager.go:895] "Failed to get status for pod" podUID="c6f23b98-b4c4-4c27-a23b-3e57169ecf49" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" err="pods \"limitador-operator-controller-manager-85c4996f8c-6hvnh\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object" Apr 16 23:58:44.920637 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.920606 2578 status_manager.go:895] "Failed to get status for pod" podUID="3f71d2c1-ac5f-4292-8735-16fb86f3b981" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fjrnq\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" 
cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object" Apr 16 23:58:44.925059 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.925031 2578 status_manager.go:895] "Failed to get status for pod" podUID="3f71d2c1-ac5f-4292-8735-16fb86f3b981" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fjrnq\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object" Apr 16 23:58:44.926842 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:44.926817 2578 status_manager.go:895] "Failed to get status for pod" podUID="c6f23b98-b4c4-4c27-a23b-3e57169ecf49" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" err="pods \"limitador-operator-controller-manager-85c4996f8c-6hvnh\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object" Apr 16 23:58:45.620326 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:45.620235 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8" event={"ID":"7bf94dd8-ec37-4767-b906-544d082de60f","Type":"ContainerStarted","Data":"ac27a8fb65c069a4923db091ec793be2ca365519e78b1a34de45ce513430f77f"} Apr 16 23:58:45.620326 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:45.620289 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8" Apr 16 23:58:45.622387 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:45.622362 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4" event={"ID":"4cd76a04-0e97-49e2-a93a-d67694e67aca","Type":"ContainerStarted","Data":"3975391d4a81a2114beeb0e68ec5105d959416210de2b9d8feadb9733cb30f3b"} Apr 16 23:58:45.622600 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:45.622580 2578 status_manager.go:895] "Failed to get status for pod" podUID="c6f23b98-b4c4-4c27-a23b-3e57169ecf49" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-6hvnh" err="pods \"limitador-operator-controller-manager-85c4996f8c-6hvnh\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object" Apr 16 23:58:45.626707 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:45.626659 2578 status_manager.go:895] "Failed to get status for pod" podUID="3f71d2c1-ac5f-4292-8735-16fb86f3b981" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fjrnq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fjrnq\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object" Apr 16 23:58:45.645817 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:45.645782 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8" podStartSLOduration=12.645768488 podStartE2EDuration="12.645768488s" podCreationTimestamp="2026-04-16 23:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:58:45.644080389 +0000 UTC m=+506.419105275" watchObservedRunningTime="2026-04-16 23:58:45.645768488 
+0000 UTC m=+506.420793377" Apr 16 23:58:45.818870 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:45.818841 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f71d2c1-ac5f-4292-8735-16fb86f3b981" path="/var/lib/kubelet/pods/3f71d2c1-ac5f-4292-8735-16fb86f3b981/volumes" Apr 16 23:58:45.819160 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:45.819146 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6f23b98-b4c4-4c27-a23b-3e57169ecf49" path="/var/lib/kubelet/pods/c6f23b98-b4c4-4c27-a23b-3e57169ecf49/volumes" Apr 16 23:58:46.906314 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:58:46.906273 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ee9a1cd_52cf_4d46_b821_2566b71c706f.slice/crio-f2c082c917bbac22b48b7b3ae79d2522b01cb8a7e02a852e173489dea398257e WatchSource:0}: Error finding container f2c082c917bbac22b48b7b3ae79d2522b01cb8a7e02a852e173489dea398257e: Status 404 returned error can't find the container with id f2c082c917bbac22b48b7b3ae79d2522b01cb8a7e02a852e173489dea398257e Apr 16 23:58:47.632938 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:47.632903 2578 generic.go:358] "Generic (PLEG): container finished" podID="0ee9a1cd-52cf-4d46-b821-2566b71c706f" containerID="251e04fb536d0953c44b616a2b9ceff044c04a993d1eb91e7153d125b32fa397" exitCode=1 Apr 16 23:58:47.635306 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:47.635273 2578 status_manager.go:895] "Failed to get status for pod" podUID="0ee9a1cd-52cf-4d46-b821-2566b71c706f" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object" Apr 16 23:58:47.666554 
ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:47.666516 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr" Apr 16 23:58:47.668630 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:47.668605 2578 status_manager.go:895] "Failed to get status for pod" podUID="0ee9a1cd-52cf-4d46-b821-2566b71c706f" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object" Apr 16 23:58:47.774375 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:47.774353 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24x42\" (UniqueName: \"kubernetes.io/projected/0ee9a1cd-52cf-4d46-b821-2566b71c706f-kube-api-access-24x42\") pod \"0ee9a1cd-52cf-4d46-b821-2566b71c706f\" (UID: \"0ee9a1cd-52cf-4d46-b821-2566b71c706f\") " Apr 16 23:58:47.774487 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:47.774386 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0ee9a1cd-52cf-4d46-b821-2566b71c706f-extensions-socket-volume\") pod \"0ee9a1cd-52cf-4d46-b821-2566b71c706f\" (UID: \"0ee9a1cd-52cf-4d46-b821-2566b71c706f\") " Apr 16 23:58:47.774718 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:47.774682 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ee9a1cd-52cf-4d46-b821-2566b71c706f-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "0ee9a1cd-52cf-4d46-b821-2566b71c706f" (UID: "0ee9a1cd-52cf-4d46-b821-2566b71c706f"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:58:47.776278 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:47.776259 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee9a1cd-52cf-4d46-b821-2566b71c706f-kube-api-access-24x42" (OuterVolumeSpecName: "kube-api-access-24x42") pod "0ee9a1cd-52cf-4d46-b821-2566b71c706f" (UID: "0ee9a1cd-52cf-4d46-b821-2566b71c706f"). InnerVolumeSpecName "kube-api-access-24x42". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:58:47.819002 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:47.818972 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ee9a1cd-52cf-4d46-b821-2566b71c706f" path="/var/lib/kubelet/pods/0ee9a1cd-52cf-4d46-b821-2566b71c706f/volumes" Apr 16 23:58:47.875835 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:47.875805 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-24x42\" (UniqueName: \"kubernetes.io/projected/0ee9a1cd-52cf-4d46-b821-2566b71c706f-kube-api-access-24x42\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:58:47.875835 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:47.875826 2578 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0ee9a1cd-52cf-4d46-b821-2566b71c706f-extensions-socket-volume\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 16 23:58:48.638760 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:48.638730 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr" Apr 16 23:58:48.639205 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:48.638728 2578 scope.go:117] "RemoveContainer" containerID="251e04fb536d0953c44b616a2b9ceff044c04a993d1eb91e7153d125b32fa397" Apr 16 23:58:48.640996 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:48.640967 2578 status_manager.go:895] "Failed to get status for pod" podUID="0ee9a1cd-52cf-4d46-b821-2566b71c706f" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object" Apr 16 23:58:48.642957 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:48.642929 2578 status_manager.go:895] "Failed to get status for pod" podUID="0ee9a1cd-52cf-4d46-b821-2566b71c706f" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object" Apr 16 23:58:49.822988 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:49.822931 2578 status_manager.go:895] "Failed to get status for pod" podUID="0ee9a1cd-52cf-4d46-b821-2566b71c706f" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-b5vrr\" is forbidden: User \"system:node:ip-10-0-134-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-103.ec2.internal' and this object" Apr 16 
23:58:56.628789 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:56.628755 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8" Apr 16 23:58:56.629221 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:56.628805 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4" Apr 16 23:58:56.693085 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:56.693054 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8"] Apr 16 23:58:56.693270 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:56.693234 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8" podUID="7bf94dd8-ec37-4767-b906-544d082de60f" containerName="manager" containerID="cri-o://ac27a8fb65c069a4923db091ec793be2ca365519e78b1a34de45ce513430f77f" gracePeriod=10 Apr 16 23:58:56.936522 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:56.936497 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8" Apr 16 23:58:56.957261 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:56.957236 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h"] Apr 16 23:58:56.957649 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:56.957634 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bf94dd8-ec37-4767-b906-544d082de60f" containerName="manager" Apr 16 23:58:56.957698 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:56.957653 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bf94dd8-ec37-4767-b906-544d082de60f" containerName="manager" Apr 16 23:58:56.957698 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:56.957669 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6f23b98-b4c4-4c27-a23b-3e57169ecf49" containerName="manager" Apr 16 23:58:56.957698 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:56.957675 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f23b98-b4c4-4c27-a23b-3e57169ecf49" containerName="manager" Apr 16 23:58:56.957698 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:56.957685 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ee9a1cd-52cf-4d46-b821-2566b71c706f" containerName="manager" Apr 16 23:58:56.957698 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:56.957692 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee9a1cd-52cf-4d46-b821-2566b71c706f" containerName="manager" Apr 16 23:58:56.957855 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:56.957753 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6f23b98-b4c4-4c27-a23b-3e57169ecf49" containerName="manager" Apr 16 23:58:56.957855 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:56.957762 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="7bf94dd8-ec37-4767-b906-544d082de60f" containerName="manager" Apr 16 23:58:56.957855 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:56.957768 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ee9a1cd-52cf-4d46-b821-2566b71c706f" containerName="manager" Apr 16 23:58:56.960913 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:56.960892 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h" Apr 16 23:58:56.975559 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:56.975521 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h"] Apr 16 23:58:57.050607 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.050579 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7bf94dd8-ec37-4767-b906-544d082de60f-extensions-socket-volume\") pod \"7bf94dd8-ec37-4767-b906-544d082de60f\" (UID: \"7bf94dd8-ec37-4767-b906-544d082de60f\") " Apr 16 23:58:57.050745 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.050684 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb8cg\" (UniqueName: \"kubernetes.io/projected/7bf94dd8-ec37-4767-b906-544d082de60f-kube-api-access-xb8cg\") pod \"7bf94dd8-ec37-4767-b906-544d082de60f\" (UID: \"7bf94dd8-ec37-4767-b906-544d082de60f\") " Apr 16 23:58:57.050808 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.050793 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxcb6\" (UniqueName: \"kubernetes.io/projected/66e05964-8792-440a-bf89-19d57677b3e9-kube-api-access-wxcb6\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wxl4h\" (UID: \"66e05964-8792-440a-bf89-19d57677b3e9\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h"
Apr 16 23:58:57.050856 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.050842 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/66e05964-8792-440a-bf89-19d57677b3e9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wxl4h\" (UID: \"66e05964-8792-440a-bf89-19d57677b3e9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h"
Apr 16 23:58:57.050975 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.050954 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bf94dd8-ec37-4767-b906-544d082de60f-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "7bf94dd8-ec37-4767-b906-544d082de60f" (UID: "7bf94dd8-ec37-4767-b906-544d082de60f"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 23:58:57.052682 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.052653 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bf94dd8-ec37-4767-b906-544d082de60f-kube-api-access-xb8cg" (OuterVolumeSpecName: "kube-api-access-xb8cg") pod "7bf94dd8-ec37-4767-b906-544d082de60f" (UID: "7bf94dd8-ec37-4767-b906-544d082de60f"). InnerVolumeSpecName "kube-api-access-xb8cg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:58:57.151575 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.151492 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxcb6\" (UniqueName: \"kubernetes.io/projected/66e05964-8792-440a-bf89-19d57677b3e9-kube-api-access-wxcb6\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wxl4h\" (UID: \"66e05964-8792-440a-bf89-19d57677b3e9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h"
Apr 16 23:58:57.151575 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.151572 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/66e05964-8792-440a-bf89-19d57677b3e9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wxl4h\" (UID: \"66e05964-8792-440a-bf89-19d57677b3e9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h"
Apr 16 23:58:57.151715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.151638 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xb8cg\" (UniqueName: \"kubernetes.io/projected/7bf94dd8-ec37-4767-b906-544d082de60f-kube-api-access-xb8cg\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:58:57.151715 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.151648 2578 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7bf94dd8-ec37-4767-b906-544d082de60f-extensions-socket-volume\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:58:57.151930 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.151912 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/66e05964-8792-440a-bf89-19d57677b3e9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wxl4h\" (UID: \"66e05964-8792-440a-bf89-19d57677b3e9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h"
Apr 16 23:58:57.161874 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.161854 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxcb6\" (UniqueName: \"kubernetes.io/projected/66e05964-8792-440a-bf89-19d57677b3e9-kube-api-access-wxcb6\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wxl4h\" (UID: \"66e05964-8792-440a-bf89-19d57677b3e9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h"
Apr 16 23:58:57.272806 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.272778 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h"
Apr 16 23:58:57.398892 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.398865 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h"]
Apr 16 23:58:57.400921 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:58:57.400892 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66e05964_8792_440a_bf89_19d57677b3e9.slice/crio-df6750d47a8576b31878426b341d598eec0224dd077b86f962d1843003f85359 WatchSource:0}: Error finding container df6750d47a8576b31878426b341d598eec0224dd077b86f962d1843003f85359: Status 404 returned error can't find the container with id df6750d47a8576b31878426b341d598eec0224dd077b86f962d1843003f85359
Apr 16 23:58:57.679444 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.679337 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h" event={"ID":"66e05964-8792-440a-bf89-19d57677b3e9","Type":"ContainerStarted","Data":"6a2d3a581f3053e25cbd6317458370faa27bbf6040ccbf80c44faef672742e07"}
Apr 16 23:58:57.679444 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.679403 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h" event={"ID":"66e05964-8792-440a-bf89-19d57677b3e9","Type":"ContainerStarted","Data":"df6750d47a8576b31878426b341d598eec0224dd077b86f962d1843003f85359"}
Apr 16 23:58:57.679444 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.679427 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h"
Apr 16 23:58:57.680561 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.680514 2578 generic.go:358] "Generic (PLEG): container finished" podID="7bf94dd8-ec37-4767-b906-544d082de60f" containerID="ac27a8fb65c069a4923db091ec793be2ca365519e78b1a34de45ce513430f77f" exitCode=0
Apr 16 23:58:57.680666 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.680572 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8"
Apr 16 23:58:57.680666 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.680579 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8" event={"ID":"7bf94dd8-ec37-4767-b906-544d082de60f","Type":"ContainerDied","Data":"ac27a8fb65c069a4923db091ec793be2ca365519e78b1a34de45ce513430f77f"}
Apr 16 23:58:57.680666 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.680608 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8" event={"ID":"7bf94dd8-ec37-4767-b906-544d082de60f","Type":"ContainerDied","Data":"1abb88e2fbdcfb405e25f141c21d13b91ad9c707303ae22fc00ce3f157635598"}
Apr 16 23:58:57.680666 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.680624 2578 scope.go:117] "RemoveContainer" containerID="ac27a8fb65c069a4923db091ec793be2ca365519e78b1a34de45ce513430f77f"
Apr 16 23:58:57.692166 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.690158 2578 scope.go:117] "RemoveContainer" containerID="ac27a8fb65c069a4923db091ec793be2ca365519e78b1a34de45ce513430f77f"
Apr 16 23:58:57.692317 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:58:57.692291 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac27a8fb65c069a4923db091ec793be2ca365519e78b1a34de45ce513430f77f\": container with ID starting with ac27a8fb65c069a4923db091ec793be2ca365519e78b1a34de45ce513430f77f not found: ID does not exist" containerID="ac27a8fb65c069a4923db091ec793be2ca365519e78b1a34de45ce513430f77f"
Apr 16 23:58:57.692384 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.692328 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac27a8fb65c069a4923db091ec793be2ca365519e78b1a34de45ce513430f77f"} err="failed to get container status \"ac27a8fb65c069a4923db091ec793be2ca365519e78b1a34de45ce513430f77f\": rpc error: code = NotFound desc = could not find container \"ac27a8fb65c069a4923db091ec793be2ca365519e78b1a34de45ce513430f77f\": container with ID starting with ac27a8fb65c069a4923db091ec793be2ca365519e78b1a34de45ce513430f77f not found: ID does not exist"
Apr 16 23:58:57.697594 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.697528 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h" podStartSLOduration=1.6975158399999999 podStartE2EDuration="1.69751584s" podCreationTimestamp="2026-04-16 23:58:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:58:57.696201619 +0000 UTC m=+518.471226506" watchObservedRunningTime="2026-04-16 23:58:57.69751584 +0000 UTC m=+518.472540725"
Apr 16 23:58:57.711176 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.711125 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8"]
Apr 16 23:58:57.714941 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.714916 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7ffv8"]
Apr 16 23:58:57.817932 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:57.817904 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bf94dd8-ec37-4767-b906-544d082de60f" path="/var/lib/kubelet/pods/7bf94dd8-ec37-4767-b906-544d082de60f/volumes"
Apr 16 23:58:59.644730 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.644672 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5bcbddfb6c-6ffgl" podUID="2168ed9f-ae27-4544-ae23-ce14d7d6f640" containerName="console" containerID="cri-o://022f43ce3cd6298ad8e626afaf6f8e96952a9cf75dd9869b3c604e3d7a72e10f" gracePeriod=15
Apr 16 23:58:59.892859 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.892833 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5bcbddfb6c-6ffgl_2168ed9f-ae27-4544-ae23-ce14d7d6f640/console/0.log"
Apr 16 23:58:59.892969 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.892891 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:58:59.974516 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.974420 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ln8n\" (UniqueName: \"kubernetes.io/projected/2168ed9f-ae27-4544-ae23-ce14d7d6f640-kube-api-access-6ln8n\") pod \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") "
Apr 16 23:58:59.974516 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.974475 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-service-ca\") pod \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") "
Apr 16 23:58:59.974516 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.974499 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-oauth-serving-cert\") pod \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") "
Apr 16 23:58:59.974788 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.974597 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-oauth-config\") pod \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") "
Apr 16 23:58:59.974788 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.974699 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-trusted-ca-bundle\") pod \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") "
Apr 16 23:58:59.974788 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.974725 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-config\") pod \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") "
Apr 16 23:58:59.974951 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.974896 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-serving-cert\") pod \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\" (UID: \"2168ed9f-ae27-4544-ae23-ce14d7d6f640\") "
Apr 16 23:58:59.975016 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.974999 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2168ed9f-ae27-4544-ae23-ce14d7d6f640" (UID: "2168ed9f-ae27-4544-ae23-ce14d7d6f640"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:58:59.975076 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.975007 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-service-ca" (OuterVolumeSpecName: "service-ca") pod "2168ed9f-ae27-4544-ae23-ce14d7d6f640" (UID: "2168ed9f-ae27-4544-ae23-ce14d7d6f640"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:58:59.975128 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.975079 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-config" (OuterVolumeSpecName: "console-config") pod "2168ed9f-ae27-4544-ae23-ce14d7d6f640" (UID: "2168ed9f-ae27-4544-ae23-ce14d7d6f640"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:58:59.975201 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.975128 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2168ed9f-ae27-4544-ae23-ce14d7d6f640" (UID: "2168ed9f-ae27-4544-ae23-ce14d7d6f640"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:58:59.975355 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.975334 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-trusted-ca-bundle\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:58:59.975421 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.975360 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-config\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:58:59.975421 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.975376 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-service-ca\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:58:59.975421 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.975385 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2168ed9f-ae27-4544-ae23-ce14d7d6f640-oauth-serving-cert\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:58:59.976766 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.976743 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2168ed9f-ae27-4544-ae23-ce14d7d6f640" (UID: "2168ed9f-ae27-4544-ae23-ce14d7d6f640"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:58:59.976878 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.976817 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2168ed9f-ae27-4544-ae23-ce14d7d6f640-kube-api-access-6ln8n" (OuterVolumeSpecName: "kube-api-access-6ln8n") pod "2168ed9f-ae27-4544-ae23-ce14d7d6f640" (UID: "2168ed9f-ae27-4544-ae23-ce14d7d6f640"). InnerVolumeSpecName "kube-api-access-6ln8n". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:58:59.976923 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:58:59.976897 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2168ed9f-ae27-4544-ae23-ce14d7d6f640" (UID: "2168ed9f-ae27-4544-ae23-ce14d7d6f640"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:59:00.076408 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:00.076375 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-oauth-config\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:59:00.076408 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:00.076403 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2168ed9f-ae27-4544-ae23-ce14d7d6f640-console-serving-cert\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:59:00.076408 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:00.076413 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6ln8n\" (UniqueName: \"kubernetes.io/projected/2168ed9f-ae27-4544-ae23-ce14d7d6f640-kube-api-access-6ln8n\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:59:00.693447 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:00.693419 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5bcbddfb6c-6ffgl_2168ed9f-ae27-4544-ae23-ce14d7d6f640/console/0.log"
Apr 16 23:59:00.693848 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:00.693457 2578 generic.go:358] "Generic (PLEG): container finished" podID="2168ed9f-ae27-4544-ae23-ce14d7d6f640" containerID="022f43ce3cd6298ad8e626afaf6f8e96952a9cf75dd9869b3c604e3d7a72e10f" exitCode=2
Apr 16 23:59:00.693848 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:00.693484 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bcbddfb6c-6ffgl" event={"ID":"2168ed9f-ae27-4544-ae23-ce14d7d6f640","Type":"ContainerDied","Data":"022f43ce3cd6298ad8e626afaf6f8e96952a9cf75dd9869b3c604e3d7a72e10f"}
Apr 16 23:59:00.693848 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:00.693520 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bcbddfb6c-6ffgl" event={"ID":"2168ed9f-ae27-4544-ae23-ce14d7d6f640","Type":"ContainerDied","Data":"ff8f2d16e5954a14492c40259c616460d4aea8451983937cc1fd8aef03e1938e"}
Apr 16 23:59:00.693848 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:00.693524 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bcbddfb6c-6ffgl"
Apr 16 23:59:00.693848 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:00.693547 2578 scope.go:117] "RemoveContainer" containerID="022f43ce3cd6298ad8e626afaf6f8e96952a9cf75dd9869b3c604e3d7a72e10f"
Apr 16 23:59:00.702666 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:00.702650 2578 scope.go:117] "RemoveContainer" containerID="022f43ce3cd6298ad8e626afaf6f8e96952a9cf75dd9869b3c604e3d7a72e10f"
Apr 16 23:59:00.702910 ip-10-0-134-103 kubenswrapper[2578]: E0416 23:59:00.702890 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"022f43ce3cd6298ad8e626afaf6f8e96952a9cf75dd9869b3c604e3d7a72e10f\": container with ID starting with 022f43ce3cd6298ad8e626afaf6f8e96952a9cf75dd9869b3c604e3d7a72e10f not found: ID does not exist" containerID="022f43ce3cd6298ad8e626afaf6f8e96952a9cf75dd9869b3c604e3d7a72e10f"
Apr 16 23:59:00.702962 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:00.702918 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"022f43ce3cd6298ad8e626afaf6f8e96952a9cf75dd9869b3c604e3d7a72e10f"} err="failed to get container status \"022f43ce3cd6298ad8e626afaf6f8e96952a9cf75dd9869b3c604e3d7a72e10f\": rpc error: code = NotFound desc = could not find container \"022f43ce3cd6298ad8e626afaf6f8e96952a9cf75dd9869b3c604e3d7a72e10f\": container with ID starting with 022f43ce3cd6298ad8e626afaf6f8e96952a9cf75dd9869b3c604e3d7a72e10f not found: ID does not exist"
Apr 16 23:59:00.715184 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:00.715164 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5bcbddfb6c-6ffgl"]
Apr 16 23:59:00.720345 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:00.720325 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5bcbddfb6c-6ffgl"]
Apr 16 23:59:01.818014 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:01.817983 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2168ed9f-ae27-4544-ae23-ce14d7d6f640" path="/var/lib/kubelet/pods/2168ed9f-ae27-4544-ae23-ce14d7d6f640/volumes"
Apr 16 23:59:08.688074 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:08.688044 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h"
Apr 16 23:59:08.734289 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:08.734260 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4"]
Apr 16 23:59:08.734523 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:08.734497 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4" podUID="4cd76a04-0e97-49e2-a93a-d67694e67aca" containerName="manager" containerID="cri-o://3975391d4a81a2114beeb0e68ec5105d959416210de2b9d8feadb9733cb30f3b" gracePeriod=10
Apr 16 23:59:09.730080 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:09.730009 2578 generic.go:358] "Generic (PLEG): container finished" podID="4cd76a04-0e97-49e2-a93a-d67694e67aca" containerID="3975391d4a81a2114beeb0e68ec5105d959416210de2b9d8feadb9733cb30f3b" exitCode=0
Apr 16 23:59:09.730462 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:09.730082 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4" event={"ID":"4cd76a04-0e97-49e2-a93a-d67694e67aca","Type":"ContainerDied","Data":"3975391d4a81a2114beeb0e68ec5105d959416210de2b9d8feadb9733cb30f3b"}
Apr 16 23:59:10.096288 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:10.096266 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4"
Apr 16 23:59:10.258999 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:10.258971 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4cd76a04-0e97-49e2-a93a-d67694e67aca-extensions-socket-volume\") pod \"4cd76a04-0e97-49e2-a93a-d67694e67aca\" (UID: \"4cd76a04-0e97-49e2-a93a-d67694e67aca\") "
Apr 16 23:59:10.259174 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:10.259040 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8txq6\" (UniqueName: \"kubernetes.io/projected/4cd76a04-0e97-49e2-a93a-d67694e67aca-kube-api-access-8txq6\") pod \"4cd76a04-0e97-49e2-a93a-d67694e67aca\" (UID: \"4cd76a04-0e97-49e2-a93a-d67694e67aca\") "
Apr 16 23:59:10.259423 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:10.259399 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cd76a04-0e97-49e2-a93a-d67694e67aca-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "4cd76a04-0e97-49e2-a93a-d67694e67aca" (UID: "4cd76a04-0e97-49e2-a93a-d67694e67aca"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 23:59:10.261118 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:10.261097 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd76a04-0e97-49e2-a93a-d67694e67aca-kube-api-access-8txq6" (OuterVolumeSpecName: "kube-api-access-8txq6") pod "4cd76a04-0e97-49e2-a93a-d67694e67aca" (UID: "4cd76a04-0e97-49e2-a93a-d67694e67aca"). InnerVolumeSpecName "kube-api-access-8txq6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:59:10.359898 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:10.359871 2578 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4cd76a04-0e97-49e2-a93a-d67694e67aca-extensions-socket-volume\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:59:10.359898 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:10.359894 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8txq6\" (UniqueName: \"kubernetes.io/projected/4cd76a04-0e97-49e2-a93a-d67694e67aca-kube-api-access-8txq6\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\""
Apr 16 23:59:10.735487 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:10.735407 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4"
Apr 16 23:59:10.735919 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:10.735414 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4" event={"ID":"4cd76a04-0e97-49e2-a93a-d67694e67aca","Type":"ContainerDied","Data":"be37fa759e13045c88c79724155eb0b05b1c8ca5fb92b667bc607b1e281b0b44"}
Apr 16 23:59:10.735919 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:10.735520 2578 scope.go:117] "RemoveContainer" containerID="3975391d4a81a2114beeb0e68ec5105d959416210de2b9d8feadb9733cb30f3b"
Apr 16 23:59:10.760129 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:10.760102 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4"]
Apr 16 23:59:10.765457 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:10.765433 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mwkn4"]
Apr 16 23:59:11.818338 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:11.818301 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cd76a04-0e97-49e2-a93a-d67694e67aca" path="/var/lib/kubelet/pods/4cd76a04-0e97-49e2-a93a-d67694e67aca/volumes"
Apr 16 23:59:12.992515 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:12.992481 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"]
Apr 16 23:59:12.993122 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:12.993086 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4cd76a04-0e97-49e2-a93a-d67694e67aca" containerName="manager"
Apr 16 23:59:12.993122 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:12.993119 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd76a04-0e97-49e2-a93a-d67694e67aca" containerName="manager"
Apr 16 23:59:12.993310 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:12.993160 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2168ed9f-ae27-4544-ae23-ce14d7d6f640" containerName="console"
Apr 16 23:59:12.993310 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:12.993168 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2168ed9f-ae27-4544-ae23-ce14d7d6f640" containerName="console"
Apr 16 23:59:12.993310 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:12.993270 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4cd76a04-0e97-49e2-a93a-d67694e67aca" containerName="manager"
Apr 16 23:59:12.993310 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:12.993284 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2168ed9f-ae27-4544-ae23-ce14d7d6f640" containerName="console"
Apr 16 23:59:13.028818 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.028794 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"]
Apr 16 23:59:13.028949 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.028903 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.031385 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.031368 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-ntnrr\""
Apr 16 23:59:13.183964 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.183936 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5c243f98-0358-4d85-b9af-5d7dac8da24b-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.184089 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.183991 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5c243f98-0358-4d85-b9af-5d7dac8da24b-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.184089 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.184017 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t86f\" (UniqueName: \"kubernetes.io/projected/5c243f98-0358-4d85-b9af-5d7dac8da24b-kube-api-access-4t86f\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.184089 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.184035 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5c243f98-0358-4d85-b9af-5d7dac8da24b-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.184089 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.184053 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5c243f98-0358-4d85-b9af-5d7dac8da24b-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.184089 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.184083 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5c243f98-0358-4d85-b9af-5d7dac8da24b-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.184261 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.184156 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5c243f98-0358-4d85-b9af-5d7dac8da24b-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.184261 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.184197 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5c243f98-0358-4d85-b9af-5d7dac8da24b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.184261 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.184220 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5c243f98-0358-4d85-b9af-5d7dac8da24b-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.285085 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.285057 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5c243f98-0358-4d85-b9af-5d7dac8da24b-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.285227 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.285092 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4t86f\" (UniqueName: \"kubernetes.io/projected/5c243f98-0358-4d85-b9af-5d7dac8da24b-kube-api-access-4t86f\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.285227 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.285111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5c243f98-0358-4d85-b9af-5d7dac8da24b-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.285227 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.285137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5c243f98-0358-4d85-b9af-5d7dac8da24b-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.285227 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.285188 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5c243f98-0358-4d85-b9af-5d7dac8da24b-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.285402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.285228 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5c243f98-0358-4d85-b9af-5d7dac8da24b-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.285402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.285253 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5c243f98-0358-4d85-b9af-5d7dac8da24b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.285402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.285282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5c243f98-0358-4d85-b9af-5d7dac8da24b-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.285402 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.285316 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5c243f98-0358-4d85-b9af-5d7dac8da24b-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.285606 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.285588 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5c243f98-0358-4d85-b9af-5d7dac8da24b-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"
Apr 16 23:59:13.285667 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.285634 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5c243f98-0358-4d85-b9af-5d7dac8da24b-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn" Apr 16 23:59:13.285667 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.285655 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5c243f98-0358-4d85-b9af-5d7dac8da24b-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn" Apr 16 23:59:13.285818 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.285796 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5c243f98-0358-4d85-b9af-5d7dac8da24b-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn" Apr 16 23:59:13.285877 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.285851 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5c243f98-0358-4d85-b9af-5d7dac8da24b-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn" Apr 16 23:59:13.287777 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.287754 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5c243f98-0358-4d85-b9af-5d7dac8da24b-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn" Apr 16 23:59:13.287950 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.287933 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5c243f98-0358-4d85-b9af-5d7dac8da24b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn" Apr 16 23:59:13.292764 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.292741 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5c243f98-0358-4d85-b9af-5d7dac8da24b-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn" Apr 16 23:59:13.292994 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.292974 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t86f\" (UniqueName: \"kubernetes.io/projected/5c243f98-0358-4d85-b9af-5d7dac8da24b-kube-api-access-4t86f\") pod \"maas-default-gateway-openshift-default-58b6f876-m5tdn\" (UID: \"5c243f98-0358-4d85-b9af-5d7dac8da24b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn" Apr 16 23:59:13.338991 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.338970 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn" Apr 16 23:59:13.673592 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.673557 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn"] Apr 16 23:59:13.676211 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:59:13.676182 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c243f98_0358_4d85_b9af_5d7dac8da24b.slice/crio-a0a7b08e76a3bedfa32cc6d463d83a5ae32e38d6a242605a68e54d75cf11c3f3 WatchSource:0}: Error finding container a0a7b08e76a3bedfa32cc6d463d83a5ae32e38d6a242605a68e54d75cf11c3f3: Status 404 returned error can't find the container with id a0a7b08e76a3bedfa32cc6d463d83a5ae32e38d6a242605a68e54d75cf11c3f3 Apr 16 23:59:13.678328 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.678292 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 23:59:13.678439 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.678364 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 23:59:13.678439 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.678427 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 23:59:13.751456 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.751422 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn" 
event={"ID":"5c243f98-0358-4d85-b9af-5d7dac8da24b","Type":"ContainerStarted","Data":"c341d16621d71c5fd6c72d24f4595ac9356eb3d2663e13b9649c7eb5b7763a77"} Apr 16 23:59:13.751571 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.751465 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn" event={"ID":"5c243f98-0358-4d85-b9af-5d7dac8da24b","Type":"ContainerStarted","Data":"a0a7b08e76a3bedfa32cc6d463d83a5ae32e38d6a242605a68e54d75cf11c3f3"} Apr 16 23:59:13.772940 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:13.772895 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn" podStartSLOduration=1.772879653 podStartE2EDuration="1.772879653s" podCreationTimestamp="2026-04-16 23:59:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:59:13.770601834 +0000 UTC m=+534.545626720" watchObservedRunningTime="2026-04-16 23:59:13.772879653 +0000 UTC m=+534.547904538" Apr 16 23:59:14.340092 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:14.340062 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn" Apr 16 23:59:14.345357 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:14.345336 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn" Apr 16 23:59:14.756058 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:14.756028 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn" Apr 16 23:59:14.756926 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:14.756906 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-m5tdn" Apr 16 23:59:25.682529 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:25.682496 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 16 23:59:25.726787 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:25.726759 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 16 23:59:25.726787 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:25.726786 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 16 23:59:25.726972 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:25.726874 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-sdlmj" Apr 16 23:59:25.729211 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:25.729184 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 23:59:25.787069 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:25.787037 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6bcd634b-868f-4414-98fe-56cd74dd6898-config-file\") pod \"limitador-limitador-78c99df468-sdlmj\" (UID: \"6bcd634b-868f-4414-98fe-56cd74dd6898\") " pod="kuadrant-system/limitador-limitador-78c99df468-sdlmj" Apr 16 23:59:25.787200 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:25.787094 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mncck\" (UniqueName: \"kubernetes.io/projected/6bcd634b-868f-4414-98fe-56cd74dd6898-kube-api-access-mncck\") pod \"limitador-limitador-78c99df468-sdlmj\" (UID: \"6bcd634b-868f-4414-98fe-56cd74dd6898\") " 
pod="kuadrant-system/limitador-limitador-78c99df468-sdlmj" Apr 16 23:59:25.887828 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:25.887795 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mncck\" (UniqueName: \"kubernetes.io/projected/6bcd634b-868f-4414-98fe-56cd74dd6898-kube-api-access-mncck\") pod \"limitador-limitador-78c99df468-sdlmj\" (UID: \"6bcd634b-868f-4414-98fe-56cd74dd6898\") " pod="kuadrant-system/limitador-limitador-78c99df468-sdlmj" Apr 16 23:59:25.887960 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:25.887903 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6bcd634b-868f-4414-98fe-56cd74dd6898-config-file\") pod \"limitador-limitador-78c99df468-sdlmj\" (UID: \"6bcd634b-868f-4414-98fe-56cd74dd6898\") " pod="kuadrant-system/limitador-limitador-78c99df468-sdlmj" Apr 16 23:59:25.888451 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:25.888428 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6bcd634b-868f-4414-98fe-56cd74dd6898-config-file\") pod \"limitador-limitador-78c99df468-sdlmj\" (UID: \"6bcd634b-868f-4414-98fe-56cd74dd6898\") " pod="kuadrant-system/limitador-limitador-78c99df468-sdlmj" Apr 16 23:59:25.895116 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:25.895099 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mncck\" (UniqueName: \"kubernetes.io/projected/6bcd634b-868f-4414-98fe-56cd74dd6898-kube-api-access-mncck\") pod \"limitador-limitador-78c99df468-sdlmj\" (UID: \"6bcd634b-868f-4414-98fe-56cd74dd6898\") " pod="kuadrant-system/limitador-limitador-78c99df468-sdlmj" Apr 16 23:59:26.036993 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:26.036962 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-sdlmj" Apr 16 23:59:26.165953 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:26.165924 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 16 23:59:26.167637 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:59:26.167609 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bcd634b_868f_4414_98fe_56cd74dd6898.slice/crio-37b1d651e9538b091da64328c91195f65604059d1c39ce8e2c683228531d52ea WatchSource:0}: Error finding container 37b1d651e9538b091da64328c91195f65604059d1c39ce8e2c683228531d52ea: Status 404 returned error can't find the container with id 37b1d651e9538b091da64328c91195f65604059d1c39ce8e2c683228531d52ea Apr 16 23:59:26.805185 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:26.805148 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-sdlmj" event={"ID":"6bcd634b-868f-4414-98fe-56cd74dd6898","Type":"ContainerStarted","Data":"37b1d651e9538b091da64328c91195f65604059d1c39ce8e2c683228531d52ea"} Apr 16 23:59:28.814602 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:28.814501 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-sdlmj" event={"ID":"6bcd634b-868f-4414-98fe-56cd74dd6898","Type":"ContainerStarted","Data":"71930f5cd49034dc9a8dd812180ce219658343d85053d9b7a249db7cadc58cdb"} Apr 16 23:59:28.814602 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:28.814574 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-sdlmj" Apr 16 23:59:28.831423 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:28.831369 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-sdlmj" podStartSLOduration=1.457816855 
podStartE2EDuration="3.831357367s" podCreationTimestamp="2026-04-16 23:59:25 +0000 UTC" firstStartedPulling="2026-04-16 23:59:26.169576731 +0000 UTC m=+546.944601601" lastFinishedPulling="2026-04-16 23:59:28.54311725 +0000 UTC m=+549.318142113" observedRunningTime="2026-04-16 23:59:28.828128835 +0000 UTC m=+549.603153725" watchObservedRunningTime="2026-04-16 23:59:28.831357367 +0000 UTC m=+549.606382255" Apr 16 23:59:39.819634 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:39.819603 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-sdlmj" Apr 16 23:59:56.811806 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:56.811750 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-55d57b6655-mjn42"] Apr 16 23:59:56.817726 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:56.816882 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-55d57b6655-mjn42" Apr 16 23:59:56.822037 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:56.821170 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-q5x47\"" Apr 16 23:59:56.822037 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:56.821876 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 23:59:56.823053 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:56.822997 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-55d57b6655-mjn42"] Apr 16 23:59:56.886277 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:56.886242 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 16 23:59:56.932949 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:56.932911 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-gfb55\" (UniqueName: \"kubernetes.io/projected/c9593254-0733-44c4-9f93-f0e123e8aff7-kube-api-access-gfb55\") pod \"authorino-55d57b6655-mjn42\" (UID: \"c9593254-0733-44c4-9f93-f0e123e8aff7\") " pod="kuadrant-system/authorino-55d57b6655-mjn42" Apr 16 23:59:56.933649 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:56.933626 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/c9593254-0733-44c4-9f93-f0e123e8aff7-tls-cert\") pod \"authorino-55d57b6655-mjn42\" (UID: \"c9593254-0733-44c4-9f93-f0e123e8aff7\") " pod="kuadrant-system/authorino-55d57b6655-mjn42" Apr 16 23:59:57.034681 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:57.034644 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfb55\" (UniqueName: \"kubernetes.io/projected/c9593254-0733-44c4-9f93-f0e123e8aff7-kube-api-access-gfb55\") pod \"authorino-55d57b6655-mjn42\" (UID: \"c9593254-0733-44c4-9f93-f0e123e8aff7\") " pod="kuadrant-system/authorino-55d57b6655-mjn42" Apr 16 23:59:57.034983 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:57.034962 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/c9593254-0733-44c4-9f93-f0e123e8aff7-tls-cert\") pod \"authorino-55d57b6655-mjn42\" (UID: \"c9593254-0733-44c4-9f93-f0e123e8aff7\") " pod="kuadrant-system/authorino-55d57b6655-mjn42" Apr 16 23:59:57.038192 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:57.038163 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/c9593254-0733-44c4-9f93-f0e123e8aff7-tls-cert\") pod \"authorino-55d57b6655-mjn42\" (UID: \"c9593254-0733-44c4-9f93-f0e123e8aff7\") " pod="kuadrant-system/authorino-55d57b6655-mjn42" Apr 16 23:59:57.044069 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:57.044019 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfb55\" (UniqueName: \"kubernetes.io/projected/c9593254-0733-44c4-9f93-f0e123e8aff7-kube-api-access-gfb55\") pod \"authorino-55d57b6655-mjn42\" (UID: \"c9593254-0733-44c4-9f93-f0e123e8aff7\") " pod="kuadrant-system/authorino-55d57b6655-mjn42" Apr 16 23:59:57.134767 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:57.134685 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-55d57b6655-mjn42" Apr 16 23:59:57.296909 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:57.296885 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-55d57b6655-mjn42"] Apr 16 23:59:57.298475 ip-10-0-134-103 kubenswrapper[2578]: W0416 23:59:57.298449 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9593254_0733_44c4_9f93_f0e123e8aff7.slice/crio-ebb7a8bbb12067b663f528de00fddbb3a1d8547a413591f99b1c41ed6e326bbc WatchSource:0}: Error finding container ebb7a8bbb12067b663f528de00fddbb3a1d8547a413591f99b1c41ed6e326bbc: Status 404 returned error can't find the container with id ebb7a8bbb12067b663f528de00fddbb3a1d8547a413591f99b1c41ed6e326bbc Apr 16 23:59:57.922844 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:57.922791 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-55d57b6655-mjn42" event={"ID":"c9593254-0733-44c4-9f93-f0e123e8aff7","Type":"ContainerStarted","Data":"ebb7a8bbb12067b663f528de00fddbb3a1d8547a413591f99b1c41ed6e326bbc"} Apr 16 23:59:59.949756 ip-10-0-134-103 kubenswrapper[2578]: I0416 23:59:59.949672 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-55d57b6655-mjn42" event={"ID":"c9593254-0733-44c4-9f93-f0e123e8aff7","Type":"ContainerStarted","Data":"71b6f9a999dff949409b5c626e5d864c581ef982b221382534e02d067d100885"} Apr 16 23:59:59.972445 ip-10-0-134-103 
kubenswrapper[2578]: I0416 23:59:59.972394 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-55d57b6655-mjn42" podStartSLOduration=1.7321304149999999 podStartE2EDuration="3.972379331s" podCreationTimestamp="2026-04-16 23:59:56 +0000 UTC" firstStartedPulling="2026-04-16 23:59:57.2998747 +0000 UTC m=+578.074899564" lastFinishedPulling="2026-04-16 23:59:59.540123617 +0000 UTC m=+580.315148480" observedRunningTime="2026-04-16 23:59:59.971229588 +0000 UTC m=+580.746254478" watchObservedRunningTime="2026-04-16 23:59:59.972379331 +0000 UTC m=+580.747404264" Apr 17 00:00:19.772231 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:00:19.772206 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log" Apr 17 00:00:19.772810 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:00:19.772655 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log" Apr 17 00:00:31.804691 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:00:31.804611 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:00:33.573855 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:00:33.573815 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:00:54.577741 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:00:54.577705 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:00:57.579509 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:00:57.579473 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:01:05.473909 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:01:05.473871 2578 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:02:06.445286 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:06.445211 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-86bc86554b-k4f7g"] Apr 17 00:02:06.448559 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:06.448529 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-86bc86554b-k4f7g" Apr 17 00:02:06.456668 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:06.456644 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-86bc86554b-k4f7g"] Apr 17 00:02:06.464774 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:06.464750 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e7431161-4213-405c-9e8a-2ff6baa3b080-tls-cert\") pod \"authorino-86bc86554b-k4f7g\" (UID: \"e7431161-4213-405c-9e8a-2ff6baa3b080\") " pod="kuadrant-system/authorino-86bc86554b-k4f7g" Apr 17 00:02:06.464875 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:06.464811 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pb6v\" (UniqueName: \"kubernetes.io/projected/e7431161-4213-405c-9e8a-2ff6baa3b080-kube-api-access-5pb6v\") pod \"authorino-86bc86554b-k4f7g\" (UID: \"e7431161-4213-405c-9e8a-2ff6baa3b080\") " pod="kuadrant-system/authorino-86bc86554b-k4f7g" Apr 17 00:02:06.566045 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:06.566014 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e7431161-4213-405c-9e8a-2ff6baa3b080-tls-cert\") pod \"authorino-86bc86554b-k4f7g\" (UID: \"e7431161-4213-405c-9e8a-2ff6baa3b080\") " pod="kuadrant-system/authorino-86bc86554b-k4f7g" Apr 17 00:02:06.566154 ip-10-0-134-103 kubenswrapper[2578]: 
I0417 00:02:06.566086 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pb6v\" (UniqueName: \"kubernetes.io/projected/e7431161-4213-405c-9e8a-2ff6baa3b080-kube-api-access-5pb6v\") pod \"authorino-86bc86554b-k4f7g\" (UID: \"e7431161-4213-405c-9e8a-2ff6baa3b080\") " pod="kuadrant-system/authorino-86bc86554b-k4f7g" Apr 17 00:02:06.568342 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:06.568320 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e7431161-4213-405c-9e8a-2ff6baa3b080-tls-cert\") pod \"authorino-86bc86554b-k4f7g\" (UID: \"e7431161-4213-405c-9e8a-2ff6baa3b080\") " pod="kuadrant-system/authorino-86bc86554b-k4f7g" Apr 17 00:02:06.573100 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:06.573083 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pb6v\" (UniqueName: \"kubernetes.io/projected/e7431161-4213-405c-9e8a-2ff6baa3b080-kube-api-access-5pb6v\") pod \"authorino-86bc86554b-k4f7g\" (UID: \"e7431161-4213-405c-9e8a-2ff6baa3b080\") " pod="kuadrant-system/authorino-86bc86554b-k4f7g" Apr 17 00:02:06.759754 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:06.759730 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-86bc86554b-k4f7g" Apr 17 00:02:06.894976 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:06.891724 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-86bc86554b-k4f7g"] Apr 17 00:02:06.895626 ip-10-0-134-103 kubenswrapper[2578]: W0417 00:02:06.895587 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7431161_4213_405c_9e8a_2ff6baa3b080.slice/crio-e371ae3cac078a1bc587a12fbbb056c8e2c9affcb782059cc0bcd03353a1de57 WatchSource:0}: Error finding container e371ae3cac078a1bc587a12fbbb056c8e2c9affcb782059cc0bcd03353a1de57: Status 404 returned error can't find the container with id e371ae3cac078a1bc587a12fbbb056c8e2c9affcb782059cc0bcd03353a1de57 Apr 17 00:02:06.897205 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:06.897185 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 00:02:07.452779 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:07.452736 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-86bc86554b-k4f7g" event={"ID":"e7431161-4213-405c-9e8a-2ff6baa3b080","Type":"ContainerStarted","Data":"8f166b0f4c365e956b6b6012c5dc99c0f94b174e1d2f173ef50c2da098a90bfd"} Apr 17 00:02:07.453186 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:07.452791 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-86bc86554b-k4f7g" event={"ID":"e7431161-4213-405c-9e8a-2ff6baa3b080","Type":"ContainerStarted","Data":"e371ae3cac078a1bc587a12fbbb056c8e2c9affcb782059cc0bcd03353a1de57"} Apr 17 00:02:07.467843 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:07.467800 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-86bc86554b-k4f7g" podStartSLOduration=1.035669368 podStartE2EDuration="1.467788037s" podCreationTimestamp="2026-04-17 00:02:06 +0000 
UTC" firstStartedPulling="2026-04-17 00:02:06.897305182 +0000 UTC m=+707.672330046" lastFinishedPulling="2026-04-17 00:02:07.329423835 +0000 UTC m=+708.104448715" observedRunningTime="2026-04-17 00:02:07.466322475 +0000 UTC m=+708.241347362" watchObservedRunningTime="2026-04-17 00:02:07.467788037 +0000 UTC m=+708.242812922" Apr 17 00:02:07.493377 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:07.493350 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-55d57b6655-mjn42"] Apr 17 00:02:07.493719 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:07.493586 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-55d57b6655-mjn42" podUID="c9593254-0733-44c4-9f93-f0e123e8aff7" containerName="authorino" containerID="cri-o://71b6f9a999dff949409b5c626e5d864c581ef982b221382534e02d067d100885" gracePeriod=30 Apr 17 00:02:07.756886 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:07.756862 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-55d57b6655-mjn42" Apr 17 00:02:07.777338 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:07.776429 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/c9593254-0733-44c4-9f93-f0e123e8aff7-tls-cert\") pod \"c9593254-0733-44c4-9f93-f0e123e8aff7\" (UID: \"c9593254-0733-44c4-9f93-f0e123e8aff7\") " Apr 17 00:02:07.777338 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:07.776563 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfb55\" (UniqueName: \"kubernetes.io/projected/c9593254-0733-44c4-9f93-f0e123e8aff7-kube-api-access-gfb55\") pod \"c9593254-0733-44c4-9f93-f0e123e8aff7\" (UID: \"c9593254-0733-44c4-9f93-f0e123e8aff7\") " Apr 17 00:02:07.779336 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:07.779304 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9593254-0733-44c4-9f93-f0e123e8aff7-kube-api-access-gfb55" (OuterVolumeSpecName: "kube-api-access-gfb55") pod "c9593254-0733-44c4-9f93-f0e123e8aff7" (UID: "c9593254-0733-44c4-9f93-f0e123e8aff7"). InnerVolumeSpecName "kube-api-access-gfb55". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 00:02:07.787763 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:07.787736 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9593254-0733-44c4-9f93-f0e123e8aff7-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "c9593254-0733-44c4-9f93-f0e123e8aff7" (UID: "c9593254-0733-44c4-9f93-f0e123e8aff7"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 00:02:07.877336 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:07.877262 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gfb55\" (UniqueName: \"kubernetes.io/projected/c9593254-0733-44c4-9f93-f0e123e8aff7-kube-api-access-gfb55\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 17 00:02:07.877336 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:07.877288 2578 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/c9593254-0733-44c4-9f93-f0e123e8aff7-tls-cert\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 17 00:02:08.457396 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:08.457360 2578 generic.go:358] "Generic (PLEG): container finished" podID="c9593254-0733-44c4-9f93-f0e123e8aff7" containerID="71b6f9a999dff949409b5c626e5d864c581ef982b221382534e02d067d100885" exitCode=0 Apr 17 00:02:08.457822 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:08.457443 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-55d57b6655-mjn42" Apr 17 00:02:08.457822 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:08.457444 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-55d57b6655-mjn42" event={"ID":"c9593254-0733-44c4-9f93-f0e123e8aff7","Type":"ContainerDied","Data":"71b6f9a999dff949409b5c626e5d864c581ef982b221382534e02d067d100885"} Apr 17 00:02:08.457822 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:08.457480 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-55d57b6655-mjn42" event={"ID":"c9593254-0733-44c4-9f93-f0e123e8aff7","Type":"ContainerDied","Data":"ebb7a8bbb12067b663f528de00fddbb3a1d8547a413591f99b1c41ed6e326bbc"} Apr 17 00:02:08.457822 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:08.457494 2578 scope.go:117] "RemoveContainer" containerID="71b6f9a999dff949409b5c626e5d864c581ef982b221382534e02d067d100885" Apr 17 00:02:08.466199 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:08.466182 2578 scope.go:117] "RemoveContainer" containerID="71b6f9a999dff949409b5c626e5d864c581ef982b221382534e02d067d100885" Apr 17 00:02:08.466448 ip-10-0-134-103 kubenswrapper[2578]: E0417 00:02:08.466428 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71b6f9a999dff949409b5c626e5d864c581ef982b221382534e02d067d100885\": container with ID starting with 71b6f9a999dff949409b5c626e5d864c581ef982b221382534e02d067d100885 not found: ID does not exist" containerID="71b6f9a999dff949409b5c626e5d864c581ef982b221382534e02d067d100885" Apr 17 00:02:08.466523 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:08.466458 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71b6f9a999dff949409b5c626e5d864c581ef982b221382534e02d067d100885"} err="failed to get container status \"71b6f9a999dff949409b5c626e5d864c581ef982b221382534e02d067d100885\": rpc error: code = 
NotFound desc = could not find container \"71b6f9a999dff949409b5c626e5d864c581ef982b221382534e02d067d100885\": container with ID starting with 71b6f9a999dff949409b5c626e5d864c581ef982b221382534e02d067d100885 not found: ID does not exist" Apr 17 00:02:08.492481 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:08.492456 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-55d57b6655-mjn42"] Apr 17 00:02:08.505933 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:08.505909 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-55d57b6655-mjn42"] Apr 17 00:02:09.818495 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:09.818461 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9593254-0733-44c4-9f93-f0e123e8aff7" path="/var/lib/kubelet/pods/c9593254-0733-44c4-9f93-f0e123e8aff7/volumes" Apr 17 00:02:33.775724 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:33.775686 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:02:44.286994 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:44.286958 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:02:53.278518 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:02:53.278481 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:03:03.583083 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:03:03.583046 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:03:12.877614 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:03:12.877582 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:03:22.479599 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:03:22.479561 2578 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:04:25.472611 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:04:25.472577 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:04:39.685190 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:04:39.685156 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:05:19.690391 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:05:19.690309 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:05:19.805439 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:05:19.805412 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log" Apr 17 00:05:19.805637 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:05:19.805622 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log" Apr 17 00:05:36.188217 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:05:36.188186 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:05:50.288411 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:05:50.288374 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:06:05.884942 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:06:05.884901 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:07:00.178525 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:07:00.178488 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:07:09.687595 ip-10-0-134-103 
kubenswrapper[2578]: I0417 00:07:09.687558 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:07:26.573968 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:07:26.573930 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:07:34.583636 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:07:34.583599 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:07:51.982515 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:07:51.982479 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:08:00.179656 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:08:00.179570 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:08:32.679376 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:08:32.679338 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:08:40.774662 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:08:40.774628 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:08:50.175636 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:08:50.175604 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:08:57.577206 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:08:57.577169 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:09:06.077557 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:09:06.077504 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:09:23.182805 ip-10-0-134-103 kubenswrapper[2578]: I0417 
00:09:23.182769 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:09:34.176860 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:09:34.176781 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:10:19.846273 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:10:19.846246 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log" Apr 17 00:10:19.846852 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:10:19.846626 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log" Apr 17 00:10:20.876653 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:10:20.876611 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:10:28.974480 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:10:28.974445 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:10:37.570786 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:10:37.570750 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:10:46.780524 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:10:46.780482 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:10:56.076677 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:10:56.076647 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:11:04.177511 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:11:04.177432 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:11:13.281177 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:11:13.281142 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:11:21.473376 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:11:21.473340 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:11:30.181282 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:11:30.181244 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:11:39.485372 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:11:39.485332 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:11:48.788337 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:11:48.788294 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:11:56.979854 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:11:56.979810 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:12:05.581460 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:12:05.581426 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:12:13.582521 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:12:13.582485 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:12:23.276243 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:12:23.276205 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:12:31.574008 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:12:31.573930 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:12:40.992499 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:12:40.992466 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:12:48.488910 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:12:48.488872 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:12:58.972270 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:12:58.972224 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:13:03.179005 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:03.178970 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:13:35.480895 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:35.480858 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:13:40.283685 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:40.283647 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:13:42.449362 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:42.449322 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h"] Apr 17 00:13:42.449919 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:42.449642 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h" podUID="66e05964-8792-440a-bf89-19d57677b3e9" containerName="manager" containerID="cri-o://6a2d3a581f3053e25cbd6317458370faa27bbf6040ccbf80c44faef672742e07" gracePeriod=10 Apr 17 00:13:43.995588 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:43.995561 2578 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h" Apr 17 00:13:44.063237 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:44.063210 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxcb6\" (UniqueName: \"kubernetes.io/projected/66e05964-8792-440a-bf89-19d57677b3e9-kube-api-access-wxcb6\") pod \"66e05964-8792-440a-bf89-19d57677b3e9\" (UID: \"66e05964-8792-440a-bf89-19d57677b3e9\") " Apr 17 00:13:44.063237 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:44.063245 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/66e05964-8792-440a-bf89-19d57677b3e9-extensions-socket-volume\") pod \"66e05964-8792-440a-bf89-19d57677b3e9\" (UID: \"66e05964-8792-440a-bf89-19d57677b3e9\") " Apr 17 00:13:44.063639 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:44.063617 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66e05964-8792-440a-bf89-19d57677b3e9-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "66e05964-8792-440a-bf89-19d57677b3e9" (UID: "66e05964-8792-440a-bf89-19d57677b3e9"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 00:13:44.065184 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:44.065156 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66e05964-8792-440a-bf89-19d57677b3e9-kube-api-access-wxcb6" (OuterVolumeSpecName: "kube-api-access-wxcb6") pod "66e05964-8792-440a-bf89-19d57677b3e9" (UID: "66e05964-8792-440a-bf89-19d57677b3e9"). InnerVolumeSpecName "kube-api-access-wxcb6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 00:13:44.164791 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:44.164730 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wxcb6\" (UniqueName: \"kubernetes.io/projected/66e05964-8792-440a-bf89-19d57677b3e9-kube-api-access-wxcb6\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 17 00:13:44.164791 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:44.164753 2578 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/66e05964-8792-440a-bf89-19d57677b3e9-extensions-socket-volume\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 17 00:13:44.214732 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:44.214704 2578 generic.go:358] "Generic (PLEG): container finished" podID="66e05964-8792-440a-bf89-19d57677b3e9" containerID="6a2d3a581f3053e25cbd6317458370faa27bbf6040ccbf80c44faef672742e07" exitCode=0 Apr 17 00:13:44.214983 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:44.214768 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h" Apr 17 00:13:44.214983 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:44.214788 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h" event={"ID":"66e05964-8792-440a-bf89-19d57677b3e9","Type":"ContainerDied","Data":"6a2d3a581f3053e25cbd6317458370faa27bbf6040ccbf80c44faef672742e07"} Apr 17 00:13:44.214983 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:44.214824 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h" event={"ID":"66e05964-8792-440a-bf89-19d57677b3e9","Type":"ContainerDied","Data":"df6750d47a8576b31878426b341d598eec0224dd077b86f962d1843003f85359"} Apr 17 00:13:44.214983 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:44.214839 2578 scope.go:117] "RemoveContainer" containerID="6a2d3a581f3053e25cbd6317458370faa27bbf6040ccbf80c44faef672742e07" Apr 17 00:13:44.224591 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:44.224576 2578 scope.go:117] "RemoveContainer" containerID="6a2d3a581f3053e25cbd6317458370faa27bbf6040ccbf80c44faef672742e07" Apr 17 00:13:44.224815 ip-10-0-134-103 kubenswrapper[2578]: E0417 00:13:44.224795 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a2d3a581f3053e25cbd6317458370faa27bbf6040ccbf80c44faef672742e07\": container with ID starting with 6a2d3a581f3053e25cbd6317458370faa27bbf6040ccbf80c44faef672742e07 not found: ID does not exist" containerID="6a2d3a581f3053e25cbd6317458370faa27bbf6040ccbf80c44faef672742e07" Apr 17 00:13:44.224861 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:44.224823 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a2d3a581f3053e25cbd6317458370faa27bbf6040ccbf80c44faef672742e07"} err="failed to get container status 
\"6a2d3a581f3053e25cbd6317458370faa27bbf6040ccbf80c44faef672742e07\": rpc error: code = NotFound desc = could not find container \"6a2d3a581f3053e25cbd6317458370faa27bbf6040ccbf80c44faef672742e07\": container with ID starting with 6a2d3a581f3053e25cbd6317458370faa27bbf6040ccbf80c44faef672742e07 not found: ID does not exist" Apr 17 00:13:44.236657 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:44.236634 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h"] Apr 17 00:13:44.242349 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:44.242330 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wxl4h"] Apr 17 00:13:45.818974 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:13:45.818943 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66e05964-8792-440a-bf89-19d57677b3e9" path="/var/lib/kubelet/pods/66e05964-8792-440a-bf89-19d57677b3e9/volumes" Apr 17 00:14:48.538265 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:48.538225 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tdrgq"] Apr 17 00:14:48.538707 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:48.538623 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66e05964-8792-440a-bf89-19d57677b3e9" containerName="manager" Apr 17 00:14:48.538707 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:48.538635 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e05964-8792-440a-bf89-19d57677b3e9" containerName="manager" Apr 17 00:14:48.538707 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:48.538657 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9593254-0733-44c4-9f93-f0e123e8aff7" containerName="authorino" Apr 17 00:14:48.538707 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:48.538664 2578 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c9593254-0733-44c4-9f93-f0e123e8aff7" containerName="authorino" Apr 17 00:14:48.538841 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:48.538728 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="66e05964-8792-440a-bf89-19d57677b3e9" containerName="manager" Apr 17 00:14:48.538841 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:48.538739 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9593254-0733-44c4-9f93-f0e123e8aff7" containerName="authorino" Apr 17 00:14:48.541739 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:48.541721 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tdrgq" Apr 17 00:14:48.544173 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:48.544151 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-rdzkz\"" Apr 17 00:14:48.550435 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:48.550410 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tdrgq"] Apr 17 00:14:48.585659 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:48.585630 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l2wm\" (UniqueName: \"kubernetes.io/projected/7fedda75-15e7-427e-9232-12f046b9e361-kube-api-access-7l2wm\") pod \"kuadrant-operator-controller-manager-55c7f4c975-tdrgq\" (UID: \"7fedda75-15e7-427e-9232-12f046b9e361\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tdrgq" Apr 17 00:14:48.585763 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:48.585669 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/7fedda75-15e7-427e-9232-12f046b9e361-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-tdrgq\" (UID: \"7fedda75-15e7-427e-9232-12f046b9e361\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tdrgq" Apr 17 00:14:48.687076 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:48.687050 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7fedda75-15e7-427e-9232-12f046b9e361-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-tdrgq\" (UID: \"7fedda75-15e7-427e-9232-12f046b9e361\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tdrgq" Apr 17 00:14:48.687201 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:48.687179 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7l2wm\" (UniqueName: \"kubernetes.io/projected/7fedda75-15e7-427e-9232-12f046b9e361-kube-api-access-7l2wm\") pod \"kuadrant-operator-controller-manager-55c7f4c975-tdrgq\" (UID: \"7fedda75-15e7-427e-9232-12f046b9e361\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tdrgq" Apr 17 00:14:48.687496 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:48.687474 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7fedda75-15e7-427e-9232-12f046b9e361-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-tdrgq\" (UID: \"7fedda75-15e7-427e-9232-12f046b9e361\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tdrgq" Apr 17 00:14:48.698813 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:48.698791 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l2wm\" (UniqueName: \"kubernetes.io/projected/7fedda75-15e7-427e-9232-12f046b9e361-kube-api-access-7l2wm\") 
pod \"kuadrant-operator-controller-manager-55c7f4c975-tdrgq\" (UID: \"7fedda75-15e7-427e-9232-12f046b9e361\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tdrgq" Apr 17 00:14:48.852387 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:48.852333 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tdrgq" Apr 17 00:14:49.182501 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:49.182475 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tdrgq"] Apr 17 00:14:49.186403 ip-10-0-134-103 kubenswrapper[2578]: W0417 00:14:49.186371 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fedda75_15e7_427e_9232_12f046b9e361.slice/crio-ad3270c90e0d820452fb0e9c0ebb23f2b9c3e2cb7677f6a3dd134aff1f9dad4d WatchSource:0}: Error finding container ad3270c90e0d820452fb0e9c0ebb23f2b9c3e2cb7677f6a3dd134aff1f9dad4d: Status 404 returned error can't find the container with id ad3270c90e0d820452fb0e9c0ebb23f2b9c3e2cb7677f6a3dd134aff1f9dad4d Apr 17 00:14:49.188682 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:49.188659 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 00:14:49.476738 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:49.476645 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tdrgq" event={"ID":"7fedda75-15e7-427e-9232-12f046b9e361","Type":"ContainerStarted","Data":"a799e9fa6370a44cfa79ed89717284e3e9812f2a4de8aea863e46922742e18ca"} Apr 17 00:14:49.476738 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:49.476690 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tdrgq" 
event={"ID":"7fedda75-15e7-427e-9232-12f046b9e361","Type":"ContainerStarted","Data":"ad3270c90e0d820452fb0e9c0ebb23f2b9c3e2cb7677f6a3dd134aff1f9dad4d"} Apr 17 00:14:49.476927 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:49.476741 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tdrgq" Apr 17 00:14:49.495242 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:14:49.495189 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tdrgq" podStartSLOduration=1.495177558 podStartE2EDuration="1.495177558s" podCreationTimestamp="2026-04-17 00:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 00:14:49.493397106 +0000 UTC m=+1470.268421992" watchObservedRunningTime="2026-04-17 00:14:49.495177558 +0000 UTC m=+1470.270202444" Apr 17 00:15:00.127660 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:00.127630 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29606415-gmjmp"] Apr 17 00:15:00.131295 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:00.131278 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" Apr 17 00:15:00.134245 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:00.133896 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-6fjhg\"" Apr 17 00:15:00.136781 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:00.136762 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606415-gmjmp"] Apr 17 00:15:00.178261 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:00.178232 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx8cq\" (UniqueName: \"kubernetes.io/projected/480e3821-7f0d-4a84-85f2-63106dc0f5de-kube-api-access-jx8cq\") pod \"maas-api-key-cleanup-29606415-gmjmp\" (UID: \"480e3821-7f0d-4a84-85f2-63106dc0f5de\") " pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" Apr 17 00:15:00.279752 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:00.279725 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jx8cq\" (UniqueName: \"kubernetes.io/projected/480e3821-7f0d-4a84-85f2-63106dc0f5de-kube-api-access-jx8cq\") pod \"maas-api-key-cleanup-29606415-gmjmp\" (UID: \"480e3821-7f0d-4a84-85f2-63106dc0f5de\") " pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" Apr 17 00:15:00.287169 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:00.287141 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx8cq\" (UniqueName: \"kubernetes.io/projected/480e3821-7f0d-4a84-85f2-63106dc0f5de-kube-api-access-jx8cq\") pod \"maas-api-key-cleanup-29606415-gmjmp\" (UID: \"480e3821-7f0d-4a84-85f2-63106dc0f5de\") " pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" Apr 17 00:15:00.442273 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:00.442209 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" Apr 17 00:15:00.483557 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:00.483513 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tdrgq" Apr 17 00:15:00.781130 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:00.781107 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606415-gmjmp"] Apr 17 00:15:00.783346 ip-10-0-134-103 kubenswrapper[2578]: W0417 00:15:00.783315 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod480e3821_7f0d_4a84_85f2_63106dc0f5de.slice/crio-1a9df929bc80319eb24a1f2ef1d64de3422063faa40ff3a87323f40ef97b45ab WatchSource:0}: Error finding container 1a9df929bc80319eb24a1f2ef1d64de3422063faa40ff3a87323f40ef97b45ab: Status 404 returned error can't find the container with id 1a9df929bc80319eb24a1f2ef1d64de3422063faa40ff3a87323f40ef97b45ab Apr 17 00:15:01.523552 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:01.523513 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" event={"ID":"480e3821-7f0d-4a84-85f2-63106dc0f5de","Type":"ContainerStarted","Data":"1a9df929bc80319eb24a1f2ef1d64de3422063faa40ff3a87323f40ef97b45ab"} Apr 17 00:15:04.539460 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:04.539423 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" event={"ID":"480e3821-7f0d-4a84-85f2-63106dc0f5de","Type":"ContainerStarted","Data":"22504ae7ddabf7cf528399f2808a84ee2658b6466bcf60c35498858bb090448b"} Apr 17 00:15:04.553870 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:04.553820 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" podStartSLOduration=1.39345409 
podStartE2EDuration="4.553807565s" podCreationTimestamp="2026-04-17 00:15:00 +0000 UTC" firstStartedPulling="2026-04-17 00:15:00.785494474 +0000 UTC m=+1481.560519338" lastFinishedPulling="2026-04-17 00:15:03.945847949 +0000 UTC m=+1484.720872813" observedRunningTime="2026-04-17 00:15:04.552953548 +0000 UTC m=+1485.327978434" watchObservedRunningTime="2026-04-17 00:15:04.553807565 +0000 UTC m=+1485.328832450" Apr 17 00:15:09.282754 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:09.282719 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:15:14.276746 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:14.276708 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:15:19.878460 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:19.878433 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log" Apr 17 00:15:19.880128 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:19.880107 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log" Apr 17 00:15:24.620805 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:24.620777 2578 generic.go:358] "Generic (PLEG): container finished" podID="480e3821-7f0d-4a84-85f2-63106dc0f5de" containerID="22504ae7ddabf7cf528399f2808a84ee2658b6466bcf60c35498858bb090448b" exitCode=6 Apr 17 00:15:24.621120 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:24.620855 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" event={"ID":"480e3821-7f0d-4a84-85f2-63106dc0f5de","Type":"ContainerDied","Data":"22504ae7ddabf7cf528399f2808a84ee2658b6466bcf60c35498858bb090448b"} Apr 17 00:15:24.621172 ip-10-0-134-103 kubenswrapper[2578]: I0417 
00:15:24.621159 2578 scope.go:117] "RemoveContainer" containerID="22504ae7ddabf7cf528399f2808a84ee2658b6466bcf60c35498858bb090448b" Apr 17 00:15:25.627696 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:25.627663 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" event={"ID":"480e3821-7f0d-4a84-85f2-63106dc0f5de","Type":"ContainerStarted","Data":"50e4275a4fa46958bd58346a59e70ef5461c83d8ce5f92c3b88f02d96fa95eb6"} Apr 17 00:15:39.177475 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:39.177437 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:15:43.481008 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:43.480968 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:15:45.713004 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:45.712970 2578 generic.go:358] "Generic (PLEG): container finished" podID="480e3821-7f0d-4a84-85f2-63106dc0f5de" containerID="50e4275a4fa46958bd58346a59e70ef5461c83d8ce5f92c3b88f02d96fa95eb6" exitCode=6 Apr 17 00:15:45.713408 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:45.713028 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" event={"ID":"480e3821-7f0d-4a84-85f2-63106dc0f5de","Type":"ContainerDied","Data":"50e4275a4fa46958bd58346a59e70ef5461c83d8ce5f92c3b88f02d96fa95eb6"} Apr 17 00:15:45.713408 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:45.713071 2578 scope.go:117] "RemoveContainer" containerID="22504ae7ddabf7cf528399f2808a84ee2658b6466bcf60c35498858bb090448b" Apr 17 00:15:45.713529 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:45.713428 2578 scope.go:117] "RemoveContainer" containerID="50e4275a4fa46958bd58346a59e70ef5461c83d8ce5f92c3b88f02d96fa95eb6" Apr 17 00:15:45.713684 ip-10-0-134-103 kubenswrapper[2578]: E0417 00:15:45.713663 2578 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29606415-gmjmp_opendatahub(480e3821-7f0d-4a84-85f2-63106dc0f5de)\"" pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" podUID="480e3821-7f0d-4a84-85f2-63106dc0f5de" Apr 17 00:15:53.379796 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:53.379757 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:15:59.816331 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:15:59.816300 2578 scope.go:117] "RemoveContainer" containerID="50e4275a4fa46958bd58346a59e70ef5461c83d8ce5f92c3b88f02d96fa95eb6" Apr 17 00:16:00.014664 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:00.014627 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606415-gmjmp"] Apr 17 00:16:00.785034 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:00.784999 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" event={"ID":"480e3821-7f0d-4a84-85f2-63106dc0f5de","Type":"ContainerStarted","Data":"2c01ed5799adb690cc3507bba62a76ce3d7d1cef387602126f130f0dec87cc91"} Apr 17 00:16:00.785203 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:00.785058 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" podUID="480e3821-7f0d-4a84-85f2-63106dc0f5de" containerName="cleanup" containerID="cri-o://2c01ed5799adb690cc3507bba62a76ce3d7d1cef387602126f130f0dec87cc91" gracePeriod=30 Apr 17 00:16:04.182189 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:04.182150 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:16:12.688375 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:12.688338 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:16:20.737077 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:20.737054 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" Apr 17 00:16:20.853560 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:20.853468 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx8cq\" (UniqueName: \"kubernetes.io/projected/480e3821-7f0d-4a84-85f2-63106dc0f5de-kube-api-access-jx8cq\") pod \"480e3821-7f0d-4a84-85f2-63106dc0f5de\" (UID: \"480e3821-7f0d-4a84-85f2-63106dc0f5de\") " Apr 17 00:16:20.855424 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:20.855392 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/480e3821-7f0d-4a84-85f2-63106dc0f5de-kube-api-access-jx8cq" (OuterVolumeSpecName: "kube-api-access-jx8cq") pod "480e3821-7f0d-4a84-85f2-63106dc0f5de" (UID: "480e3821-7f0d-4a84-85f2-63106dc0f5de"). InnerVolumeSpecName "kube-api-access-jx8cq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 00:16:20.869906 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:20.869874 2578 generic.go:358] "Generic (PLEG): container finished" podID="480e3821-7f0d-4a84-85f2-63106dc0f5de" containerID="2c01ed5799adb690cc3507bba62a76ce3d7d1cef387602126f130f0dec87cc91" exitCode=6 Apr 17 00:16:20.870024 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:20.869938 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" Apr 17 00:16:20.870024 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:20.869953 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" event={"ID":"480e3821-7f0d-4a84-85f2-63106dc0f5de","Type":"ContainerDied","Data":"2c01ed5799adb690cc3507bba62a76ce3d7d1cef387602126f130f0dec87cc91"} Apr 17 00:16:20.870024 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:20.869995 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606415-gmjmp" event={"ID":"480e3821-7f0d-4a84-85f2-63106dc0f5de","Type":"ContainerDied","Data":"1a9df929bc80319eb24a1f2ef1d64de3422063faa40ff3a87323f40ef97b45ab"} Apr 17 00:16:20.870024 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:20.870011 2578 scope.go:117] "RemoveContainer" containerID="2c01ed5799adb690cc3507bba62a76ce3d7d1cef387602126f130f0dec87cc91" Apr 17 00:16:20.886510 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:20.886491 2578 scope.go:117] "RemoveContainer" containerID="50e4275a4fa46958bd58346a59e70ef5461c83d8ce5f92c3b88f02d96fa95eb6" Apr 17 00:16:20.894504 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:20.894484 2578 scope.go:117] "RemoveContainer" containerID="2c01ed5799adb690cc3507bba62a76ce3d7d1cef387602126f130f0dec87cc91" Apr 17 00:16:20.894781 ip-10-0-134-103 kubenswrapper[2578]: E0417 00:16:20.894761 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c01ed5799adb690cc3507bba62a76ce3d7d1cef387602126f130f0dec87cc91\": container with ID starting with 2c01ed5799adb690cc3507bba62a76ce3d7d1cef387602126f130f0dec87cc91 not found: ID does not exist" containerID="2c01ed5799adb690cc3507bba62a76ce3d7d1cef387602126f130f0dec87cc91" Apr 17 00:16:20.894845 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:20.894791 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2c01ed5799adb690cc3507bba62a76ce3d7d1cef387602126f130f0dec87cc91"} err="failed to get container status \"2c01ed5799adb690cc3507bba62a76ce3d7d1cef387602126f130f0dec87cc91\": rpc error: code = NotFound desc = could not find container \"2c01ed5799adb690cc3507bba62a76ce3d7d1cef387602126f130f0dec87cc91\": container with ID starting with 2c01ed5799adb690cc3507bba62a76ce3d7d1cef387602126f130f0dec87cc91 not found: ID does not exist" Apr 17 00:16:20.894845 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:20.894810 2578 scope.go:117] "RemoveContainer" containerID="50e4275a4fa46958bd58346a59e70ef5461c83d8ce5f92c3b88f02d96fa95eb6" Apr 17 00:16:20.895077 ip-10-0-134-103 kubenswrapper[2578]: E0417 00:16:20.895059 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50e4275a4fa46958bd58346a59e70ef5461c83d8ce5f92c3b88f02d96fa95eb6\": container with ID starting with 50e4275a4fa46958bd58346a59e70ef5461c83d8ce5f92c3b88f02d96fa95eb6 not found: ID does not exist" containerID="50e4275a4fa46958bd58346a59e70ef5461c83d8ce5f92c3b88f02d96fa95eb6" Apr 17 00:16:20.895136 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:20.895088 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50e4275a4fa46958bd58346a59e70ef5461c83d8ce5f92c3b88f02d96fa95eb6"} err="failed to get container status \"50e4275a4fa46958bd58346a59e70ef5461c83d8ce5f92c3b88f02d96fa95eb6\": rpc error: code = NotFound desc = could not find container \"50e4275a4fa46958bd58346a59e70ef5461c83d8ce5f92c3b88f02d96fa95eb6\": container with ID starting with 50e4275a4fa46958bd58346a59e70ef5461c83d8ce5f92c3b88f02d96fa95eb6 not found: ID does not exist" Apr 17 00:16:20.895877 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:20.895846 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606415-gmjmp"] Apr 17 00:16:20.898188 ip-10-0-134-103 
kubenswrapper[2578]: I0417 00:16:20.898169 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606415-gmjmp"] Apr 17 00:16:20.954552 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:20.954514 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jx8cq\" (UniqueName: \"kubernetes.io/projected/480e3821-7f0d-4a84-85f2-63106dc0f5de-kube-api-access-jx8cq\") on node \"ip-10-0-134-103.ec2.internal\" DevicePath \"\"" Apr 17 00:16:21.818070 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:21.818038 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="480e3821-7f0d-4a84-85f2-63106dc0f5de" path="/var/lib/kubelet/pods/480e3821-7f0d-4a84-85f2-63106dc0f5de/volumes" Apr 17 00:16:23.275387 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:23.275350 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:16:32.178840 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:32.178807 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:16:42.581975 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:42.581935 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:16:52.372807 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:16:52.372771 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:17:01.575865 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:17:01.575786 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:17:10.477801 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:17:10.477762 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:17:16.076760 ip-10-0-134-103 kubenswrapper[2578]: I0417 
00:17:16.076718 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:17:44.373548 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:17:44.373509 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:18:27.182473 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:18:27.182438 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:18:34.977899 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:18:34.977824 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:18:43.985781 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:18:43.985750 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:18:52.583856 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:18:52.583815 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:19:02.278191 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:19:02.278144 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:19:12.484693 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:19:12.484659 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:19:20.278050 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:19:20.278013 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:19:27.579304 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:19:27.579265 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:19:37.780711 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:19:37.780676 2578 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:19:45.377998 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:19:45.377962 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:19:54.175730 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:19:54.175690 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:20:06.975231 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:20:06.975191 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:20:19.909285 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:20:19.909256 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log" Apr 17 00:20:19.912111 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:20:19.912079 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log" Apr 17 00:20:24.480659 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:20:24.480624 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:20:32.381107 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:20:32.381073 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:20:41.176672 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:20:41.176636 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:20:48.877675 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:20:48.877638 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 
00:21:06.681260 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:21:06.681225 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:21:14.874337 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:21:14.874302 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:21:23.777724 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:21:23.777682 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:21:32.280819 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:21:32.280741 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:21:41.377290 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:21:41.377259 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:21:49.782774 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:21:49.782736 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:21:58.785178 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:21:58.785147 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:22:11.376450 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:22:11.376418 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:22:21.275756 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:22:21.275716 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:22:33.982000 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:22:33.981966 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:22:42.675967 
ip-10-0-134-103 kubenswrapper[2578]: I0417 00:22:42.675930 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:22:51.573042 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:22:51.573001 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:22:59.286184 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:22:59.286144 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:23:08.376734 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:23:08.376659 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:23:24.479162 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:23:24.479128 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:23:32.374709 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:23:32.374669 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:23:41.883400 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:23:41.883361 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:23:49.580492 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:23:49.580454 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:24:13.377329 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:13.377293 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:24:25.977597 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:25.977563 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sdlmj"] Apr 17 00:24:27.673911 ip-10-0-134-103 
kubenswrapper[2578]: I0417 00:24:27.673883 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-86bc86554b-k4f7g_e7431161-4213-405c-9e8a-2ff6baa3b080/authorino/0.log" Apr 17 00:24:32.118380 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:32.118307 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-bf54d8685-s6vgg_e9fe26de-678e-4de8-902d-2994a78acd38/manager/0.log" Apr 17 00:24:33.003450 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:33.003418 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g_fda81de8-7dbc-4bdb-8784-9999433285d0/util/0.log" Apr 17 00:24:33.010087 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:33.010061 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g_fda81de8-7dbc-4bdb-8784-9999433285d0/pull/0.log" Apr 17 00:24:33.016140 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:33.016116 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g_fda81de8-7dbc-4bdb-8784-9999433285d0/extract/0.log" Apr 17 00:24:33.125173 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:33.125146 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p_982e434d-93b4-4c88-bb51-fe62c1113690/util/0.log" Apr 17 00:24:33.131268 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:33.131250 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p_982e434d-93b4-4c88-bb51-fe62c1113690/pull/0.log" Apr 17 00:24:33.137188 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:33.137171 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p_982e434d-93b4-4c88-bb51-fe62c1113690/extract/0.log" Apr 17 00:24:33.246131 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:33.246097 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm_fe0bb47e-951d-4aff-8f6d-836c69c46190/util/0.log" Apr 17 00:24:33.252266 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:33.252243 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm_fe0bb47e-951d-4aff-8f6d-836c69c46190/pull/0.log" Apr 17 00:24:33.258088 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:33.258024 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm_fe0bb47e-951d-4aff-8f6d-836c69c46190/extract/0.log" Apr 17 00:24:33.365245 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:33.365216 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk_6d8b191d-1c87-4735-9354-a90a59a1b45b/extract/0.log" Apr 17 00:24:33.371020 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:33.370996 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk_6d8b191d-1c87-4735-9354-a90a59a1b45b/util/0.log" Apr 17 00:24:33.377009 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:33.376989 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk_6d8b191d-1c87-4735-9354-a90a59a1b45b/pull/0.log" Apr 17 00:24:33.490826 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:33.490804 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_authorino-86bc86554b-k4f7g_e7431161-4213-405c-9e8a-2ff6baa3b080/authorino/0.log" Apr 17 00:24:33.727424 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:33.727352 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-x699g_d5e3eebd-0062-436a-b898-d09992976a88/manager/0.log" Apr 17 00:24:33.831903 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:33.831875 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-ndqrc_0674c5b4-04dc-4b5c-9591-ea0023d88508/kuadrant-console-plugin/0.log" Apr 17 00:24:33.942790 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:33.942761 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-kkrrx_0bd81371-b60d-4829-a5c0-c9284e76ace1/registry-server/0.log" Apr 17 00:24:34.060086 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:34.060057 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-tdrgq_7fedda75-15e7-427e-9232-12f046b9e361/manager/0.log" Apr 17 00:24:34.170047 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:34.170021 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-sdlmj_6bcd634b-868f-4414-98fe-56cd74dd6898/limitador/0.log" Apr 17 00:24:34.618452 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:34.618424 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl_fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565/istio-proxy/0.log" Apr 17 00:24:35.058011 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:35.057981 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-m5tdn_5c243f98-0358-4d85-b9af-5d7dac8da24b/istio-proxy/0.log" Apr 17 
00:24:35.167175 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:35.167151 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-69ddbcffcb-4q64j_df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c/router/0.log" Apr 17 00:24:39.841020 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:39.840986 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vl859/must-gather-6cqjn"] Apr 17 00:24:39.841387 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:39.841336 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="480e3821-7f0d-4a84-85f2-63106dc0f5de" containerName="cleanup" Apr 17 00:24:39.841387 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:39.841347 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="480e3821-7f0d-4a84-85f2-63106dc0f5de" containerName="cleanup" Apr 17 00:24:39.841387 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:39.841361 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="480e3821-7f0d-4a84-85f2-63106dc0f5de" containerName="cleanup" Apr 17 00:24:39.841387 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:39.841367 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="480e3821-7f0d-4a84-85f2-63106dc0f5de" containerName="cleanup" Apr 17 00:24:39.841519 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:39.841421 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="480e3821-7f0d-4a84-85f2-63106dc0f5de" containerName="cleanup" Apr 17 00:24:39.841519 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:39.841429 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="480e3821-7f0d-4a84-85f2-63106dc0f5de" containerName="cleanup" Apr 17 00:24:39.841519 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:39.841499 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="480e3821-7f0d-4a84-85f2-63106dc0f5de" containerName="cleanup" Apr 17 00:24:39.841519 ip-10-0-134-103 
kubenswrapper[2578]: I0417 00:24:39.841504 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="480e3821-7f0d-4a84-85f2-63106dc0f5de" containerName="cleanup"
Apr 17 00:24:39.841658 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:39.841573 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="480e3821-7f0d-4a84-85f2-63106dc0f5de" containerName="cleanup"
Apr 17 00:24:39.844836 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:39.844819 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vl859/must-gather-6cqjn"
Apr 17 00:24:39.847499 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:39.847474 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vl859\"/\"kube-root-ca.crt\""
Apr 17 00:24:39.848469 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:39.848437 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vl859\"/\"openshift-service-ca.crt\""
Apr 17 00:24:39.848640 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:39.848466 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vl859\"/\"default-dockercfg-cqxln\""
Apr 17 00:24:39.857581 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:39.857557 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vl859/must-gather-6cqjn"]
Apr 17 00:24:39.900204 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:39.900166 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c4937f46-cfe0-4e67-9cfe-5730f1c6f2c6-must-gather-output\") pod \"must-gather-6cqjn\" (UID: \"c4937f46-cfe0-4e67-9cfe-5730f1c6f2c6\") " pod="openshift-must-gather-vl859/must-gather-6cqjn"
Apr 17 00:24:39.900204 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:39.900207 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg2gm\" (UniqueName: \"kubernetes.io/projected/c4937f46-cfe0-4e67-9cfe-5730f1c6f2c6-kube-api-access-xg2gm\") pod \"must-gather-6cqjn\" (UID: \"c4937f46-cfe0-4e67-9cfe-5730f1c6f2c6\") " pod="openshift-must-gather-vl859/must-gather-6cqjn"
Apr 17 00:24:40.001749 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:40.001702 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c4937f46-cfe0-4e67-9cfe-5730f1c6f2c6-must-gather-output\") pod \"must-gather-6cqjn\" (UID: \"c4937f46-cfe0-4e67-9cfe-5730f1c6f2c6\") " pod="openshift-must-gather-vl859/must-gather-6cqjn"
Apr 17 00:24:40.001948 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:40.001757 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xg2gm\" (UniqueName: \"kubernetes.io/projected/c4937f46-cfe0-4e67-9cfe-5730f1c6f2c6-kube-api-access-xg2gm\") pod \"must-gather-6cqjn\" (UID: \"c4937f46-cfe0-4e67-9cfe-5730f1c6f2c6\") " pod="openshift-must-gather-vl859/must-gather-6cqjn"
Apr 17 00:24:40.002162 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:40.002137 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c4937f46-cfe0-4e67-9cfe-5730f1c6f2c6-must-gather-output\") pod \"must-gather-6cqjn\" (UID: \"c4937f46-cfe0-4e67-9cfe-5730f1c6f2c6\") " pod="openshift-must-gather-vl859/must-gather-6cqjn"
Apr 17 00:24:40.010169 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:40.010139 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg2gm\" (UniqueName: \"kubernetes.io/projected/c4937f46-cfe0-4e67-9cfe-5730f1c6f2c6-kube-api-access-xg2gm\") pod \"must-gather-6cqjn\" (UID: \"c4937f46-cfe0-4e67-9cfe-5730f1c6f2c6\") " pod="openshift-must-gather-vl859/must-gather-6cqjn"
Apr 17 00:24:40.154689 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:40.154600 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vl859/must-gather-6cqjn"
Apr 17 00:24:40.282227 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:40.282197 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vl859/must-gather-6cqjn"]
Apr 17 00:24:40.284141 ip-10-0-134-103 kubenswrapper[2578]: W0417 00:24:40.284109 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4937f46_cfe0_4e67_9cfe_5730f1c6f2c6.slice/crio-3ee2c587ae88e0139831f173905385acd2d578e6c05c9fab92ad9583cb97dc36 WatchSource:0}: Error finding container 3ee2c587ae88e0139831f173905385acd2d578e6c05c9fab92ad9583cb97dc36: Status 404 returned error can't find the container with id 3ee2c587ae88e0139831f173905385acd2d578e6c05c9fab92ad9583cb97dc36
Apr 17 00:24:40.286261 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:40.286245 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 00:24:40.835020 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:40.834971 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vl859/must-gather-6cqjn" event={"ID":"c4937f46-cfe0-4e67-9cfe-5730f1c6f2c6","Type":"ContainerStarted","Data":"3ee2c587ae88e0139831f173905385acd2d578e6c05c9fab92ad9583cb97dc36"}
Apr 17 00:24:41.845898 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:41.845860 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vl859/must-gather-6cqjn" event={"ID":"c4937f46-cfe0-4e67-9cfe-5730f1c6f2c6","Type":"ContainerStarted","Data":"f00a3f8082bcf1e6bebbc1665c841887d88f80beb85241c89caf39f3832c59cd"}
Apr 17 00:24:41.845898 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:41.845903 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vl859/must-gather-6cqjn" event={"ID":"c4937f46-cfe0-4e67-9cfe-5730f1c6f2c6","Type":"ContainerStarted","Data":"fa9e17c42944d3891f4d4239f354daa0f4690ce7e6b2c90d85f8bc0132f8fecc"}
Apr 17 00:24:41.862191 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:41.862123 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vl859/must-gather-6cqjn" podStartSLOduration=2.008596648 podStartE2EDuration="2.86210539s" podCreationTimestamp="2026-04-17 00:24:39 +0000 UTC" firstStartedPulling="2026-04-17 00:24:40.286378727 +0000 UTC m=+2061.061403591" lastFinishedPulling="2026-04-17 00:24:41.139887466 +0000 UTC m=+2061.914912333" observedRunningTime="2026-04-17 00:24:41.861099381 +0000 UTC m=+2062.636124281" watchObservedRunningTime="2026-04-17 00:24:41.86210539 +0000 UTC m=+2062.637130277"
Apr 17 00:24:42.682604 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:42.682565 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gzjrj_4ec875de-2bac-4b6f-82a6-4e9a79ae830e/global-pull-secret-syncer/0.log"
Apr 17 00:24:42.801813 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:42.801780 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fgzm6_5ecdc5db-c3e8-477a-8c96-bfa2a4fba192/konnectivity-agent/0.log"
Apr 17 00:24:42.915787 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:42.915725 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-103.ec2.internal_136452127e3fd9ba2b831a29f8633a79/haproxy/0.log"
Apr 17 00:24:46.870688 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:46.870654 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g_fda81de8-7dbc-4bdb-8784-9999433285d0/extract/0.log"
Apr 17 00:24:46.893380 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:46.893345 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g_fda81de8-7dbc-4bdb-8784-9999433285d0/util/0.log"
Apr 17 00:24:46.916813 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:46.916775 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592568g_fda81de8-7dbc-4bdb-8784-9999433285d0/pull/0.log"
Apr 17 00:24:46.952864 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:46.952820 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p_982e434d-93b4-4c88-bb51-fe62c1113690/extract/0.log"
Apr 17 00:24:46.986555 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:46.986520 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p_982e434d-93b4-4c88-bb51-fe62c1113690/util/0.log"
Apr 17 00:24:47.007635 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:47.007600 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p675p_982e434d-93b4-4c88-bb51-fe62c1113690/pull/0.log"
Apr 17 00:24:47.032417 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:47.032379 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm_fe0bb47e-951d-4aff-8f6d-836c69c46190/extract/0.log"
Apr 17 00:24:47.054008 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:47.053977 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm_fe0bb47e-951d-4aff-8f6d-836c69c46190/util/0.log"
Apr 17 00:24:47.078748 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:47.078719 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73g54nm_fe0bb47e-951d-4aff-8f6d-836c69c46190/pull/0.log"
Apr 17 00:24:47.105805 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:47.105774 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk_6d8b191d-1c87-4735-9354-a90a59a1b45b/extract/0.log"
Apr 17 00:24:47.128603 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:47.128504 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk_6d8b191d-1c87-4735-9354-a90a59a1b45b/util/0.log"
Apr 17 00:24:47.149262 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:47.149205 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1rc4jk_6d8b191d-1c87-4735-9354-a90a59a1b45b/pull/0.log"
Apr 17 00:24:47.393290 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:47.393199 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-86bc86554b-k4f7g_e7431161-4213-405c-9e8a-2ff6baa3b080/authorino/0.log"
Apr 17 00:24:47.439967 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:47.439940 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-x699g_d5e3eebd-0062-436a-b898-d09992976a88/manager/0.log"
Apr 17 00:24:47.468581 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:47.468549 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-ndqrc_0674c5b4-04dc-4b5c-9591-ea0023d88508/kuadrant-console-plugin/0.log"
Apr 17 00:24:47.507354 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:47.507327 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-kkrrx_0bd81371-b60d-4829-a5c0-c9284e76ace1/registry-server/0.log"
Apr 17 00:24:47.581449 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:47.581410 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-tdrgq_7fedda75-15e7-427e-9232-12f046b9e361/manager/0.log"
Apr 17 00:24:47.600724 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:47.600692 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-sdlmj_6bcd634b-868f-4414-98fe-56cd74dd6898/limitador/0.log"
Apr 17 00:24:49.139390 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:49.138807 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e/alertmanager/0.log"
Apr 17 00:24:49.171690 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:49.171659 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e/config-reloader/0.log"
Apr 17 00:24:49.193525 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:49.193492 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e/kube-rbac-proxy-web/0.log"
Apr 17 00:24:49.220766 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:49.220728 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e/kube-rbac-proxy/0.log"
Apr 17 00:24:49.254455 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:49.254407 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e/kube-rbac-proxy-metric/0.log"
Apr 17 00:24:49.278560 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:49.278354 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e/prom-label-proxy/0.log"
Apr 17 00:24:49.300133 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:49.299892 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eb2d28f9-8f9b-4833-b6b1-ff0e0e34458e/init-config-reloader/0.log"
Apr 17 00:24:49.344847 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:49.344766 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-v78q2_675389d4-0616-4e3c-8d9d-a1d6f5247035/cluster-monitoring-operator/0.log"
Apr 17 00:24:49.460254 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:49.460219 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-68ff4cfd57-zmv8h_8e5ccfea-1770-41d3-bf4c-610f04b0b7e0/metrics-server/0.log"
Apr 17 00:24:49.485747 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:49.485713 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-55j2w_1c04e984-f558-41cc-852c-fd03622e44c3/monitoring-plugin/0.log"
Apr 17 00:24:49.596460 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:49.596322 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tjxxv_22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2/node-exporter/0.log"
Apr 17 00:24:49.618717 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:49.618688 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tjxxv_22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2/kube-rbac-proxy/0.log"
Apr 17 00:24:49.638144 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:49.638118 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tjxxv_22ab6cc5-0d79-4ce3-9fee-cea0d3291ab2/init-textfile/0.log"
Apr 17 00:24:50.078000 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:50.077956 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-w4ff9_fe885835-2420-443f-9c28-6fae79714fb1/prometheus-operator-admission-webhook/0.log"
Apr 17 00:24:51.326193 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.326127 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-wwg7j_67418dd9-9c9a-4599-849e-9013809fd4d0/networking-console-plugin/0.log"
Apr 17 00:24:51.621105 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.621026 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"]
Apr 17 00:24:51.627637 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.627595 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:51.634052 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.634017 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"]
Apr 17 00:24:51.726361 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.726307 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/343a5046-bbb1-4ba4-9b56-d8b6b7313522-proc\") pod \"perf-node-gather-daemonset-fwhnq\" (UID: \"343a5046-bbb1-4ba4-9b56-d8b6b7313522\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:51.726361 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.726357 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/343a5046-bbb1-4ba4-9b56-d8b6b7313522-sys\") pod \"perf-node-gather-daemonset-fwhnq\" (UID: \"343a5046-bbb1-4ba4-9b56-d8b6b7313522\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:51.726664 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.726422 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/343a5046-bbb1-4ba4-9b56-d8b6b7313522-podres\") pod \"perf-node-gather-daemonset-fwhnq\" (UID: \"343a5046-bbb1-4ba4-9b56-d8b6b7313522\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:51.726664 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.726446 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xcd8\" (UniqueName: \"kubernetes.io/projected/343a5046-bbb1-4ba4-9b56-d8b6b7313522-kube-api-access-2xcd8\") pod \"perf-node-gather-daemonset-fwhnq\" (UID: \"343a5046-bbb1-4ba4-9b56-d8b6b7313522\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:51.726664 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.726514 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/343a5046-bbb1-4ba4-9b56-d8b6b7313522-lib-modules\") pod \"perf-node-gather-daemonset-fwhnq\" (UID: \"343a5046-bbb1-4ba4-9b56-d8b6b7313522\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:51.827930 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.827896 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/343a5046-bbb1-4ba4-9b56-d8b6b7313522-podres\") pod \"perf-node-gather-daemonset-fwhnq\" (UID: \"343a5046-bbb1-4ba4-9b56-d8b6b7313522\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:51.828117 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.827938 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xcd8\" (UniqueName: \"kubernetes.io/projected/343a5046-bbb1-4ba4-9b56-d8b6b7313522-kube-api-access-2xcd8\") pod \"perf-node-gather-daemonset-fwhnq\" (UID: \"343a5046-bbb1-4ba4-9b56-d8b6b7313522\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:51.828117 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.828001 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/343a5046-bbb1-4ba4-9b56-d8b6b7313522-lib-modules\") pod \"perf-node-gather-daemonset-fwhnq\" (UID: \"343a5046-bbb1-4ba4-9b56-d8b6b7313522\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:51.828117 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.828058 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/343a5046-bbb1-4ba4-9b56-d8b6b7313522-proc\") pod \"perf-node-gather-daemonset-fwhnq\" (UID: \"343a5046-bbb1-4ba4-9b56-d8b6b7313522\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:51.828117 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.828082 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/343a5046-bbb1-4ba4-9b56-d8b6b7313522-sys\") pod \"perf-node-gather-daemonset-fwhnq\" (UID: \"343a5046-bbb1-4ba4-9b56-d8b6b7313522\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:51.828117 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.828104 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/343a5046-bbb1-4ba4-9b56-d8b6b7313522-podres\") pod \"perf-node-gather-daemonset-fwhnq\" (UID: \"343a5046-bbb1-4ba4-9b56-d8b6b7313522\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:51.828347 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.828196 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/343a5046-bbb1-4ba4-9b56-d8b6b7313522-sys\") pod \"perf-node-gather-daemonset-fwhnq\" (UID: \"343a5046-bbb1-4ba4-9b56-d8b6b7313522\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:51.828347 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.828214 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/343a5046-bbb1-4ba4-9b56-d8b6b7313522-proc\") pod \"perf-node-gather-daemonset-fwhnq\" (UID: \"343a5046-bbb1-4ba4-9b56-d8b6b7313522\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:51.828347 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.828223 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/343a5046-bbb1-4ba4-9b56-d8b6b7313522-lib-modules\") pod \"perf-node-gather-daemonset-fwhnq\" (UID: \"343a5046-bbb1-4ba4-9b56-d8b6b7313522\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:51.837735 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.837705 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xcd8\" (UniqueName: \"kubernetes.io/projected/343a5046-bbb1-4ba4-9b56-d8b6b7313522-kube-api-access-2xcd8\") pod \"perf-node-gather-daemonset-fwhnq\" (UID: \"343a5046-bbb1-4ba4-9b56-d8b6b7313522\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:51.943270 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:51.943189 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:52.099587 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:52.099560 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"]
Apr 17 00:24:52.101215 ip-10-0-134-103 kubenswrapper[2578]: W0417 00:24:52.101187 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod343a5046_bbb1_4ba4_9b56_d8b6b7313522.slice/crio-88fa2713389b5e0b1a30f87d79bbe59a830b68bd0943e0348b5bda20e5c68c02 WatchSource:0}: Error finding container 88fa2713389b5e0b1a30f87d79bbe59a830b68bd0943e0348b5bda20e5c68c02: Status 404 returned error can't find the container with id 88fa2713389b5e0b1a30f87d79bbe59a830b68bd0943e0348b5bda20e5c68c02
Apr 17 00:24:52.372673 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:52.372501 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-679b9d47f4-9744s_cf0b0aac-d126-45b6-b3a3-d222909d94ec/console/0.log"
Apr 17 00:24:52.404242 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:52.404207 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-88kbv_e3a85d1e-0965-4d09-95ed-fe07834583c7/download-server/0.log"
Apr 17 00:24:52.895478 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:52.895453 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-w8z9z_222e7080-b3ea-4699-8dc0-7f118d6c305f/volume-data-source-validator/0.log"
Apr 17 00:24:52.906003 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:52.905958 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq" event={"ID":"343a5046-bbb1-4ba4-9b56-d8b6b7313522","Type":"ContainerStarted","Data":"974afbd88fbbfb7e595b63d72a04b1ea565bc5f8fb3c88f2b635eeff59c2ee33"}
Apr 17 00:24:52.906148 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:52.906014 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq" event={"ID":"343a5046-bbb1-4ba4-9b56-d8b6b7313522","Type":"ContainerStarted","Data":"88fa2713389b5e0b1a30f87d79bbe59a830b68bd0943e0348b5bda20e5c68c02"}
Apr 17 00:24:52.906148 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:52.906051 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:52.922916 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:52.922861 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq" podStartSLOduration=1.922842203 podStartE2EDuration="1.922842203s" podCreationTimestamp="2026-04-17 00:24:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 00:24:52.920335505 +0000 UTC m=+2073.695360393" watchObservedRunningTime="2026-04-17 00:24:52.922842203 +0000 UTC m=+2073.697867090"
Apr 17 00:24:53.724774 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:53.724737 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xcndm_2e003096-f002-43cb-9237-3811ca14f285/dns/0.log"
Apr 17 00:24:53.742903 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:53.742868 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xcndm_2e003096-f002-43cb-9237-3811ca14f285/kube-rbac-proxy/0.log"
Apr 17 00:24:53.807731 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:53.807704 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-xjgf6_b3dc2fa1-d372-4847-9980-2930ef815461/dns-node-resolver/0.log"
Apr 17 00:24:54.277711 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:54.277673 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7748d6467c-lnj85_24b53389-f90a-49e4-bddc-da64abb7be4d/registry/0.log"
Apr 17 00:24:54.318055 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:54.318025 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gxjsh_4ad19b4d-6e37-4bb2-adb6-743cb3d95223/node-ca/0.log"
Apr 17 00:24:55.133816 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:55.133788 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf92pnl_fef97fb8-ba0f-4dbc-acf3-42c5bcdf9565/istio-proxy/0.log"
Apr 17 00:24:55.326113 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:55.326081 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-m5tdn_5c243f98-0358-4d85-b9af-5d7dac8da24b/istio-proxy/0.log"
Apr 17 00:24:55.346784 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:55.346759 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-69ddbcffcb-4q64j_df3cf5af-f3bd-4f12-a5b0-d3f03341ef5c/router/0.log"
Apr 17 00:24:55.907758 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:55.907715 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-w4vbz_48793279-1866-40db-8e3c-e2c46e4d6f6d/serve-healthcheck-canary/0.log"
Apr 17 00:24:56.470414 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:56.470383 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c8z7v_86a1f3db-caf2-421b-8605-dd1617f1cc05/kube-rbac-proxy/0.log"
Apr 17 00:24:56.487097 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:56.487072 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c8z7v_86a1f3db-caf2-421b-8605-dd1617f1cc05/exporter/0.log"
Apr 17 00:24:56.505332 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:56.505305 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c8z7v_86a1f3db-caf2-421b-8605-dd1617f1cc05/extractor/0.log"
Apr 17 00:24:58.564384 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:58.564350 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-bf54d8685-s6vgg_e9fe26de-678e-4de8-902d-2994a78acd38/manager/0.log"
Apr 17 00:24:58.922665 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:58.922574 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-fwhnq"
Apr 17 00:24:59.707643 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:24:59.707613 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6b6988ccb7-wtdpc_27662c6d-0432-45b9-82a9-1c1c2dbd3fa5/manager/0.log"
Apr 17 00:25:05.608687 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:05.608654 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2frtz_3cda9bf7-a2b5-4873-acdd-b7f1f28e5295/kube-multus/0.log"
Apr 17 00:25:05.634328 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:05.634305 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5sk6r_6204e842-fd30-4eb6-be92-04b4429887c1/kube-multus-additional-cni-plugins/0.log"
Apr 17 00:25:05.651741 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:05.651711 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5sk6r_6204e842-fd30-4eb6-be92-04b4429887c1/egress-router-binary-copy/0.log"
Apr 17 00:25:05.669617 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:05.669593 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5sk6r_6204e842-fd30-4eb6-be92-04b4429887c1/cni-plugins/0.log"
Apr 17 00:25:05.687137 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:05.687111 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5sk6r_6204e842-fd30-4eb6-be92-04b4429887c1/bond-cni-plugin/0.log"
Apr 17 00:25:05.709422 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:05.709388 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5sk6r_6204e842-fd30-4eb6-be92-04b4429887c1/routeoverride-cni/0.log"
Apr 17 00:25:05.726941 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:05.726856 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5sk6r_6204e842-fd30-4eb6-be92-04b4429887c1/whereabouts-cni-bincopy/0.log"
Apr 17 00:25:05.750596 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:05.750568 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5sk6r_6204e842-fd30-4eb6-be92-04b4429887c1/whereabouts-cni/0.log"
Apr 17 00:25:06.128488 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:06.128452 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4sczf_bac2109e-d2f6-42aa-94c6-73a79a2012f0/network-metrics-daemon/0.log"
Apr 17 00:25:06.143797 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:06.143772 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4sczf_bac2109e-d2f6-42aa-94c6-73a79a2012f0/kube-rbac-proxy/0.log"
Apr 17 00:25:07.515456 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:07.515429 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-controller/0.log"
Apr 17 00:25:07.529999 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:07.529966 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/0.log"
Apr 17 00:25:07.544490 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:07.544461 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovn-acl-logging/1.log"
Apr 17 00:25:07.562416 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:07.562391 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/kube-rbac-proxy-node/0.log"
Apr 17 00:25:07.579814 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:07.579793 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 00:25:07.595301 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:07.595281 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/northd/0.log"
Apr 17 00:25:07.611950 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:07.611933 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/nbdb/0.log"
Apr 17 00:25:07.629887 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:07.629867 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/sbdb/0.log"
Apr 17 00:25:07.743818 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:07.743789 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r4p9f_66b93172-3f69-4425-8a38-ff386fb3d1dc/ovnkube-controller/0.log"
Apr 17 00:25:08.951244 ip-10-0-134-103 kubenswrapper[2578]: I0417 00:25:08.951203 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-hz2rt_fb08396b-4c9e-4eca-b73b-579e76eaebc7/check-endpoints/0.log"