Apr 20 07:00:26.397208 ip-10-0-142-100 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 07:00:26.397239 ip-10-0-142-100 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 07:00:26.397249 ip-10-0-142-100 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 07:00:26.397577 ip-10-0-142-100 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 07:00:36.539686 ip-10-0-142-100 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 07:00:36.539703 ip-10-0-142-100 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 66083a159c77494aa7be2300aa04290f --
Apr 20 07:02:37.220514 ip-10-0-142-100 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 07:02:37.726472 ip-10-0-142-100 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 07:02:37.726472 ip-10-0-142-100 kubenswrapper[2566]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 07:02:37.726472 ip-10-0-142-100 kubenswrapper[2566]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 07:02:37.726472 ip-10-0-142-100 kubenswrapper[2566]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 07:02:37.726472 ip-10-0-142-100 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 07:02:37.728352 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.728265 2566 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 07:02:37.730744 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730726 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:02:37.730780 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730745 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:02:37.730780 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730748 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:02:37.730780 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730752 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:02:37.730780 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730754 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 07:02:37.730780 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730757 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:02:37.730780 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730760 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:02:37.730780 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730763 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:02:37.730780 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730765 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
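Editor's note: the deprecation warnings above all point at the same remedy, which is moving these settings out of the kubelet command line and into the file passed via --config (this node's FLAG dump later in the log shows --config="/etc/kubernetes/kubelet.conf"). A minimal sketch of the equivalent drop-in config follows; the field names are real KubeletConfiguration fields, but the values shown are illustrative placeholders, not values taken from this node:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# Replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# Replaces --volume-plugin-dir (path is a placeholder)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# Replaces --system-reserved (amounts are placeholders)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: 100Mi
```

Note that --pod-infra-container-image has no config-file equivalent; as the log itself says, the sandbox image is now taken from the CRI runtime's own configuration.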
Apr 20 07:02:37.730780 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730768 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:02:37.730780 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730771 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:02:37.730780 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730773 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 07:02:37.730780 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730776 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:02:37.730780 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730778 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 07:02:37.730780 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730782 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 07:02:37.730780 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730785 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:02:37.730780 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730788 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730791 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730803 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730806 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730808 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730811 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730813 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730816 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730818 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730821 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730824 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730826 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730830 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730832 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730834 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730837 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730839 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730842 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730844 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730846 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 07:02:37.731212 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730849 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730851 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730853 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730856 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730858 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730861 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730864 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730867 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730870 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730873 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730877 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730881 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730883 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730886 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730889 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730891 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730894 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730898 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730901 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 07:02:37.731698 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730904 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730906 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730909 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730912 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730915 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730918 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730921 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730923 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730926 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730929 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730932 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730935 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730938 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730940 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730943 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730946 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730948 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730951 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730954 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 07:02:37.732207 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730956 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730959 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730961 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730963 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730966 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730969 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730971 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730973 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730976 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730978 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730981 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.730983 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731425 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731432 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731435 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731438 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731441 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731443 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731446 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731448 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 07:02:37.732663 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731451 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731454 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731456 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731459 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731463 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731466 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731468 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731471 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731473 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731475 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731478 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731481 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731484 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731487 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731489 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731492 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731495 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731497 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731500 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731502 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:02:37.733170 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731505 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731507 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731510 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731513 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731515 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731518 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731520 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731522 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731525 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731527 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731530 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731533 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731535 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731537 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731540 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731542 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731546 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731550 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731553 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731555 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 07:02:37.733687 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731558 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731561 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731564 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731567 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731569 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731572 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731574 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731577 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731580 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731582 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731586 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731589 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731592 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731595 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731598 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731601 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731603 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731606 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731608 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:02:37.734185 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731611 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731613 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731616 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731619 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731621 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731623 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731626 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731628 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731630 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731633 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731636 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731639 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731641 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731644 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731646 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731649 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731652 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731654 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.731657 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731735 2566 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 07:02:37.734646 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731743 2566 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731750 2566 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731754 2566 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731759 2566 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731762 2566 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731767 2566 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731771 2566 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731774 2566 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731777 2566 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731781 2566 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731784 2566 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731787 2566 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731790 2566 flags.go:64] FLAG: --cgroup-root=""
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731793 2566 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731796 2566 flags.go:64] FLAG: --client-ca-file=""
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731800 2566 flags.go:64] FLAG: --cloud-config=""
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731803 2566 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731806 2566 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731810 2566 flags.go:64] FLAG: --cluster-domain=""
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731813 2566 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731816 2566 flags.go:64] FLAG: --config-dir=""
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731818 2566 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731821 2566 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731825 2566 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 07:02:37.735150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731829 2566 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731833 2566 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731836 2566 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731839 2566 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731842 2566 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731845 2566 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731848 2566 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731851 2566 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731855 2566 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731858 2566 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731861 2566 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731864 2566 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731867 2566 flags.go:64] FLAG: --enable-server="true"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731870 2566 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731875 2566 flags.go:64] FLAG: --event-burst="100"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731878 2566 flags.go:64] FLAG: --event-qps="50"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731881 2566 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731884 2566 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731887 2566 flags.go:64] FLAG: --eviction-hard=""
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731891 2566 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731893 2566 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731896 2566 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731899 2566 flags.go:64] FLAG: --eviction-soft=""
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731903 2566 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731905 2566 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 07:02:37.735811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731908 2566 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 07:02:37.736431 ip-10-0-142-100
kubenswrapper[2566]: I0420 07:02:37.731911 2566 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731914 2566 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731917 2566 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731919 2566 flags.go:64] FLAG: --feature-gates="" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731923 2566 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731926 2566 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731929 2566 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731933 2566 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731936 2566 flags.go:64] FLAG: --healthz-port="10248" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731939 2566 flags.go:64] FLAG: --help="false" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731942 2566 flags.go:64] FLAG: --hostname-override="ip-10-0-142-100.ec2.internal" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731945 2566 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731948 2566 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731951 2566 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 
07:02:37.731955 2566 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731958 2566 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731961 2566 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731964 2566 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731967 2566 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731972 2566 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731975 2566 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731978 2566 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731981 2566 flags.go:64] FLAG: --kube-reserved="" Apr 20 07:02:37.736431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731984 2566 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731986 2566 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731989 2566 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731992 2566 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731995 2566 flags.go:64] FLAG: --lock-file="" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.731997 
2566 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732000 2566 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732003 2566 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732008 2566 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732011 2566 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732014 2566 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732017 2566 flags.go:64] FLAG: --logging-format="text" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732020 2566 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732023 2566 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732026 2566 flags.go:64] FLAG: --manifest-url="" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732028 2566 flags.go:64] FLAG: --manifest-url-header="" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732033 2566 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732036 2566 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732041 2566 flags.go:64] FLAG: --max-pods="110" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732044 2566 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 07:02:37.737022 
ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732047 2566 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732050 2566 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732053 2566 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732071 2566 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732074 2566 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 07:02:37.737022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732077 2566 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732085 2566 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732088 2566 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732092 2566 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732095 2566 flags.go:64] FLAG: --pod-cidr="" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732098 2566 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732103 2566 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732106 2566 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732109 2566 flags.go:64] 
FLAG: --pods-per-core="0" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732112 2566 flags.go:64] FLAG: --port="10250" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732115 2566 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732118 2566 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0cd3eb80153577ef6" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732121 2566 flags.go:64] FLAG: --qos-reserved="" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732124 2566 flags.go:64] FLAG: --read-only-port="10255" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732127 2566 flags.go:64] FLAG: --register-node="true" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732130 2566 flags.go:64] FLAG: --register-schedulable="true" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732133 2566 flags.go:64] FLAG: --register-with-taints="" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732136 2566 flags.go:64] FLAG: --registry-burst="10" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732139 2566 flags.go:64] FLAG: --registry-qps="5" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732141 2566 flags.go:64] FLAG: --reserved-cpus="" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732144 2566 flags.go:64] FLAG: --reserved-memory="" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732148 2566 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732151 2566 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732154 2566 flags.go:64] FLAG: 
--rotate-certificates="false" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732156 2566 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 07:02:37.737638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732162 2566 flags.go:64] FLAG: --runonce="false" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732165 2566 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732167 2566 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732170 2566 flags.go:64] FLAG: --seccomp-default="false" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732173 2566 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732176 2566 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732179 2566 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732182 2566 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732185 2566 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732188 2566 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732192 2566 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732194 2566 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732197 2566 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 
07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732200 2566 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732203 2566 flags.go:64] FLAG: --system-cgroups="" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732206 2566 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732213 2566 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732216 2566 flags.go:64] FLAG: --tls-cert-file="" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732219 2566 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732223 2566 flags.go:64] FLAG: --tls-min-version="" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732225 2566 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732228 2566 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732231 2566 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732234 2566 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732237 2566 flags.go:64] FLAG: --v="2" Apr 20 07:02:37.738240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732241 2566 flags.go:64] FLAG: --version="false" Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732245 2566 flags.go:64] FLAG: --vmodule="" Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732249 2566 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.732253 2566 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732345 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732349 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732352 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732356 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732364 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732366 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732369 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732371 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732374 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732376 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732379 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 07:02:37.738838 
ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732382 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732384 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732388 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732390 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732393 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 07:02:37.738838 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732396 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732398 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732401 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732403 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732406 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732408 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732411 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732414 2566 feature_gate.go:328] unrecognized feature gate: 
VSphereMultiDisk Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732416 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732419 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732421 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732423 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732426 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732428 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732431 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732433 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732436 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732438 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732441 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732444 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 07:02:37.739386 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732447 2566 feature_gate.go:328] 
unrecognized feature gate: MachineConfigNodes Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732450 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732452 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732455 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732458 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732460 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732462 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732465 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732467 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732471 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732474 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732477 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732479 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: 
W0420 07:02:37.732481 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732484 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732486 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732489 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732491 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732494 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732496 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 07:02:37.739898 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732499 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732501 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732504 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732508 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732512 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732516 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732519 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732522 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732524 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732527 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732530 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732533 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732536 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732538 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732541 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732543 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732546 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732549 
2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:02:37.740405 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732551 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 07:02:37.740847 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732554 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:02:37.740847 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732556 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:02:37.740847 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732560 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:02:37.740847 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732562 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:02:37.740847 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732564 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:02:37.740847 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732567 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:02:37.740847 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732570 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:02:37.740847 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732572 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:02:37.740847 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732574 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:02:37.740847 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732577 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:02:37.740847 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.732579 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:02:37.740847 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.733621 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 07:02:37.740847 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.740386 2566 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 07:02:37.740847 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.740403 2566 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 07:02:37.740847 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740453 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 07:02:37.740847 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740459 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740462 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740465 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740468 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740470 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740473 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740476 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740479 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740482 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740484 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740487 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740490 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740493 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740495 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740499 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740502 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740504 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740507 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740509 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740512 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 07:02:37.741279 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740515 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740517 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740520 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740523 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740527 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740530 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740533 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740535 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740538 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740540 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740544 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740547 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740550 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740553 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740555 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740557 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740560 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740563 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740566 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740568 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:02:37.741776 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740571 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740573 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740576 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740578 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740581 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740583 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740585 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740588 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740590 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740593 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740595 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740597 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740600 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740603 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740606 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740608 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740611 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740615 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740619 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740621 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:02:37.742338 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740624 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740628 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740631 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740641 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740644 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740647 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740649 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740652 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740655 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740657 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740660 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740663 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740665 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740667 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740670 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740672 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740675 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740677 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740679 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740682 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:02:37.742855 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740684 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:02:37.743342 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740687 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 07:02:37.743342 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740689 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 07:02:37.743342 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740692 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:02:37.743342 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740694 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:02:37.743342 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.740699 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 07:02:37.743342 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740822 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 07:02:37.743342 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740828 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:02:37.743342 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740831 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:02:37.743342 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740834 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:02:37.743342 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740837 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:02:37.743342 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740840 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:02:37.743342 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740843 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 07:02:37.743342 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740845 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 07:02:37.743342 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740848 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740850 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740870 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740874 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740876 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740879 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740881 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740884 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740886 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740889 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740891 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740894 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740900 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740903 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740905 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740908 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740910 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740912 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740915 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:02:37.743684 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740917 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740920 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740922 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740924 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740926 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740930 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740932 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740934 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740937 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740939 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740942 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740945 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740947 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740950 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740952 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740955 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740965 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740967 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740970 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740972 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 07:02:37.744169 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740974 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740977 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740979 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740982 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740984 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740986 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740989 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740992 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740994 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740997 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.740999 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741002 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741004 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741007 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741009 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741011 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741014 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741017 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741020 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741024 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 07:02:37.744650 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741026 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741029 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741031 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741034 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741036 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741039 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741041 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741043 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741045 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741053 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741071 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741074 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741076 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741079 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741081 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741084 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741086 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741089 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 07:02:37.745161 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:37.741092 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 07:02:37.745598 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.741096 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 07:02:37.745598 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.741894 2566 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 07:02:37.746869 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.746854 2566 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 07:02:37.747946 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.747935 2566 server.go:1019] "Starting client certificate rotation"
Apr 20 07:02:37.748049 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.748032 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 07:02:37.748688 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.748677 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 07:02:37.783562 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.783523 2566 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 07:02:37.787115 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.787089 2566 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 07:02:37.803254 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.803224 2566 log.go:25] "Validated CRI v1 runtime API"
Apr 20 07:02:37.810210 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.810190 2566 log.go:25] "Validated CRI v1 image API"
Apr 20 07:02:37.811533 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.811516 2566 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 07:02:37.820815 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.820789 2566 fs.go:135] Filesystem UUIDs: map[41d8ca29-fd1b-46e3-87ab-81b2b396ab5d:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 d807292e-47a9-423b-b1d1-d71f92c255f3:/dev/nvme0n1p4]
Apr 20 07:02:37.820888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.820815 2566 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 07:02:37.821034 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.821018 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 07:02:37.826975 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.826867 2566 manager.go:217] Machine: {Timestamp:2026-04-20 07:02:37.824623383 +0000 UTC m=+0.467932525 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102356 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec26a3ea5cbb65e9fa0abae976bf37e3 SystemUUID:ec26a3ea-5cbb-65e9-fa0a-bae976bf37e3 BootID:66083a15-9c77-494a-a7be-2300aa04290f Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7c:61:c8:7b:05 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7c:61:c8:7b:05 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a6:29:f0:67:dc:b3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 07:02:37.826975 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.826971 2566 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 07:02:37.827087 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.827051 2566 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 07:02:37.828290 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.828263 2566 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 07:02:37.828426 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.828292 2566 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-100.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 07:02:37.828473 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.828436 2566 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 07:02:37.828473 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.828445 2566 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 07:02:37.828473 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.828457 2566 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 07:02:37.829367 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.829357 2566 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 07:02:37.830603 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.830593 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 20 07:02:37.830708 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.830700 2566 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 07:02:37.833493 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.833482 2566 kubelet.go:491] "Attempting to sync node with API server" Apr 20 07:02:37.833541 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.833496 2566 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 07:02:37.833541 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.833512 2566 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 07:02:37.833541 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.833521 2566 kubelet.go:397] "Adding apiserver pod source" Apr 20 07:02:37.833541 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.833531 2566 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 20 07:02:37.834761 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.834747 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 07:02:37.834802 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.834775 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 07:02:37.839913 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.839887 2566 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 07:02:37.843686 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.843668 2566 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 07:02:37.845721 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.845693 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 07:02:37.845807 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.845725 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 07:02:37.845807 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.845737 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 07:02:37.845807 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.845744 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 07:02:37.845807 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.845751 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 07:02:37.845807 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.845757 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 07:02:37.845807 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.845763 2566 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 20 07:02:37.845807 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.845768 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 07:02:37.845807 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.845776 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 07:02:37.845807 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.845782 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 07:02:37.845807 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.845790 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 07:02:37.845807 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.845798 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 07:02:37.846821 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.846810 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 07:02:37.846821 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.846822 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 07:02:37.847581 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:37.847549 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-100.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 07:02:37.847645 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:37.847621 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 07:02:37.850768 ip-10-0-142-100 kubenswrapper[2566]: 
I0420 07:02:37.850754 2566 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 07:02:37.850842 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.850794 2566 server.go:1295] "Started kubelet" Apr 20 07:02:37.850896 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.850871 2566 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 07:02:37.851004 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.850967 2566 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 07:02:37.851033 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.851027 2566 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 07:02:37.851631 ip-10-0-142-100 systemd[1]: Started Kubernetes Kubelet. Apr 20 07:02:37.852576 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.852450 2566 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 07:02:37.853858 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.853841 2566 server.go:317] "Adding debug handlers to kubelet server" Apr 20 07:02:37.860648 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.860627 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 07:02:37.860832 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.860815 2566 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-100.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 07:02:37.861221 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.861206 2566 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 07:02:37.861806 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:37.860656 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-100.ec2.internal.18a7fea3e24cf210 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-100.ec2.internal,UID:ip-10-0-142-100.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-100.ec2.internal,},FirstTimestamp:2026-04-20 07:02:37.850767888 +0000 UTC m=+0.494077029,LastTimestamp:2026-04-20 07:02:37.850767888 +0000 UTC m=+0.494077029,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-100.ec2.internal,}" Apr 20 07:02:37.862092 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.862047 2566 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 07:02:37.862092 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.862086 2566 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 07:02:37.862202 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.862187 2566 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 07:02:37.862272 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:37.862248 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-100.ec2.internal\" not found" Apr 20 07:02:37.862367 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.862264 2566 factory.go:153] Registering CRI-O factory Apr 20 07:02:37.862419 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.862380 2566 factory.go:223] Registration of the crio container factory successfully Apr 20 07:02:37.862472 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.862443 2566 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial 
containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 07:02:37.862472 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.862453 2566 factory.go:55] Registering systemd factory Apr 20 07:02:37.862472 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.862461 2566 factory.go:223] Registration of the systemd container factory successfully Apr 20 07:02:37.862599 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.862482 2566 factory.go:103] Registering Raw factory Apr 20 07:02:37.862599 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.862497 2566 manager.go:1196] Started watching for new ooms in manager Apr 20 07:02:37.862599 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.862302 2566 reconstruct.go:97] "Volume reconstruction finished" Apr 20 07:02:37.862599 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.862529 2566 reconciler.go:26] "Reconciler: start to sync state" Apr 20 07:02:37.862917 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.862905 2566 manager.go:319] Starting recovery of all containers Apr 20 07:02:37.865481 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:37.865459 2566 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 07:02:37.867846 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:37.867666 2566 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-100.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 20 07:02:37.867930 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:37.867892 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 07:02:37.871302 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.871281 2566 manager.go:324] Recovery completed Apr 20 07:02:37.878570 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.878555 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:02:37.881214 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.881198 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:02:37.881274 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.881229 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:02:37.881274 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.881238 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:02:37.881759 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.881742 2566 cpu_manager.go:222] "Starting CPU manager" 
policy="none" Apr 20 07:02:37.881759 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.881757 2566 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 07:02:37.881859 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.881771 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 20 07:02:37.883581 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:37.883519 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-100.ec2.internal.18a7fea3e41d8c6f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-100.ec2.internal,UID:ip-10-0-142-100.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-142-100.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-142-100.ec2.internal,},FirstTimestamp:2026-04-20 07:02:37.881216111 +0000 UTC m=+0.524525252,LastTimestamp:2026-04-20 07:02:37.881216111 +0000 UTC m=+0.524525252,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-100.ec2.internal,}" Apr 20 07:02:37.885727 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.885713 2566 policy_none.go:49] "None policy: Start" Apr 20 07:02:37.885788 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.885730 2566 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 07:02:37.885788 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.885740 2566 state_mem.go:35] "Initializing new in-memory state store" Apr 20 07:02:37.894876 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:37.894815 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-100.ec2.internal.18a7fea3e41dcfba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-100.ec2.internal,UID:ip-10-0-142-100.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-142-100.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-142-100.ec2.internal,},FirstTimestamp:2026-04-20 07:02:37.881233338 +0000 UTC m=+0.524542479,LastTimestamp:2026-04-20 07:02:37.881233338 +0000 UTC m=+0.524542479,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-100.ec2.internal,}" Apr 20 07:02:37.899791 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.899769 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mlkqz" Apr 20 07:02:37.907958 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.907939 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mlkqz" Apr 20 07:02:37.912150 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:37.912041 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-100.ec2.internal.18a7fea3e41df241 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-100.ec2.internal,UID:ip-10-0-142-100.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-142-100.ec2.internal status is now: 
NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-142-100.ec2.internal,},FirstTimestamp:2026-04-20 07:02:37.881242177 +0000 UTC m=+0.524551318,LastTimestamp:2026-04-20 07:02:37.881242177 +0000 UTC m=+0.524551318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-100.ec2.internal,}" Apr 20 07:02:37.945520 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.931649 2566 manager.go:341] "Starting Device Plugin manager" Apr 20 07:02:37.945520 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:37.931679 2566 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 07:02:37.945520 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.931689 2566 server.go:85] "Starting device plugin registration server" Apr 20 07:02:37.945520 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.931933 2566 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 07:02:37.945520 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.931943 2566 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 07:02:37.945520 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.932037 2566 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 07:02:37.945520 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.932130 2566 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 07:02:37.945520 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.932139 2566 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 07:02:37.945520 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:37.932759 2566 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 20 07:02:37.945520 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:37.932801 2566 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-100.ec2.internal\" not found" Apr 20 07:02:37.945520 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.933107 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 07:02:37.945520 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.934662 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 07:02:37.945520 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.934694 2566 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 07:02:37.945520 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.934716 2566 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 20 07:02:37.945520 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.934726 2566 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 07:02:37.945520 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:37.934766 2566 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 07:02:37.946095 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:37.946081 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:02:38.032605 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.032581 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:02:38.033752 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.033735 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:02:38.033840 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.033769 2566 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:02:38.033840 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.033781 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:02:38.033840 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.033810 2566 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.034830 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.034806 2566 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-100.ec2.internal"] Apr 20 07:02:38.034915 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.034871 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:02:38.035829 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.035808 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:02:38.035913 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.035836 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:02:38.035913 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.035850 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:02:38.038037 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.038025 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:02:38.038214 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.038200 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.038251 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.038232 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:02:38.039397 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.039377 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:02:38.039491 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.039409 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:02:38.039491 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.039422 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:02:38.039491 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.039423 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:02:38.039491 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.039440 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:02:38.039491 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.039451 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:02:38.041540 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.041524 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.041633 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.041548 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:02:38.042211 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.042195 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:02:38.042277 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.042225 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:02:38.042277 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.042239 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:02:38.044219 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.044204 2566 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.044290 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:38.044221 2566 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-100.ec2.internal\": node \"ip-10-0-142-100.ec2.internal\" not found" Apr 20 07:02:38.055616 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:38.055589 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-100.ec2.internal\" not found" node="ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.058968 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:38.058951 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-100.ec2.internal\" not found" node="ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.061753 ip-10-0-142-100 kubenswrapper[2566]: E0420 
07:02:38.061739 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-100.ec2.internal\" not found" Apr 20 07:02:38.063587 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.063572 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/06a23782138e9e64504f2d8b6d92effd-config\") pod \"kube-apiserver-proxy-ip-10-0-142-100.ec2.internal\" (UID: \"06a23782138e9e64504f2d8b6d92effd\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.063632 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.063596 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f16f90fea78edadec0b74bae56286c52-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal\" (UID: \"f16f90fea78edadec0b74bae56286c52\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.063632 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.063613 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f16f90fea78edadec0b74bae56286c52-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal\" (UID: \"f16f90fea78edadec0b74bae56286c52\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.161847 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:38.161819 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-100.ec2.internal\" not found" Apr 20 07:02:38.164099 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.164085 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/f16f90fea78edadec0b74bae56286c52-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal\" (UID: \"f16f90fea78edadec0b74bae56286c52\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.164151 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.164109 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f16f90fea78edadec0b74bae56286c52-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal\" (UID: \"f16f90fea78edadec0b74bae56286c52\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.164151 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.164129 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/06a23782138e9e64504f2d8b6d92effd-config\") pod \"kube-apiserver-proxy-ip-10-0-142-100.ec2.internal\" (UID: \"06a23782138e9e64504f2d8b6d92effd\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.164211 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.164165 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/06a23782138e9e64504f2d8b6d92effd-config\") pod \"kube-apiserver-proxy-ip-10-0-142-100.ec2.internal\" (UID: \"06a23782138e9e64504f2d8b6d92effd\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.164211 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.164195 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f16f90fea78edadec0b74bae56286c52-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal\" (UID: \"f16f90fea78edadec0b74bae56286c52\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.164273 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.164198 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f16f90fea78edadec0b74bae56286c52-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal\" (UID: \"f16f90fea78edadec0b74bae56286c52\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.262469 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:38.262434 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-100.ec2.internal\" not found" Apr 20 07:02:38.358430 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.358368 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.362332 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.362308 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.363454 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:38.363439 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-100.ec2.internal\" not found" Apr 20 07:02:38.463929 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:38.463894 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-100.ec2.internal\" not found" Apr 20 07:02:38.564370 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:38.564344 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-100.ec2.internal\" not found" Apr 20 07:02:38.664974 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:38.664892 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-100.ec2.internal\" not found" Apr 20 07:02:38.747210 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.747172 2566 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 07:02:38.747827 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.747333 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 07:02:38.765484 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:38.765461 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-100.ec2.internal\" not found" Apr 20 07:02:38.836136 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.836105 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:02:38.861805 ip-10-0-142-100 kubenswrapper[2566]: I0420 
07:02:38.861768 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 07:02:38.861968 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.861785 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.878054 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.878023 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 07:02:38.881293 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.881273 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 07:02:38.883299 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.883280 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-100.ec2.internal" Apr 20 07:02:38.890888 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:38.890857 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf16f90fea78edadec0b74bae56286c52.slice/crio-efec29b6dca844460376c93ffd9c2dec05429a97adbaec79d4534a7b3ab36730 WatchSource:0}: Error finding container efec29b6dca844460376c93ffd9c2dec05429a97adbaec79d4534a7b3ab36730: Status 404 returned error can't find the container with id efec29b6dca844460376c93ffd9c2dec05429a97adbaec79d4534a7b3ab36730 Apr 20 07:02:38.891145 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:38.891125 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06a23782138e9e64504f2d8b6d92effd.slice/crio-e5216c29c45a24b1418849b188965ea98d16f2dbe10c3f20411b0b0a1611d954 WatchSource:0}: Error finding container 
e5216c29c45a24b1418849b188965ea98d16f2dbe10c3f20411b0b0a1611d954: Status 404 returned error can't find the container with id e5216c29c45a24b1418849b188965ea98d16f2dbe10c3f20411b0b0a1611d954 Apr 20 07:02:38.898379 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.898361 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 07:02:38.900508 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.900492 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 07:02:38.910413 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.910386 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8bvmp" Apr 20 07:02:38.911440 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.911413 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 06:57:37 +0000 UTC" deadline="2027-10-09 09:18:11.758291306 +0000 UTC" Apr 20 07:02:38.911488 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.911441 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12890h15m32.846853134s" Apr 20 07:02:38.923189 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.923118 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8bvmp" Apr 20 07:02:38.937973 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:38.937916 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-100.ec2.internal" event={"ID":"06a23782138e9e64504f2d8b6d92effd","Type":"ContainerStarted","Data":"e5216c29c45a24b1418849b188965ea98d16f2dbe10c3f20411b0b0a1611d954"} Apr 20 07:02:38.938842 ip-10-0-142-100 kubenswrapper[2566]: 
I0420 07:02:38.938821 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal" event={"ID":"f16f90fea78edadec0b74bae56286c52","Type":"ContainerStarted","Data":"efec29b6dca844460376c93ffd9c2dec05429a97adbaec79d4534a7b3ab36730"} Apr 20 07:02:39.034094 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.034044 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:02:39.392207 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.392009 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:02:39.834735 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.834698 2566 apiserver.go:52] "Watching apiserver" Apr 20 07:02:39.846970 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.846939 2566 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 07:02:39.848051 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.848024 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-677wv","kube-system/kube-apiserver-proxy-ip-10-0-142-100.ec2.internal","openshift-cluster-node-tuning-operator/tuned-kvspn","openshift-image-registry/node-ca-pqdp9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal","openshift-multus/multus-additional-cni-plugins-vtmbx","openshift-multus/network-metrics-daemon-h8v9g","openshift-network-diagnostics/network-check-target-7cnrw","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4","openshift-multus/multus-c66hd","openshift-network-operator/iptables-alerter-txpkt","openshift-ovn-kubernetes/ovnkube-node-vz4hl"] Apr 20 07:02:39.853520 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.853491 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-677wv" Apr 20 07:02:39.853660 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.853578 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.855945 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.855921 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pqdp9" Apr 20 07:02:39.857848 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.857822 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 07:02:39.857994 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.857972 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 07:02:39.858086 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.858003 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 07:02:39.858243 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.858225 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vtmbx" Apr 20 07:02:39.859530 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.859120 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:02:39.859530 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.859162 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8kc5w\"" Apr 20 07:02:39.859530 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.859253 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vkbg7\"" Apr 20 07:02:39.860386 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.860364 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:02:39.860495 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:39.860447 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d" Apr 20 07:02:39.861290 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.861271 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 07:02:39.862352 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.862330 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 07:02:39.862431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.862382 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 07:02:39.862625 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.862599 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 07:02:39.862696 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.862634 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fwvms\"" Apr 20 07:02:39.862749 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.862639 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 07:02:39.863096 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.863078 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:02:39.863194 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:39.863141 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b" Apr 20 07:02:39.863254 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.863239 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 07:02:39.863733 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.863691 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 07:02:39.863831 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.863770 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 07:02:39.863831 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.863814 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-8qzzl\"" Apr 20 07:02:39.865375 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.865355 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4" Apr 20 07:02:39.867598 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.867577 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-c66hd" Apr 20 07:02:39.868247 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.868228 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 07:02:39.868451 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.868431 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 07:02:39.868890 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.868869 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 07:02:39.868975 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.868942 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wj9m6\"" Apr 20 07:02:39.870150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.870111 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-txpkt" Apr 20 07:02:39.870244 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.870207 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 07:02:39.870450 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.870427 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xqsng\"" Apr 20 07:02:39.872324 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872306 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:02:39.872414 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872338 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-kubernetes\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.872414 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872374 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-lib-modules\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.872414 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872404 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-var-lib-kubelet\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.872574 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872438 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ade86a8b-7468-4238-b3ff-728e885f0d78-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx" Apr 20 07:02:39.872574 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872482 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs\") pod \"network-metrics-daemon-h8v9g\" (UID: \"fc07195a-cdd1-494d-b741-97e9b77b3f6d\") " pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:02:39.872574 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872500 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/51abff1a-6e24-4a67-8755-58b10f566180-etc-tuned\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.872574 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872519 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ade86a8b-7468-4238-b3ff-728e885f0d78-cni-binary-copy\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx" Apr 20 07:02:39.872574 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872560 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjdl7\" (UniqueName: \"kubernetes.io/projected/ade86a8b-7468-4238-b3ff-728e885f0d78-kube-api-access-jjdl7\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx" Apr 20 07:02:39.872744 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872595 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-modprobe-d\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.872744 
ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872623 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-systemd\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.872744 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872656 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ade86a8b-7468-4238-b3ff-728e885f0d78-os-release\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx" Apr 20 07:02:39.872744 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872685 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4151df80-e101-4d3f-a89d-f0d2c0217a27-host\") pod \"node-ca-pqdp9\" (UID: \"4151df80-e101-4d3f-a89d-f0d2c0217a27\") " pod="openshift-image-registry/node-ca-pqdp9" Apr 20 07:02:39.872744 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872711 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ade86a8b-7468-4238-b3ff-728e885f0d78-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx" Apr 20 07:02:39.872744 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872737 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8dzb\" (UniqueName: 
\"kubernetes.io/projected/fc07195a-cdd1-494d-b741-97e9b77b3f6d-kube-api-access-l8dzb\") pod \"network-metrics-daemon-h8v9g\" (UID: \"fc07195a-cdd1-494d-b741-97e9b77b3f6d\") " pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:02:39.872918 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872760 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ade86a8b-7468-4238-b3ff-728e885f0d78-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx" Apr 20 07:02:39.872918 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872779 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a3d30794-2dde-421b-92b3-bb5c7855130c-agent-certs\") pod \"konnectivity-agent-677wv\" (UID: \"a3d30794-2dde-421b-92b3-bb5c7855130c\") " pod="kube-system/konnectivity-agent-677wv" Apr 20 07:02:39.872918 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872828 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a3d30794-2dde-421b-92b3-bb5c7855130c-konnectivity-ca\") pod \"konnectivity-agent-677wv\" (UID: \"a3d30794-2dde-421b-92b3-bb5c7855130c\") " pod="kube-system/konnectivity-agent-677wv" Apr 20 07:02:39.872918 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872851 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-sysctl-conf\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.872918 ip-10-0-142-100 
kubenswrapper[2566]: I0420 07:02:39.872872 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-sys\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.873079 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.872939 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn8rv\" (UniqueName: \"kubernetes.io/projected/4151df80-e101-4d3f-a89d-f0d2c0217a27-kube-api-access-rn8rv\") pod \"node-ca-pqdp9\" (UID: \"4151df80-e101-4d3f-a89d-f0d2c0217a27\") " pod="openshift-image-registry/node-ca-pqdp9" Apr 20 07:02:39.873079 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.873015 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-sysconfig\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.873079 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.873040 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-run\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.873201 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.873086 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49q6\" (UniqueName: \"kubernetes.io/projected/51abff1a-6e24-4a67-8755-58b10f566180-kube-api-access-b49q6\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " 
pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.873201 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.873109 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ade86a8b-7468-4238-b3ff-728e885f0d78-system-cni-dir\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx" Apr 20 07:02:39.873201 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.873136 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ade86a8b-7468-4238-b3ff-728e885f0d78-cnibin\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx" Apr 20 07:02:39.873201 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.873181 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-sysctl-d\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.873375 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.873234 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-host\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.873375 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.873278 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/51abff1a-6e24-4a67-8755-58b10f566180-tmp\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.873375 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.873311 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4151df80-e101-4d3f-a89d-f0d2c0217a27-serviceca\") pod \"node-ca-pqdp9\" (UID: \"4151df80-e101-4d3f-a89d-f0d2c0217a27\") " pod="openshift-image-registry/node-ca-pqdp9"
Apr 20 07:02:39.873375 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.873336 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9646\" (UniqueName: \"kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646\") pod \"network-check-target-7cnrw\" (UID: \"d3c0457e-6ada-43f2-92ce-2786a4e2c17b\") " pod="openshift-network-diagnostics/network-check-target-7cnrw"
Apr 20 07:02:39.875237 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.874868 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-q9tnl\""
Apr 20 07:02:39.875237 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.875090 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 07:02:39.875237 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.875149 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 07:02:39.876118 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.876096 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 07:02:39.876209 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.876149 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 07:02:39.876279 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.876261 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 07:02:39.876382 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.876365 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 07:02:39.876565 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.876552 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 07:02:39.877632 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.877609 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 07:02:39.878107 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.878090 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4x97x\""
Apr 20 07:02:39.878193 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.878104 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 07:02:39.924149 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.924114 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 06:57:38 +0000 UTC" deadline="2027-12-01 00:44:27.450721972 +0000 UTC"
Apr 20 07:02:39.924149 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.924144 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14153h41m47.526581084s"
Apr 20 07:02:39.963510 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.963473 2566 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 07:02:39.973653 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973609 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-var-lib-cni-bin\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:39.973653 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973643 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-kubelet\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.973886 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973677 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-sysconfig\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.973886 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973705 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-run\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.973886 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973731 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ade86a8b-7468-4238-b3ff-728e885f0d78-cnibin\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx"
Apr 20 07:02:39.973886 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973755 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-multus-conf-dir\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:39.973886 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973770 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-systemd-units\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.973886 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973787 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-cni-bin\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.973886 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973786 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-sysconfig\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.973886 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973810 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3c81fb54-953c-4feb-8550-bcf26cec6a9e-env-overrides\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.973886 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973837 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51abff1a-6e24-4a67-8755-58b10f566180-tmp\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.973886 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973854 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4151df80-e101-4d3f-a89d-f0d2c0217a27-serviceca\") pod \"node-ca-pqdp9\" (UID: \"4151df80-e101-4d3f-a89d-f0d2c0217a27\") " pod="openshift-image-registry/node-ca-pqdp9"
Apr 20 07:02:39.973886 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973871 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9646\" (UniqueName: \"kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646\") pod \"network-check-target-7cnrw\" (UID: \"d3c0457e-6ada-43f2-92ce-2786a4e2c17b\") " pod="openshift-network-diagnostics/network-check-target-7cnrw"
Apr 20 07:02:39.973886 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973888 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3c81fb54-953c-4feb-8550-bcf26cec6a9e-ovnkube-script-lib\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.974334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973908 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-var-lib-kubelet\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.974334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973910 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ade86a8b-7468-4238-b3ff-728e885f0d78-cnibin\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx"
Apr 20 07:02:39.974334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973930 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ade86a8b-7468-4238-b3ff-728e885f0d78-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx"
Apr 20 07:02:39.974334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973948 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-device-dir\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4"
Apr 20 07:02:39.974334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973963 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-cnibin\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:39.974334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973977 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b253a09a-eb71-4d38-961f-3acd58a8ed07-cni-binary-copy\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:39.974334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973996 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-var-lib-kubelet\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:39.974334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974021 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b253a09a-eb71-4d38-961f-3acd58a8ed07-multus-daemon-config\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:39.974334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974104 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-var-lib-openvswitch\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.974334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974124 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ade86a8b-7468-4238-b3ff-728e885f0d78-cni-binary-copy\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx"
Apr 20 07:02:39.974334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974139 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjdl7\" (UniqueName: \"kubernetes.io/projected/ade86a8b-7468-4238-b3ff-728e885f0d78-kube-api-access-jjdl7\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx"
Apr 20 07:02:39.974334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974165 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea-iptables-alerter-script\") pod \"iptables-alerter-txpkt\" (UID: \"ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea\") " pod="openshift-network-operator/iptables-alerter-txpkt"
Apr 20 07:02:39.974334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974188 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-node-log\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.974334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974212 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3c81fb54-953c-4feb-8550-bcf26cec6a9e-ovn-node-metrics-cert\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.974334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974236 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvl7b\" (UniqueName: \"kubernetes.io/projected/3c81fb54-953c-4feb-8550-bcf26cec6a9e-kube-api-access-qvl7b\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.974334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974264 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ade86a8b-7468-4238-b3ff-728e885f0d78-os-release\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx"
Apr 20 07:02:39.974888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974281 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-multus-cni-dir\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:39.974888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974296 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-run-systemd\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.974888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974310 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-var-lib-cni-multus\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:39.974888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974325 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-etc-kubernetes\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:39.974888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974346 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-slash\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.974888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974364 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-run-openvswitch\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.974888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974382 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ade86a8b-7468-4238-b3ff-728e885f0d78-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx"
Apr 20 07:02:39.974888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974406 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-sys\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.974888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974415 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4151df80-e101-4d3f-a89d-f0d2c0217a27-serviceca\") pod \"node-ca-pqdp9\" (UID: \"4151df80-e101-4d3f-a89d-f0d2c0217a27\") " pod="openshift-image-registry/node-ca-pqdp9"
Apr 20 07:02:39.974888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974433 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4"
Apr 20 07:02:39.974888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974460 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-run-multus-certs\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:39.974888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974499 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-run-netns\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.974888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974537 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b49q6\" (UniqueName: \"kubernetes.io/projected/51abff1a-6e24-4a67-8755-58b10f566180-kube-api-access-b49q6\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.974888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974566 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ade86a8b-7468-4238-b3ff-728e885f0d78-system-cni-dir\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx"
Apr 20 07:02:39.974888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974598 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-socket-dir\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4"
Apr 20 07:02:39.974888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974631 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6tg4\" (UniqueName: \"kubernetes.io/projected/ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea-kube-api-access-n6tg4\") pod \"iptables-alerter-txpkt\" (UID: \"ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea\") " pod="openshift-network-operator/iptables-alerter-txpkt"
Apr 20 07:02:39.974888 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974669 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-run-ovn-kubernetes\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974699 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-sysctl-d\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.973870 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-run\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974724 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-host\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974758 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-system-cni-dir\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974781 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-run-netns\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974810 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h6gx\" (UniqueName: \"kubernetes.io/projected/b253a09a-eb71-4d38-961f-3acd58a8ed07-kube-api-access-4h6gx\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974841 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-kubernetes\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974864 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-lib-modules\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974883 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs\") pod \"network-metrics-daemon-h8v9g\" (UID: \"fc07195a-cdd1-494d-b741-97e9b77b3f6d\") " pod="openshift-multus/network-metrics-daemon-h8v9g"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974899 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-os-release\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974916 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-hostroot\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974931 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/51abff1a-6e24-4a67-8755-58b10f566180-etc-tuned\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974948 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-log-socket\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974963 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3c81fb54-953c-4feb-8550-bcf26cec6a9e-ovnkube-config\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.974979 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-modprobe-d\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975003 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-systemd\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975034 2566 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 07:02:39.975555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975081 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-registration-dir\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4"
Apr 20 07:02:39.976591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975087 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-var-lib-kubelet\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.976591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975111 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-sys-fs\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4"
Apr 20 07:02:39.976591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975139 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea-host-slash\") pod \"iptables-alerter-txpkt\" (UID: \"ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea\") " pod="openshift-network-operator/iptables-alerter-txpkt"
Apr 20 07:02:39.976591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975178 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-run-ovn\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.976591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975216 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.976591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975247 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4151df80-e101-4d3f-a89d-f0d2c0217a27-host\") pod \"node-ca-pqdp9\" (UID: \"4151df80-e101-4d3f-a89d-f0d2c0217a27\") " pod="openshift-image-registry/node-ca-pqdp9"
Apr 20 07:02:39.976591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975276 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ade86a8b-7468-4238-b3ff-728e885f0d78-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx"
Apr 20 07:02:39.976591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975305 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8dzb\" (UniqueName: \"kubernetes.io/projected/fc07195a-cdd1-494d-b741-97e9b77b3f6d-kube-api-access-l8dzb\") pod \"network-metrics-daemon-h8v9g\" (UID: \"fc07195a-cdd1-494d-b741-97e9b77b3f6d\") " pod="openshift-multus/network-metrics-daemon-h8v9g"
Apr 20 07:02:39.976591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975334 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlh8k\" (UniqueName: \"kubernetes.io/projected/f6657021-5dbd-4acf-9163-5e795f9d3f17-kube-api-access-tlh8k\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4"
Apr 20 07:02:39.976591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975245 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ade86a8b-7468-4238-b3ff-728e885f0d78-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx"
Apr 20 07:02:39.976591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975491 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4151df80-e101-4d3f-a89d-f0d2c0217a27-host\") pod \"node-ca-pqdp9\" (UID: \"4151df80-e101-4d3f-a89d-f0d2c0217a27\") " pod="openshift-image-registry/node-ca-pqdp9"
Apr 20 07:02:39.976591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975513 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-kubernetes\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.976591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975628 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-modprobe-d\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.976591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975757 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-host\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.976591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975853 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ade86a8b-7468-4238-b3ff-728e885f0d78-os-release\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx"
Apr 20 07:02:39.976591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975856 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ade86a8b-7468-4238-b3ff-728e885f0d78-system-cni-dir\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx"
Apr 20 07:02:39.976591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975922 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-systemd\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.977573 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.975983 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-lib-modules\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.977573 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.976090 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-sys\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.977573 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.976351 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ade86a8b-7468-4238-b3ff-728e885f0d78-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx"
Apr 20 07:02:39.977573 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.976385 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ade86a8b-7468-4238-b3ff-728e885f0d78-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx"
Apr 20 07:02:39.977573 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.976367 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-etc-openvswitch\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.977573 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.976439 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-cni-netd\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:39.977573 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:39.976465 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:02:39.977573 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.976491 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-sysctl-d\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:39.977573 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.976468 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-etc-selinux\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4"
Apr 20 07:02:39.977573 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.976547 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-multus-socket-dir-parent\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:39.977573 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.976464 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ade86a8b-7468-4238-b3ff-728e885f0d78-cni-binary-copy\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: \"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx" Apr 20 07:02:39.977573 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:39.976589 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs podName:fc07195a-cdd1-494d-b741-97e9b77b3f6d nodeName:}" failed. No retries permitted until 2026-04-20 07:02:40.476559084 +0000 UTC m=+3.119868235 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs") pod "network-metrics-daemon-h8v9g" (UID: "fc07195a-cdd1-494d-b741-97e9b77b3f6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:02:39.977573 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.976635 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-run-k8s-cni-cncf-io\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:39.977573 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.976683 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a3d30794-2dde-421b-92b3-bb5c7855130c-agent-certs\") pod \"konnectivity-agent-677wv\" (UID: \"a3d30794-2dde-421b-92b3-bb5c7855130c\") " pod="kube-system/konnectivity-agent-677wv" Apr 20 07:02:39.977573 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.976708 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a3d30794-2dde-421b-92b3-bb5c7855130c-konnectivity-ca\") pod \"konnectivity-agent-677wv\" (UID: \"a3d30794-2dde-421b-92b3-bb5c7855130c\") " pod="kube-system/konnectivity-agent-677wv" Apr 20 07:02:39.977573 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.976734 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-sysctl-conf\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.977573 ip-10-0-142-100 kubenswrapper[2566]: I0420 
07:02:39.976772 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rn8rv\" (UniqueName: \"kubernetes.io/projected/4151df80-e101-4d3f-a89d-f0d2c0217a27-kube-api-access-rn8rv\") pod \"node-ca-pqdp9\" (UID: \"4151df80-e101-4d3f-a89d-f0d2c0217a27\") " pod="openshift-image-registry/node-ca-pqdp9" Apr 20 07:02:39.978392 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.977163 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/51abff1a-6e24-4a67-8755-58b10f566180-etc-sysctl-conf\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.978392 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.977660 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a3d30794-2dde-421b-92b3-bb5c7855130c-konnectivity-ca\") pod \"konnectivity-agent-677wv\" (UID: \"a3d30794-2dde-421b-92b3-bb5c7855130c\") " pod="kube-system/konnectivity-agent-677wv" Apr 20 07:02:39.978705 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.978684 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/51abff1a-6e24-4a67-8755-58b10f566180-etc-tuned\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.978813 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.978736 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51abff1a-6e24-4a67-8755-58b10f566180-tmp\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.979215 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.979193 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a3d30794-2dde-421b-92b3-bb5c7855130c-agent-certs\") pod \"konnectivity-agent-677wv\" (UID: \"a3d30794-2dde-421b-92b3-bb5c7855130c\") " pod="kube-system/konnectivity-agent-677wv" Apr 20 07:02:39.983097 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:39.983047 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:02:39.983097 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:39.983091 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:02:39.983236 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:39.983107 2566 projected.go:194] Error preparing data for projected volume kube-api-access-s9646 for pod openshift-network-diagnostics/network-check-target-7cnrw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:02:39.983236 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:39.983191 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646 podName:d3c0457e-6ada-43f2-92ce-2786a4e2c17b nodeName:}" failed. No retries permitted until 2026-04-20 07:02:40.483164537 +0000 UTC m=+3.126473678 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s9646" (UniqueName: "kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646") pod "network-check-target-7cnrw" (UID: "d3c0457e-6ada-43f2-92ce-2786a4e2c17b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:02:39.993257 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.993221 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49q6\" (UniqueName: \"kubernetes.io/projected/51abff1a-6e24-4a67-8755-58b10f566180-kube-api-access-b49q6\") pod \"tuned-kvspn\" (UID: \"51abff1a-6e24-4a67-8755-58b10f566180\") " pod="openshift-cluster-node-tuning-operator/tuned-kvspn" Apr 20 07:02:39.994605 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.994576 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8dzb\" (UniqueName: \"kubernetes.io/projected/fc07195a-cdd1-494d-b741-97e9b77b3f6d-kube-api-access-l8dzb\") pod \"network-metrics-daemon-h8v9g\" (UID: \"fc07195a-cdd1-494d-b741-97e9b77b3f6d\") " pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:02:39.995295 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.995265 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn8rv\" (UniqueName: \"kubernetes.io/projected/4151df80-e101-4d3f-a89d-f0d2c0217a27-kube-api-access-rn8rv\") pod \"node-ca-pqdp9\" (UID: \"4151df80-e101-4d3f-a89d-f0d2c0217a27\") " pod="openshift-image-registry/node-ca-pqdp9" Apr 20 07:02:39.996446 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:39.996424 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjdl7\" (UniqueName: \"kubernetes.io/projected/ade86a8b-7468-4238-b3ff-728e885f0d78-kube-api-access-jjdl7\") pod \"multus-additional-cni-plugins-vtmbx\" (UID: 
\"ade86a8b-7468-4238-b3ff-728e885f0d78\") " pod="openshift-multus/multus-additional-cni-plugins-vtmbx" Apr 20 07:02:40.077415 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077383 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-multus-cni-dir\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:40.077415 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077421 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-run-systemd\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:02:40.077651 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077444 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-var-lib-cni-multus\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:40.077651 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077466 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-etc-kubernetes\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:40.077651 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077485 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-slash\") pod \"ovnkube-node-vz4hl\" (UID: 
\"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:02:40.077651 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077506 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-run-openvswitch\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:02:40.077651 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077529 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-var-lib-cni-multus\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:40.077651 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077539 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-etc-kubernetes\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:40.077651 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077534 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4" Apr 20 07:02:40.077651 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077529 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-multus-cni-dir\") pod \"multus-c66hd\" (UID: 
\"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:40.077651 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077565 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-slash\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:02:40.077651 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077581 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4" Apr 20 07:02:40.077651 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077547 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-run-systemd\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:02:40.077651 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077589 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-run-openvswitch\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:02:40.077651 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077594 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-run-multus-certs\") pod \"multus-c66hd\" (UID: 
\"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:40.077651 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077632 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-run-multus-certs\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:40.077651 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077634 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-run-netns\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077675 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-run-netns\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077678 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-socket-dir\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077738 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6tg4\" (UniqueName: \"kubernetes.io/projected/ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea-kube-api-access-n6tg4\") pod 
\"iptables-alerter-txpkt\" (UID: \"ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea\") " pod="openshift-network-operator/iptables-alerter-txpkt" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077767 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-run-ovn-kubernetes\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077794 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-system-cni-dir\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077796 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-socket-dir\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077820 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-run-netns\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077839 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-run-ovn-kubernetes\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077844 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4h6gx\" (UniqueName: \"kubernetes.io/projected/b253a09a-eb71-4d38-961f-3acd58a8ed07-kube-api-access-4h6gx\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077853 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-system-cni-dir\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077875 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-run-netns\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077901 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-os-release\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077926 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-hostroot\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077950 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-log-socket\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077975 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3c81fb54-953c-4feb-8550-bcf26cec6a9e-ovnkube-config\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.077997 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-hostroot\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.078006 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-registration-dir\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4" Apr 20 07:02:40.078223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.078036 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-sys-fs\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4" Apr 20 07:02:40.078949 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.078044 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-os-release\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd" Apr 20 07:02:40.078949 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.078079 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea-host-slash\") pod \"iptables-alerter-txpkt\" (UID: \"ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea\") " pod="openshift-network-operator/iptables-alerter-txpkt" Apr 20 07:02:40.078949 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.078090 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-registration-dir\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4" Apr 20 07:02:40.078949 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.078106 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-run-ovn\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:02:40.078949 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.078116 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-sys-fs\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4" Apr 20 07:02:40.078949 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.078636 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3c81fb54-953c-4feb-8550-bcf26cec6a9e-ovnkube-config\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:02:40.078949 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.078760 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea-host-slash\") pod \"iptables-alerter-txpkt\" (UID: \"ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea\") " pod="openshift-network-operator/iptables-alerter-txpkt" Apr 20 07:02:40.079322 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.078695 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-log-socket\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:02:40.079406 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.079386 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-run-ovn\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:02:40.079477 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.079434 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.079514 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.079479 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlh8k\" (UniqueName: \"kubernetes.io/projected/f6657021-5dbd-4acf-9163-5e795f9d3f17-kube-api-access-tlh8k\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4"
Apr 20 07:02:40.079545 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.079512 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-etc-openvswitch\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.079575 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.079544 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-cni-netd\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.079646 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.079621 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-etc-selinux\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4"
Apr 20 07:02:40.079772 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.079721 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.080023 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080004 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-etc-openvswitch\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.080120 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080087 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-cni-netd\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.080326 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080297 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-etc-selinux\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4"
Apr 20 07:02:40.080400 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080361 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-multus-socket-dir-parent\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.080400 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080393 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-run-k8s-cni-cncf-io\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.080504 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080422 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-var-lib-cni-bin\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.080504 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080439 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-kubelet\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.080504 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080461 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-multus-conf-dir\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.080655 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080514 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-systemd-units\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.080655 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080536 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-cni-bin\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.080655 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080564 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3c81fb54-953c-4feb-8550-bcf26cec6a9e-env-overrides\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.080655 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080598 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3c81fb54-953c-4feb-8550-bcf26cec6a9e-ovnkube-script-lib\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.080655 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080624 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-device-dir\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4"
Apr 20 07:02:40.080655 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080649 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-cnibin\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.080924 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080669 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b253a09a-eb71-4d38-961f-3acd58a8ed07-cni-binary-copy\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.080924 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080688 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-var-lib-kubelet\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.080924 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080713 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b253a09a-eb71-4d38-961f-3acd58a8ed07-multus-daemon-config\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.080924 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080733 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-var-lib-openvswitch\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.080924 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080756 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea-iptables-alerter-script\") pod \"iptables-alerter-txpkt\" (UID: \"ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea\") " pod="openshift-network-operator/iptables-alerter-txpkt"
Apr 20 07:02:40.080924 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080791 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-node-log\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.080924 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080810 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3c81fb54-953c-4feb-8550-bcf26cec6a9e-ovn-node-metrics-cert\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.080924 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.080831 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvl7b\" (UniqueName: \"kubernetes.io/projected/3c81fb54-953c-4feb-8550-bcf26cec6a9e-kube-api-access-qvl7b\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.081164 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.081120 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-multus-socket-dir-parent\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.081197 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.081171 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-run-k8s-cni-cncf-io\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.081230 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.081200 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-var-lib-cni-bin\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.081262 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.081231 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-kubelet\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.081296 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.081265 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-multus-conf-dir\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.081296 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.081290 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-systemd-units\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.081361 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.081326 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-host-cni-bin\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.081752 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.081729 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3c81fb54-953c-4feb-8550-bcf26cec6a9e-env-overrides\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.082237 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.082213 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3c81fb54-953c-4feb-8550-bcf26cec6a9e-ovnkube-script-lib\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.082320 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.082290 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-var-lib-openvswitch\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.082635 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.082610 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b253a09a-eb71-4d38-961f-3acd58a8ed07-multus-daemon-config\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.082727 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.082693 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f6657021-5dbd-4acf-9163-5e795f9d3f17-device-dir\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4"
Apr 20 07:02:40.082782 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.082729 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea-iptables-alerter-script\") pod \"iptables-alerter-txpkt\" (UID: \"ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea\") " pod="openshift-network-operator/iptables-alerter-txpkt"
Apr 20 07:02:40.082782 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.082752 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-cnibin\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.082880 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.082799 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3c81fb54-953c-4feb-8550-bcf26cec6a9e-node-log\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.082880 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.082825 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b253a09a-eb71-4d38-961f-3acd58a8ed07-host-var-lib-kubelet\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.083309 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.083289 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b253a09a-eb71-4d38-961f-3acd58a8ed07-cni-binary-copy\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.088972 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.088909 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3c81fb54-953c-4feb-8550-bcf26cec6a9e-ovn-node-metrics-cert\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.091769 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.091744 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6tg4\" (UniqueName: \"kubernetes.io/projected/ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea-kube-api-access-n6tg4\") pod \"iptables-alerter-txpkt\" (UID: \"ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea\") " pod="openshift-network-operator/iptables-alerter-txpkt"
Apr 20 07:02:40.091769 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.091764 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h6gx\" (UniqueName: \"kubernetes.io/projected/b253a09a-eb71-4d38-961f-3acd58a8ed07-kube-api-access-4h6gx\") pod \"multus-c66hd\" (UID: \"b253a09a-eb71-4d38-961f-3acd58a8ed07\") " pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.092075 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.092036 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlh8k\" (UniqueName: \"kubernetes.io/projected/f6657021-5dbd-4acf-9163-5e795f9d3f17-kube-api-access-tlh8k\") pod \"aws-ebs-csi-driver-node-xbjp4\" (UID: \"f6657021-5dbd-4acf-9163-5e795f9d3f17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4"
Apr 20 07:02:40.092438 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.092409 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvl7b\" (UniqueName: \"kubernetes.io/projected/3c81fb54-953c-4feb-8550-bcf26cec6a9e-kube-api-access-qvl7b\") pod \"ovnkube-node-vz4hl\" (UID: \"3c81fb54-953c-4feb-8550-bcf26cec6a9e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.165294 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.165256 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-677wv"
Apr 20 07:02:40.173164 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.173129 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kvspn"
Apr 20 07:02:40.180903 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.180876 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pqdp9"
Apr 20 07:02:40.184642 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.184614 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vtmbx"
Apr 20 07:02:40.193506 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.193477 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4"
Apr 20 07:02:40.201183 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.201155 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-c66hd"
Apr 20 07:02:40.207960 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.207927 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-txpkt"
Apr 20 07:02:40.213801 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.213777 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:02:40.314025 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.313994 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 07:02:40.483365 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.483288 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs\") pod \"network-metrics-daemon-h8v9g\" (UID: \"fc07195a-cdd1-494d-b741-97e9b77b3f6d\") " pod="openshift-multus/network-metrics-daemon-h8v9g"
Apr 20 07:02:40.483365 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.483346 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9646\" (UniqueName: \"kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646\") pod \"network-check-target-7cnrw\" (UID: \"d3c0457e-6ada-43f2-92ce-2786a4e2c17b\") " pod="openshift-network-diagnostics/network-check-target-7cnrw"
Apr 20 07:02:40.483568 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:40.483443 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:02:40.483568 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:40.483477 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 07:02:40.483568 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:40.483493 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 07:02:40.483568 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:40.483505 2566 projected.go:194] Error preparing data for projected volume kube-api-access-s9646 for pod openshift-network-diagnostics/network-check-target-7cnrw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:02:40.483568 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:40.483515 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs podName:fc07195a-cdd1-494d-b741-97e9b77b3f6d nodeName:}" failed. No retries permitted until 2026-04-20 07:02:41.483498761 +0000 UTC m=+4.126807894 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs") pod "network-metrics-daemon-h8v9g" (UID: "fc07195a-cdd1-494d-b741-97e9b77b3f6d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:02:40.483568 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:40.483557 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646 podName:d3c0457e-6ada-43f2-92ce-2786a4e2c17b nodeName:}" failed. No retries permitted until 2026-04-20 07:02:41.483540468 +0000 UTC m=+4.126849610 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s9646" (UniqueName: "kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646") pod "network-check-target-7cnrw" (UID: "d3c0457e-6ada-43f2-92ce-2786a4e2c17b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:02:40.683206 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:40.683170 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6657021_5dbd_4acf_9163_5e795f9d3f17.slice/crio-f581042a543cf6a8892d68c034e3f60f27519415218071802024e3226493efd6 WatchSource:0}: Error finding container f581042a543cf6a8892d68c034e3f60f27519415218071802024e3226493efd6: Status 404 returned error can't find the container with id f581042a543cf6a8892d68c034e3f60f27519415218071802024e3226493efd6
Apr 20 07:02:40.684156 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:40.684123 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3d30794_2dde_421b_92b3_bb5c7855130c.slice/crio-429bf08e9ac19d07e1499de1a9293a9f087acec91f31d5b5fdd83d794e33ba13 WatchSource:0}: Error finding container 429bf08e9ac19d07e1499de1a9293a9f087acec91f31d5b5fdd83d794e33ba13: Status 404 returned error can't find the container with id 429bf08e9ac19d07e1499de1a9293a9f087acec91f31d5b5fdd83d794e33ba13
Apr 20 07:02:40.685228 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:40.685204 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podade86a8b_7468_4238_b3ff_728e885f0d78.slice/crio-a38621d46e428a05e051add00e5f27b27627adae874d292a362647bd8a333575 WatchSource:0}: Error finding container a38621d46e428a05e051add00e5f27b27627adae874d292a362647bd8a333575: Status 404 returned error can't find the container with id a38621d46e428a05e051add00e5f27b27627adae874d292a362647bd8a333575
Apr 20 07:02:40.685913 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:40.685806 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff49ceb7_70c3_47b0_81dd_3ed0c02c42ea.slice/crio-c84d16d574ed5924faf4602b6a26ae3266bd8774b9b2fe7c182cb85198897ff2 WatchSource:0}: Error finding container c84d16d574ed5924faf4602b6a26ae3266bd8774b9b2fe7c182cb85198897ff2: Status 404 returned error can't find the container with id c84d16d574ed5924faf4602b6a26ae3266bd8774b9b2fe7c182cb85198897ff2
Apr 20 07:02:40.689018 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:40.688996 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb253a09a_eb71_4d38_961f_3acd58a8ed07.slice/crio-b632c788a3486b10ea486c61288e76dc8008613f0fbb8ce5aaa2f83d54e1c832 WatchSource:0}: Error finding container b632c788a3486b10ea486c61288e76dc8008613f0fbb8ce5aaa2f83d54e1c832: Status 404 returned error can't find the container with id b632c788a3486b10ea486c61288e76dc8008613f0fbb8ce5aaa2f83d54e1c832
Apr 20 07:02:40.710458 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:40.710421 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c81fb54_953c_4feb_8550_bcf26cec6a9e.slice/crio-041a7d7e64d8b65a784d0680a486a7aa82d3ac5dc8cd8cf879dab2a83284a379 WatchSource:0}: Error finding container 041a7d7e64d8b65a784d0680a486a7aa82d3ac5dc8cd8cf879dab2a83284a379: Status 404 returned error can't find the container with id 041a7d7e64d8b65a784d0680a486a7aa82d3ac5dc8cd8cf879dab2a83284a379
Apr 20 07:02:40.712827 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:40.712801 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4151df80_e101_4d3f_a89d_f0d2c0217a27.slice/crio-652f063489e452ccf288f51b424b022caa4300e095f7383e9f80405e012f8b11 WatchSource:0}: Error finding container 652f063489e452ccf288f51b424b022caa4300e095f7383e9f80405e012f8b11: Status 404 returned error can't find the container with id 652f063489e452ccf288f51b424b022caa4300e095f7383e9f80405e012f8b11
Apr 20 07:02:40.713947 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:02:40.713923 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51abff1a_6e24_4a67_8755_58b10f566180.slice/crio-189511c3b6d48bbc5c63deb39c02f870fff69767bd2f317f2961c51ee5eed268 WatchSource:0}: Error finding container 189511c3b6d48bbc5c63deb39c02f870fff69767bd2f317f2961c51ee5eed268: Status 404 returned error can't find the container with id 189511c3b6d48bbc5c63deb39c02f870fff69767bd2f317f2961c51ee5eed268
Apr 20 07:02:40.925038 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.924835 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 06:57:38 +0000 UTC" deadline="2028-01-19 15:02:20.663163616 +0000 UTC"
Apr 20 07:02:40.925038 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.925036 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15343h59m39.738132925s"
Apr 20 07:02:40.942871 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.942836 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-100.ec2.internal" event={"ID":"06a23782138e9e64504f2d8b6d92effd","Type":"ContainerStarted","Data":"d6cbe6366864f355a5aed968e29b31136f3787a17232151ed70aacf32d46677d"}
Apr 20 07:02:40.943838 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.943812 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pqdp9" event={"ID":"4151df80-e101-4d3f-a89d-f0d2c0217a27","Type":"ContainerStarted","Data":"652f063489e452ccf288f51b424b022caa4300e095f7383e9f80405e012f8b11"}
Apr 20 07:02:40.944910 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.944882 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c66hd" event={"ID":"b253a09a-eb71-4d38-961f-3acd58a8ed07","Type":"ContainerStarted","Data":"b632c788a3486b10ea486c61288e76dc8008613f0fbb8ce5aaa2f83d54e1c832"}
Apr 20 07:02:40.945883 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.945856 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-txpkt" event={"ID":"ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea","Type":"ContainerStarted","Data":"c84d16d574ed5924faf4602b6a26ae3266bd8774b9b2fe7c182cb85198897ff2"}
Apr 20 07:02:40.946856 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.946830 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-677wv" event={"ID":"a3d30794-2dde-421b-92b3-bb5c7855130c","Type":"ContainerStarted","Data":"429bf08e9ac19d07e1499de1a9293a9f087acec91f31d5b5fdd83d794e33ba13"}
Apr 20 07:02:40.947787 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.947762 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kvspn" event={"ID":"51abff1a-6e24-4a67-8755-58b10f566180","Type":"ContainerStarted","Data":"189511c3b6d48bbc5c63deb39c02f870fff69767bd2f317f2961c51ee5eed268"}
Apr 20 07:02:40.949264 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.949219 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" event={"ID":"3c81fb54-953c-4feb-8550-bcf26cec6a9e","Type":"ContainerStarted","Data":"041a7d7e64d8b65a784d0680a486a7aa82d3ac5dc8cd8cf879dab2a83284a379"}
Apr 20 07:02:40.950838 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.950812 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtmbx" event={"ID":"ade86a8b-7468-4238-b3ff-728e885f0d78","Type":"ContainerStarted","Data":"a38621d46e428a05e051add00e5f27b27627adae874d292a362647bd8a333575"}
Apr 20 07:02:40.951891 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.951874 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4" event={"ID":"f6657021-5dbd-4acf-9163-5e795f9d3f17","Type":"ContainerStarted","Data":"f581042a543cf6a8892d68c034e3f60f27519415218071802024e3226493efd6"}
Apr 20 07:02:40.959838 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:40.959794 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-100.ec2.internal" podStartSLOduration=2.959780333 podStartE2EDuration="2.959780333s" podCreationTimestamp="2026-04-20 07:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:02:40.95957311 +0000 UTC m=+3.602882272" watchObservedRunningTime="2026-04-20 07:02:40.959780333 +0000 UTC m=+3.603089480"
Apr 20 07:02:41.491799 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:41.491767 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9646\" (UniqueName: \"kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646\") pod \"network-check-target-7cnrw\" (UID: \"d3c0457e-6ada-43f2-92ce-2786a4e2c17b\") " pod="openshift-network-diagnostics/network-check-target-7cnrw"
Apr 20 07:02:41.491922 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:41.491833 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs\") pod \"network-metrics-daemon-h8v9g\" (UID: \"fc07195a-cdd1-494d-b741-97e9b77b3f6d\") " pod="openshift-multus/network-metrics-daemon-h8v9g"
Apr 20 07:02:41.491991 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:41.491954 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:02:41.492047 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:41.492016 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs podName:fc07195a-cdd1-494d-b741-97e9b77b3f6d nodeName:}" failed. No retries permitted until 2026-04-20 07:02:43.491998515 +0000 UTC m=+6.135307647 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs") pod "network-metrics-daemon-h8v9g" (UID: "fc07195a-cdd1-494d-b741-97e9b77b3f6d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:02:41.492461 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:41.492443 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 07:02:41.492528 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:41.492465 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 07:02:41.492528 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:41.492478 2566 projected.go:194] Error preparing data for projected volume kube-api-access-s9646 for pod openshift-network-diagnostics/network-check-target-7cnrw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:02:41.492528 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:41.492520 2566 nestedpendingoperations.go:348] Operation
for "{volumeName:kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646 podName:d3c0457e-6ada-43f2-92ce-2786a4e2c17b nodeName:}" failed. No retries permitted until 2026-04-20 07:02:43.492506309 +0000 UTC m=+6.135815442 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s9646" (UniqueName: "kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646") pod "network-check-target-7cnrw" (UID: "d3c0457e-6ada-43f2-92ce-2786a4e2c17b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:02:41.936391 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:41.936243 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:02:41.936839 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:41.936505 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d" Apr 20 07:02:41.937123 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:41.937104 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:02:41.937215 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:41.937193 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b" Apr 20 07:02:41.962972 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:41.961982 2566 generic.go:358] "Generic (PLEG): container finished" podID="f16f90fea78edadec0b74bae56286c52" containerID="e2d3a8f1b147ba8c2bbdddda2130af0c12e7046642320477d82ebb9bd00278b6" exitCode=0 Apr 20 07:02:41.962972 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:41.962775 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal" event={"ID":"f16f90fea78edadec0b74bae56286c52","Type":"ContainerDied","Data":"e2d3a8f1b147ba8c2bbdddda2130af0c12e7046642320477d82ebb9bd00278b6"} Apr 20 07:02:42.978260 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:42.978217 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal" event={"ID":"f16f90fea78edadec0b74bae56286c52","Type":"ContainerStarted","Data":"e786c6e8c4dbfb21cff4725669200730c07101364d0f785082b3ce173a935558"} Apr 20 07:02:43.512715 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:43.512682 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9646\" (UniqueName: \"kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646\") pod \"network-check-target-7cnrw\" (UID: \"d3c0457e-6ada-43f2-92ce-2786a4e2c17b\") " pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:02:43.512897 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:43.512745 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs\") pod \"network-metrics-daemon-h8v9g\" (UID: \"fc07195a-cdd1-494d-b741-97e9b77b3f6d\") " pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:02:43.512897 
ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:43.512887 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:02:43.513010 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:43.512954 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs podName:fc07195a-cdd1-494d-b741-97e9b77b3f6d nodeName:}" failed. No retries permitted until 2026-04-20 07:02:47.51292473 +0000 UTC m=+10.156233865 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs") pod "network-metrics-daemon-h8v9g" (UID: "fc07195a-cdd1-494d-b741-97e9b77b3f6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:02:43.513085 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:43.513028 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:02:43.513085 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:43.513048 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:02:43.513085 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:43.513074 2566 projected.go:194] Error preparing data for projected volume kube-api-access-s9646 for pod openshift-network-diagnostics/network-check-target-7cnrw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:02:43.513233 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:43.513115 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646 podName:d3c0457e-6ada-43f2-92ce-2786a4e2c17b nodeName:}" failed. No retries permitted until 2026-04-20 07:02:47.51310239 +0000 UTC m=+10.156411521 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s9646" (UniqueName: "kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646") pod "network-check-target-7cnrw" (UID: "d3c0457e-6ada-43f2-92ce-2786a4e2c17b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:02:43.935283 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:43.935205 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:02:43.935441 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:43.935410 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b" Apr 20 07:02:43.935927 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:43.935908 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:02:43.936041 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:43.936021 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d" Apr 20 07:02:45.936232 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:45.936192 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:02:45.936647 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:45.936199 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:02:45.936647 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:45.936330 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b" Apr 20 07:02:45.936647 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:45.936428 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d" Apr 20 07:02:45.990279 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:45.990225 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-100.ec2.internal" podStartSLOduration=7.990206244 podStartE2EDuration="7.990206244s" podCreationTimestamp="2026-04-20 07:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:02:42.993228192 +0000 UTC m=+5.636537342" watchObservedRunningTime="2026-04-20 07:02:45.990206244 +0000 UTC m=+8.633515387" Apr 20 07:02:45.991516 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:45.991490 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gxm4n"] Apr 20 07:02:45.999742 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:45.999434 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:02:45.999742 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:45.999512 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gxm4n" podUID="972f83a1-e686-4970-8f9a-46f92da2158f" Apr 20 07:02:46.032972 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:46.032944 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/972f83a1-e686-4970-8f9a-46f92da2158f-dbus\") pod \"global-pull-secret-syncer-gxm4n\" (UID: \"972f83a1-e686-4970-8f9a-46f92da2158f\") " pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:02:46.033145 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:46.032995 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/972f83a1-e686-4970-8f9a-46f92da2158f-kubelet-config\") pod \"global-pull-secret-syncer-gxm4n\" (UID: \"972f83a1-e686-4970-8f9a-46f92da2158f\") " pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:02:46.033145 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:46.033127 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret\") pod \"global-pull-secret-syncer-gxm4n\" (UID: \"972f83a1-e686-4970-8f9a-46f92da2158f\") " pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:02:46.134054 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:46.134013 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret\") pod \"global-pull-secret-syncer-gxm4n\" (UID: \"972f83a1-e686-4970-8f9a-46f92da2158f\") " pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:02:46.134230 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:46.134104 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/972f83a1-e686-4970-8f9a-46f92da2158f-dbus\") pod \"global-pull-secret-syncer-gxm4n\" (UID: \"972f83a1-e686-4970-8f9a-46f92da2158f\") " pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:02:46.134230 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:46.134143 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/972f83a1-e686-4970-8f9a-46f92da2158f-kubelet-config\") pod \"global-pull-secret-syncer-gxm4n\" (UID: \"972f83a1-e686-4970-8f9a-46f92da2158f\") " pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:02:46.134356 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:46.134241 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/972f83a1-e686-4970-8f9a-46f92da2158f-kubelet-config\") pod \"global-pull-secret-syncer-gxm4n\" (UID: \"972f83a1-e686-4970-8f9a-46f92da2158f\") " pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:02:46.134414 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:46.134357 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 07:02:46.134466 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:46.134420 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret podName:972f83a1-e686-4970-8f9a-46f92da2158f nodeName:}" failed. No retries permitted until 2026-04-20 07:02:46.634401658 +0000 UTC m=+9.277710786 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret") pod "global-pull-secret-syncer-gxm4n" (UID: "972f83a1-e686-4970-8f9a-46f92da2158f") : object "kube-system"/"original-pull-secret" not registered Apr 20 07:02:46.134759 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:46.134738 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/972f83a1-e686-4970-8f9a-46f92da2158f-dbus\") pod \"global-pull-secret-syncer-gxm4n\" (UID: \"972f83a1-e686-4970-8f9a-46f92da2158f\") " pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:02:46.638643 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:46.638610 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret\") pod \"global-pull-secret-syncer-gxm4n\" (UID: \"972f83a1-e686-4970-8f9a-46f92da2158f\") " pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:02:46.638839 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:46.638760 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 07:02:46.638839 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:46.638817 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret podName:972f83a1-e686-4970-8f9a-46f92da2158f nodeName:}" failed. No retries permitted until 2026-04-20 07:02:47.638804 +0000 UTC m=+10.282113127 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret") pod "global-pull-secret-syncer-gxm4n" (UID: "972f83a1-e686-4970-8f9a-46f92da2158f") : object "kube-system"/"original-pull-secret" not registered Apr 20 07:02:47.545843 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:47.545798 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9646\" (UniqueName: \"kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646\") pod \"network-check-target-7cnrw\" (UID: \"d3c0457e-6ada-43f2-92ce-2786a4e2c17b\") " pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:02:47.546336 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:47.545879 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs\") pod \"network-metrics-daemon-h8v9g\" (UID: \"fc07195a-cdd1-494d-b741-97e9b77b3f6d\") " pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:02:47.546336 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:47.545998 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:02:47.546336 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:47.546075 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs podName:fc07195a-cdd1-494d-b741-97e9b77b3f6d nodeName:}" failed. No retries permitted until 2026-04-20 07:02:55.546038006 +0000 UTC m=+18.189347138 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs") pod "network-metrics-daemon-h8v9g" (UID: "fc07195a-cdd1-494d-b741-97e9b77b3f6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:02:47.546582 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:47.546555 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:02:47.546582 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:47.546580 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:02:47.546713 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:47.546593 2566 projected.go:194] Error preparing data for projected volume kube-api-access-s9646 for pod openshift-network-diagnostics/network-check-target-7cnrw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:02:47.546713 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:47.546642 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646 podName:d3c0457e-6ada-43f2-92ce-2786a4e2c17b nodeName:}" failed. No retries permitted until 2026-04-20 07:02:55.546625974 +0000 UTC m=+18.189935116 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s9646" (UniqueName: "kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646") pod "network-check-target-7cnrw" (UID: "d3c0457e-6ada-43f2-92ce-2786a4e2c17b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:02:47.646662 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:47.646621 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret\") pod \"global-pull-secret-syncer-gxm4n\" (UID: \"972f83a1-e686-4970-8f9a-46f92da2158f\") " pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:02:47.646859 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:47.646806 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 07:02:47.646922 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:47.646872 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret podName:972f83a1-e686-4970-8f9a-46f92da2158f nodeName:}" failed. No retries permitted until 2026-04-20 07:02:49.646853752 +0000 UTC m=+12.290162896 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret") pod "global-pull-secret-syncer-gxm4n" (UID: "972f83a1-e686-4970-8f9a-46f92da2158f") : object "kube-system"/"original-pull-secret" not registered Apr 20 07:02:47.937264 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:47.936684 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:02:47.937264 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:47.936832 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d" Apr 20 07:02:47.937264 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:47.936852 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:02:47.937264 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:47.936948 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gxm4n" podUID="972f83a1-e686-4970-8f9a-46f92da2158f" Apr 20 07:02:47.937264 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:47.937045 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:02:47.937264 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:47.937155 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b" Apr 20 07:02:49.664199 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:49.664161 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret\") pod \"global-pull-secret-syncer-gxm4n\" (UID: \"972f83a1-e686-4970-8f9a-46f92da2158f\") " pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:02:49.664560 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:49.664322 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 07:02:49.664560 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:49.664397 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret podName:972f83a1-e686-4970-8f9a-46f92da2158f nodeName:}" failed. No retries permitted until 2026-04-20 07:02:53.664378063 +0000 UTC m=+16.307687204 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret") pod "global-pull-secret-syncer-gxm4n" (UID: "972f83a1-e686-4970-8f9a-46f92da2158f") : object "kube-system"/"original-pull-secret" not registered Apr 20 07:02:49.936048 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:49.935191 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:02:49.936048 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:49.935316 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b" Apr 20 07:02:49.936048 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:49.935737 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:02:49.936048 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:49.935838 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d" Apr 20 07:02:49.936048 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:49.935895 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:02:49.936048 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:49.935966 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gxm4n" podUID="972f83a1-e686-4970-8f9a-46f92da2158f" Apr 20 07:02:51.935796 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:51.935753 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:02:51.936254 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:51.935806 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:02:51.936254 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:51.935875 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gxm4n" podUID="972f83a1-e686-4970-8f9a-46f92da2158f" Apr 20 07:02:51.936254 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:51.935753 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:02:51.936254 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:51.936038 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d"
Apr 20 07:02:51.936254 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:51.936135 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b"
Apr 20 07:02:53.688677 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:53.688644 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret\") pod \"global-pull-secret-syncer-gxm4n\" (UID: \"972f83a1-e686-4970-8f9a-46f92da2158f\") " pod="kube-system/global-pull-secret-syncer-gxm4n"
Apr 20 07:02:53.689086 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:53.688790 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 07:02:53.689086 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:53.688844 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret podName:972f83a1-e686-4970-8f9a-46f92da2158f nodeName:}" failed. No retries permitted until 2026-04-20 07:03:01.68883113 +0000 UTC m=+24.332140263 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret") pod "global-pull-secret-syncer-gxm4n" (UID: "972f83a1-e686-4970-8f9a-46f92da2158f") : object "kube-system"/"original-pull-secret" not registered
Apr 20 07:02:53.935442 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:53.935407 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gxm4n"
Apr 20 07:02:53.935442 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:53.935438 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g"
Apr 20 07:02:53.935659 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:53.935516 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw"
Apr 20 07:02:53.935659 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:53.935527 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gxm4n" podUID="972f83a1-e686-4970-8f9a-46f92da2158f"
Apr 20 07:02:53.935659 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:53.935619 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b"
Apr 20 07:02:53.935794 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:53.935718 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d"
Apr 20 07:02:55.601113 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:55.601071 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs\") pod \"network-metrics-daemon-h8v9g\" (UID: \"fc07195a-cdd1-494d-b741-97e9b77b3f6d\") " pod="openshift-multus/network-metrics-daemon-h8v9g"
Apr 20 07:02:55.601713 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:55.601155 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9646\" (UniqueName: \"kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646\") pod \"network-check-target-7cnrw\" (UID: \"d3c0457e-6ada-43f2-92ce-2786a4e2c17b\") " pod="openshift-network-diagnostics/network-check-target-7cnrw"
Apr 20 07:02:55.601713 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:55.601181 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:02:55.601713 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:55.601262 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs podName:fc07195a-cdd1-494d-b741-97e9b77b3f6d nodeName:}" failed. No retries permitted until 2026-04-20 07:03:11.60124053 +0000 UTC m=+34.244549674 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs") pod "network-metrics-daemon-h8v9g" (UID: "fc07195a-cdd1-494d-b741-97e9b77b3f6d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:02:55.601713 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:55.601307 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 07:02:55.601713 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:55.601325 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 07:02:55.601713 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:55.601336 2566 projected.go:194] Error preparing data for projected volume kube-api-access-s9646 for pod openshift-network-diagnostics/network-check-target-7cnrw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:02:55.601713 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:55.601387 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646 podName:d3c0457e-6ada-43f2-92ce-2786a4e2c17b nodeName:}" failed. No retries permitted until 2026-04-20 07:03:11.601370494 +0000 UTC m=+34.244679636 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s9646" (UniqueName: "kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646") pod "network-check-target-7cnrw" (UID: "d3c0457e-6ada-43f2-92ce-2786a4e2c17b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:02:55.935646 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:55.935560 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g"
Apr 20 07:02:55.935791 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:55.935674 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d"
Apr 20 07:02:55.936106 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:55.936087 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gxm4n"
Apr 20 07:02:55.936192 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:55.936175 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gxm4n" podUID="972f83a1-e686-4970-8f9a-46f92da2158f"
Apr 20 07:02:55.936257 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:55.936227 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw"
Apr 20 07:02:55.936311 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:55.936289 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b"
Apr 20 07:02:57.934952 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:57.934875 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw"
Apr 20 07:02:57.935679 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:57.935663 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g"
Apr 20 07:02:57.935746 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:57.935667 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gxm4n"
Apr 20 07:02:57.935802 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:57.935747 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d"
Apr 20 07:02:57.935858 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:57.935810 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b"
Apr 20 07:02:57.935924 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:57.935898 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gxm4n" podUID="972f83a1-e686-4970-8f9a-46f92da2158f"
Apr 20 07:02:59.008336 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.007875 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kvspn" event={"ID":"51abff1a-6e24-4a67-8755-58b10f566180","Type":"ContainerStarted","Data":"f0570e4b6d169a412422a25f3f5913791dcd31ce3096a65499f2ed7430a7f39e"}
Apr 20 07:02:59.010579 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.010554 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log"
Apr 20 07:02:59.010892 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.010868 2566 generic.go:358] "Generic (PLEG): container finished" podID="3c81fb54-953c-4feb-8550-bcf26cec6a9e" containerID="c49eb414985e4c49e3369c3eac1d65247135aa40e99fa4583d249bb4d08afc98" exitCode=1
Apr 20 07:02:59.010995 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.010942 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" event={"ID":"3c81fb54-953c-4feb-8550-bcf26cec6a9e","Type":"ContainerStarted","Data":"47df97448affa190e4dcd5ef4e765c7e2e99c09a413499fea3016f6a913a8e79"}
Apr 20 07:02:59.010995 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.010967 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" event={"ID":"3c81fb54-953c-4feb-8550-bcf26cec6a9e","Type":"ContainerStarted","Data":"04521e3833a345ebc04555a530efdfe6318edd4adbe93529a5be905e2104f268"}
Apr 20 07:02:59.010995 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.010981 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" event={"ID":"3c81fb54-953c-4feb-8550-bcf26cec6a9e","Type":"ContainerStarted","Data":"2700c14a9b8ece9df25a6d9587838a2e5656aab81c5ec9506fb0b57fba24f61f"}
Apr 20 07:02:59.010995 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.010993 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" event={"ID":"3c81fb54-953c-4feb-8550-bcf26cec6a9e","Type":"ContainerStarted","Data":"1e4d80c27a1aecee031a631cd183198b9116b3da7c5514af4e6536860e8aa947"}
Apr 20 07:02:59.011169 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.011005 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" event={"ID":"3c81fb54-953c-4feb-8550-bcf26cec6a9e","Type":"ContainerDied","Data":"c49eb414985e4c49e3369c3eac1d65247135aa40e99fa4583d249bb4d08afc98"}
Apr 20 07:02:59.011169 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.011019 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" event={"ID":"3c81fb54-953c-4feb-8550-bcf26cec6a9e","Type":"ContainerStarted","Data":"e578a05056054ce8980509e4e687dce812b87a2b5afe33b436e74211f086990b"}
Apr 20 07:02:59.014840 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.014812 2566 generic.go:358] "Generic (PLEG): container finished" podID="ade86a8b-7468-4238-b3ff-728e885f0d78" containerID="b7dddd29a3015b0aa567ff38d523ee093dc32fe7740d1d0edb93f835fee059f2" exitCode=0
Apr 20 07:02:59.014967 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.014889 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtmbx" event={"ID":"ade86a8b-7468-4238-b3ff-728e885f0d78","Type":"ContainerDied","Data":"b7dddd29a3015b0aa567ff38d523ee093dc32fe7740d1d0edb93f835fee059f2"}
Apr 20 07:02:59.017025 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.016999 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4" event={"ID":"f6657021-5dbd-4acf-9163-5e795f9d3f17","Type":"ContainerStarted","Data":"22491eb0df221e9d2a0f4e67c8f509385b350901512dd9fe26b3ef0a5f6be4e0"}
Apr 20 07:02:59.018825 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.018795 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pqdp9" event={"ID":"4151df80-e101-4d3f-a89d-f0d2c0217a27","Type":"ContainerStarted","Data":"3267fc735d06f4b51c86582e277c5103068029b38fd0364eb9857055434a5e95"}
Apr 20 07:02:59.020211 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.020180 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c66hd" event={"ID":"b253a09a-eb71-4d38-961f-3acd58a8ed07","Type":"ContainerStarted","Data":"200ac8c392d416357aa21413c7c0594ddd4c2709ee6b68f0da8c23f37dbc7b7e"}
Apr 20 07:02:59.021646 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.021628 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-677wv" event={"ID":"a3d30794-2dde-421b-92b3-bb5c7855130c","Type":"ContainerStarted","Data":"18f265a63ec97775bc3b70829b44769291a136241b20a004a3343e239d8dd573"}
Apr 20 07:02:59.028595 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.028541 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-kvspn" podStartSLOduration=3.662358035 podStartE2EDuration="21.028525127s" podCreationTimestamp="2026-04-20 07:02:38 +0000 UTC" firstStartedPulling="2026-04-20 07:02:40.715311582 +0000 UTC m=+3.358620710" lastFinishedPulling="2026-04-20 07:02:58.081478659 +0000 UTC m=+20.724787802" observedRunningTime="2026-04-20 07:02:59.028096861 +0000 UTC m=+21.671406008" watchObservedRunningTime="2026-04-20 07:02:59.028525127 +0000 UTC m=+21.671834278"
Apr 20 07:02:59.076137 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.076049 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pqdp9" podStartSLOduration=4.113873779 podStartE2EDuration="21.076030104s" podCreationTimestamp="2026-04-20 07:02:38 +0000 UTC" firstStartedPulling="2026-04-20 07:02:40.715356796 +0000 UTC m=+3.358665928" lastFinishedPulling="2026-04-20 07:02:57.677513106 +0000 UTC m=+20.320822253" observedRunningTime="2026-04-20 07:02:59.048240414 +0000 UTC m=+21.691549565" watchObservedRunningTime="2026-04-20 07:02:59.076030104 +0000 UTC m=+21.719339253"
Apr 20 07:02:59.105358 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.105255 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-c66hd" podStartSLOduration=3.718872335 podStartE2EDuration="21.10523634s" podCreationTimestamp="2026-04-20 07:02:38 +0000 UTC" firstStartedPulling="2026-04-20 07:02:40.708892883 +0000 UTC m=+3.352202012" lastFinishedPulling="2026-04-20 07:02:58.095256885 +0000 UTC m=+20.738566017" observedRunningTime="2026-04-20 07:02:59.104840144 +0000 UTC m=+21.748149293" watchObservedRunningTime="2026-04-20 07:02:59.10523634 +0000 UTC m=+21.748545477"
Apr 20 07:02:59.148812 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.148752 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-677wv" podStartSLOduration=8.587542817 podStartE2EDuration="21.148736014s" podCreationTimestamp="2026-04-20 07:02:38 +0000 UTC" firstStartedPulling="2026-04-20 07:02:40.687270667 +0000 UTC m=+3.330579796" lastFinishedPulling="2026-04-20 07:02:53.248463856 +0000 UTC m=+15.891772993" observedRunningTime="2026-04-20 07:02:59.148658519 +0000 UTC m=+21.791967659" watchObservedRunningTime="2026-04-20 07:02:59.148736014 +0000 UTC m=+21.792045164"
Apr 20 07:02:59.748247 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.748219 2566 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 07:02:59.935123 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.935039 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gxm4n"
Apr 20 07:02:59.935285 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.935046 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw"
Apr 20 07:02:59.935285 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:59.935194 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gxm4n" podUID="972f83a1-e686-4970-8f9a-46f92da2158f"
Apr 20 07:02:59.935285 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:59.935248 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b"
Apr 20 07:02:59.935285 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.935046 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g"
Apr 20 07:02:59.935414 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:02:59.935326 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d"
Apr 20 07:02:59.944424 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.944340 2566 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T07:02:59.748242259Z","UUID":"7381aab6-a47c-4fe6-8a01-1a14a2bfb606","Handler":null,"Name":"","Endpoint":""}
Apr 20 07:02:59.946048 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.946030 2566 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 07:02:59.946186 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:02:59.946079 2566 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 07:03:00.024426 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:00.024392 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-txpkt" event={"ID":"ff49ceb7-70c3-47b0-81dd-3ed0c02c42ea","Type":"ContainerStarted","Data":"9aef190f9c60da783a7b2c99d0f7aa7319c5a7e371fec7bdc03d0dfb90f73a75"}
Apr 20 07:03:00.026229 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:00.026202 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4" event={"ID":"f6657021-5dbd-4acf-9163-5e795f9d3f17","Type":"ContainerStarted","Data":"6c7f4398b0765238af19749e0ae077926dae49687a025f6c00f667f3dc009a97"}
Apr 20 07:03:00.058672 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:00.058619 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-txpkt" podStartSLOduration=4.720053331 podStartE2EDuration="22.058601508s" podCreationTimestamp="2026-04-20 07:02:38 +0000 UTC" firstStartedPulling="2026-04-20 07:02:40.687956766 +0000 UTC m=+3.331265898" lastFinishedPulling="2026-04-20 07:02:58.026504929 +0000 UTC m=+20.669814075" observedRunningTime="2026-04-20 07:03:00.058304581 +0000 UTC m=+22.701613732" watchObservedRunningTime="2026-04-20 07:03:00.058601508 +0000 UTC m=+22.701910659"
Apr 20 07:03:01.233224 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:01.233196 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-677wv"
Apr 20 07:03:01.233694 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:01.233672 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-677wv"
Apr 20 07:03:01.748642 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:01.748379 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret\") pod \"global-pull-secret-syncer-gxm4n\" (UID: \"972f83a1-e686-4970-8f9a-46f92da2158f\") " pod="kube-system/global-pull-secret-syncer-gxm4n"
Apr 20 07:03:01.748642 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:01.748514 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 07:03:01.748849 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:01.748707 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret podName:972f83a1-e686-4970-8f9a-46f92da2158f nodeName:}" failed. No retries permitted until 2026-04-20 07:03:17.748688603 +0000 UTC m=+40.391997737 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret") pod "global-pull-secret-syncer-gxm4n" (UID: "972f83a1-e686-4970-8f9a-46f92da2158f") : object "kube-system"/"original-pull-secret" not registered
Apr 20 07:03:01.934974 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:01.934937 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw"
Apr 20 07:03:01.935176 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:01.934944 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g"
Apr 20 07:03:01.935176 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:01.935045 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b"
Apr 20 07:03:01.935176 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:01.934944 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gxm4n"
Apr 20 07:03:01.935362 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:01.935185 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d"
Apr 20 07:03:01.935362 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:01.935230 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gxm4n" podUID="972f83a1-e686-4970-8f9a-46f92da2158f"
Apr 20 07:03:02.031625 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:02.031586 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4" event={"ID":"f6657021-5dbd-4acf-9163-5e795f9d3f17","Type":"ContainerStarted","Data":"efca42888194f29535f07734b12a1a226c42c096fb595419279a94ae430fd840"}
Apr 20 07:03:02.034447 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:02.034422 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log"
Apr 20 07:03:02.034741 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:02.034718 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" event={"ID":"3c81fb54-953c-4feb-8550-bcf26cec6a9e","Type":"ContainerStarted","Data":"39f3528767edb844ee2a9c2b71b3722f18f205437cc6031a86a1c85f8a2533d2"}
Apr 20 07:03:02.034974 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:02.034938 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-677wv"
Apr 20 07:03:02.035599 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:02.035582 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-677wv"
Apr 20 07:03:02.071964 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:02.071909 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xbjp4" podStartSLOduration=3.570974203 podStartE2EDuration="24.071893585s" podCreationTimestamp="2026-04-20 07:02:38 +0000 UTC" firstStartedPulling="2026-04-20 07:02:40.686527207 +0000 UTC m=+3.329836352" lastFinishedPulling="2026-04-20 07:03:01.187446588 +0000 UTC m=+23.830755734" observedRunningTime="2026-04-20 07:03:02.052223872 +0000 UTC m=+24.695533024" watchObservedRunningTime="2026-04-20 07:03:02.071893585 +0000 UTC m=+24.715202772"
Apr 20 07:03:02.955023 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:02.954993 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-znmbh"]
Apr 20 07:03:02.959894 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:02.959867 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-znmbh"
Apr 20 07:03:02.963324 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:02.963303 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 07:03:02.963455 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:02.963349 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 07:03:02.963455 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:02.963437 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-n8f2t\""
Apr 20 07:03:03.058105 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:03.058051 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdbn2\" (UniqueName: \"kubernetes.io/projected/13612f1f-99a2-46b0-abc6-8e800feca8e9-kube-api-access-xdbn2\") pod \"node-resolver-znmbh\" (UID: \"13612f1f-99a2-46b0-abc6-8e800feca8e9\") " pod="openshift-dns/node-resolver-znmbh"
Apr 20 07:03:03.058257 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:03.058176 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/13612f1f-99a2-46b0-abc6-8e800feca8e9-tmp-dir\") pod \"node-resolver-znmbh\" (UID: \"13612f1f-99a2-46b0-abc6-8e800feca8e9\") " pod="openshift-dns/node-resolver-znmbh"
Apr 20 07:03:03.058257 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:03.058235 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/13612f1f-99a2-46b0-abc6-8e800feca8e9-hosts-file\") pod \"node-resolver-znmbh\" (UID: \"13612f1f-99a2-46b0-abc6-8e800feca8e9\") " pod="openshift-dns/node-resolver-znmbh"
Apr 20 07:03:03.159454 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:03.159419 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/13612f1f-99a2-46b0-abc6-8e800feca8e9-tmp-dir\") pod \"node-resolver-znmbh\" (UID: \"13612f1f-99a2-46b0-abc6-8e800feca8e9\") " pod="openshift-dns/node-resolver-znmbh"
Apr 20 07:03:03.159638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:03.159490 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/13612f1f-99a2-46b0-abc6-8e800feca8e9-hosts-file\") pod \"node-resolver-znmbh\" (UID: \"13612f1f-99a2-46b0-abc6-8e800feca8e9\") " pod="openshift-dns/node-resolver-znmbh"
Apr 20 07:03:03.159638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:03.159530 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdbn2\" (UniqueName: \"kubernetes.io/projected/13612f1f-99a2-46b0-abc6-8e800feca8e9-kube-api-access-xdbn2\") pod \"node-resolver-znmbh\" (UID: \"13612f1f-99a2-46b0-abc6-8e800feca8e9\") " pod="openshift-dns/node-resolver-znmbh"
Apr 20 07:03:03.159638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:03.159581 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/13612f1f-99a2-46b0-abc6-8e800feca8e9-hosts-file\") pod \"node-resolver-znmbh\" (UID: \"13612f1f-99a2-46b0-abc6-8e800feca8e9\") " pod="openshift-dns/node-resolver-znmbh"
Apr 20 07:03:03.159824 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:03.159792 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/13612f1f-99a2-46b0-abc6-8e800feca8e9-tmp-dir\") pod \"node-resolver-znmbh\" (UID: \"13612f1f-99a2-46b0-abc6-8e800feca8e9\") " pod="openshift-dns/node-resolver-znmbh"
Apr 20 07:03:03.174895 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:03.174864 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdbn2\" (UniqueName: \"kubernetes.io/projected/13612f1f-99a2-46b0-abc6-8e800feca8e9-kube-api-access-xdbn2\") pod \"node-resolver-znmbh\" (UID: \"13612f1f-99a2-46b0-abc6-8e800feca8e9\") " pod="openshift-dns/node-resolver-znmbh"
Apr 20 07:03:03.271078 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:03.271024 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-znmbh"
Apr 20 07:03:03.935932 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:03.935897 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gxm4n"
Apr 20 07:03:03.935932 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:03.935923 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g"
Apr 20 07:03:03.936195 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:03.935933 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw"
Apr 20 07:03:03.936195 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:03.936008 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gxm4n" podUID="972f83a1-e686-4970-8f9a-46f92da2158f"
Apr 20 07:03:03.936195 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:03.936109 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d"
Apr 20 07:03:03.936195 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:03.936165 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b"
Apr 20 07:03:03.976593 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:03.976561 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13612f1f_99a2_46b0_abc6_8e800feca8e9.slice/crio-225d3c34a75d89a6aa5653cb2ef12d5f7aa889683ac355d56c915cd080e689f8 WatchSource:0}: Error finding container 225d3c34a75d89a6aa5653cb2ef12d5f7aa889683ac355d56c915cd080e689f8: Status 404 returned error can't find the container with id 225d3c34a75d89a6aa5653cb2ef12d5f7aa889683ac355d56c915cd080e689f8
Apr 20 07:03:04.039577 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:04.039521 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-znmbh" event={"ID":"13612f1f-99a2-46b0-abc6-8e800feca8e9","Type":"ContainerStarted","Data":"225d3c34a75d89a6aa5653cb2ef12d5f7aa889683ac355d56c915cd080e689f8"}
Apr 20 07:03:05.044073 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:05.043893 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log"
Apr 20 07:03:05.044546 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:05.044387 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" event={"ID":"3c81fb54-953c-4feb-8550-bcf26cec6a9e","Type":"ContainerStarted","Data":"eaf78da20d36bec93408f0754a91ee2c792a2c97e18e5b7fe96dc30909272d9f"}
Apr 20 07:03:05.044781 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:05.044759 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl"
Apr 20 07:03:05.044926 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:05.044906 2566 scope.go:117] "RemoveContainer" containerID="c49eb414985e4c49e3369c3eac1d65247135aa40e99fa4583d249bb4d08afc98"
Apr 20
07:03:05.045170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:05.044914 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:03:05.045255 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:05.045184 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:03:05.048720 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:05.048691 2566 generic.go:358] "Generic (PLEG): container finished" podID="ade86a8b-7468-4238-b3ff-728e885f0d78" containerID="cd5c56268c1ba7cab46b9eaa12763745e90d78a57967d52a279a81833fe8ddaa" exitCode=0 Apr 20 07:03:05.048825 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:05.048771 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtmbx" event={"ID":"ade86a8b-7468-4238-b3ff-728e885f0d78","Type":"ContainerDied","Data":"cd5c56268c1ba7cab46b9eaa12763745e90d78a57967d52a279a81833fe8ddaa"} Apr 20 07:03:05.050585 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:05.050561 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-znmbh" event={"ID":"13612f1f-99a2-46b0-abc6-8e800feca8e9","Type":"ContainerStarted","Data":"70031d2a9d890088d48a929ff49b89f37b27c1a33dad1f87f76fddc3839254e4"} Apr 20 07:03:05.063230 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:05.063209 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:03:05.063330 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:05.063317 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:03:05.140402 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:05.140361 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-znmbh" 
podStartSLOduration=3.140347474 podStartE2EDuration="3.140347474s" podCreationTimestamp="2026-04-20 07:03:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:03:05.140054886 +0000 UTC m=+27.783364037" watchObservedRunningTime="2026-04-20 07:03:05.140347474 +0000 UTC m=+27.783656624" Apr 20 07:03:05.935858 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:05.935824 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:03:05.935858 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:05.935841 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:03:05.936102 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:05.935831 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:03:05.936102 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:05.935945 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d" Apr 20 07:03:05.936102 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:05.936011 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b" Apr 20 07:03:05.936211 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:05.936156 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gxm4n" podUID="972f83a1-e686-4970-8f9a-46f92da2158f" Apr 20 07:03:06.056025 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:06.055995 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log" Apr 20 07:03:06.057468 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:06.057434 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" event={"ID":"3c81fb54-953c-4feb-8550-bcf26cec6a9e","Type":"ContainerStarted","Data":"68231d1c3a35952bd78b1894cb2791bfd02bb867f87688d537c566c02878a908"} Apr 20 07:03:06.113314 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:06.113264 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" podStartSLOduration=10.692747321 podStartE2EDuration="28.113250375s" podCreationTimestamp="2026-04-20 07:02:38 +0000 UTC" firstStartedPulling="2026-04-20 07:02:40.714485102 +0000 UTC m=+3.357794229" lastFinishedPulling="2026-04-20 07:02:58.13498814 +0000 UTC m=+20.778297283" observedRunningTime="2026-04-20 07:03:06.110546812 +0000 UTC m=+28.753855962" watchObservedRunningTime="2026-04-20 07:03:06.113250375 +0000 UTC m=+28.756559525" Apr 20 07:03:06.337682 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:06.337642 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/network-metrics-daemon-h8v9g"] Apr 20 07:03:06.337813 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:06.337785 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:03:06.337915 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:06.337892 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d" Apr 20 07:03:06.364472 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:06.364443 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gxm4n"] Apr 20 07:03:06.364573 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:06.364540 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:03:06.364632 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:06.364614 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gxm4n" podUID="972f83a1-e686-4970-8f9a-46f92da2158f" Apr 20 07:03:06.380340 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:06.380318 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7cnrw"] Apr 20 07:03:06.380429 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:06.380411 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:03:06.380504 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:06.380485 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b" Apr 20 07:03:07.060492 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:07.060455 2566 generic.go:358] "Generic (PLEG): container finished" podID="ade86a8b-7468-4238-b3ff-728e885f0d78" containerID="340a0c5d0f1ec017b9c97252c91b4e8bbea73416e2f4d77894f60cf26d362032" exitCode=0 Apr 20 07:03:07.060861 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:07.060536 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtmbx" event={"ID":"ade86a8b-7468-4238-b3ff-728e885f0d78","Type":"ContainerDied","Data":"340a0c5d0f1ec017b9c97252c91b4e8bbea73416e2f4d77894f60cf26d362032"} Apr 20 07:03:07.936079 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:07.935873 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:03:07.936228 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:07.935935 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:03:07.936228 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:07.936160 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b" Apr 20 07:03:07.936228 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:07.935955 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:03:07.936390 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:07.936220 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gxm4n" podUID="972f83a1-e686-4970-8f9a-46f92da2158f" Apr 20 07:03:07.936390 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:07.936295 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d" Apr 20 07:03:09.066022 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:09.065992 2566 generic.go:358] "Generic (PLEG): container finished" podID="ade86a8b-7468-4238-b3ff-728e885f0d78" containerID="4694e15826be278692c1277e61c07d703078db060bc98ec90beebe19c51a937e" exitCode=0 Apr 20 07:03:09.066443 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:09.066045 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtmbx" event={"ID":"ade86a8b-7468-4238-b3ff-728e885f0d78","Type":"ContainerDied","Data":"4694e15826be278692c1277e61c07d703078db060bc98ec90beebe19c51a937e"} Apr 20 07:03:09.935772 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:09.935738 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:03:09.935931 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:09.935877 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d" Apr 20 07:03:09.935931 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:09.935917 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:03:09.936049 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:09.936022 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gxm4n" podUID="972f83a1-e686-4970-8f9a-46f92da2158f" Apr 20 07:03:09.936049 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:09.935738 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:03:09.936170 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:09.936136 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b" Apr 20 07:03:11.618348 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:11.618290 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9646\" (UniqueName: \"kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646\") pod \"network-check-target-7cnrw\" (UID: \"d3c0457e-6ada-43f2-92ce-2786a4e2c17b\") " pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:03:11.618953 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:11.618380 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs\") pod \"network-metrics-daemon-h8v9g\" (UID: \"fc07195a-cdd1-494d-b741-97e9b77b3f6d\") " pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:03:11.618953 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:11.618440 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:03:11.618953 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:11.618467 2566 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:03:11.618953 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:11.618482 2566 projected.go:194] Error preparing data for projected volume kube-api-access-s9646 for pod openshift-network-diagnostics/network-check-target-7cnrw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:03:11.618953 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:11.618551 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646 podName:d3c0457e-6ada-43f2-92ce-2786a4e2c17b nodeName:}" failed. No retries permitted until 2026-04-20 07:03:43.618531699 +0000 UTC m=+66.261840829 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s9646" (UniqueName: "kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646") pod "network-check-target-7cnrw" (UID: "d3c0457e-6ada-43f2-92ce-2786a4e2c17b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:03:11.618953 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:11.618553 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:03:11.618953 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:11.618616 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs podName:fc07195a-cdd1-494d-b741-97e9b77b3f6d nodeName:}" failed. No retries permitted until 2026-04-20 07:03:43.618597045 +0000 UTC m=+66.261906186 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs") pod "network-metrics-daemon-h8v9g" (UID: "fc07195a-cdd1-494d-b741-97e9b77b3f6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:03:11.935586 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:11.935502 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:03:11.935808 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:11.935502 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:03:11.935808 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:11.935629 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8v9g" podUID="fc07195a-cdd1-494d-b741-97e9b77b3f6d" Apr 20 07:03:11.935808 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:11.935733 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gxm4n" podUID="972f83a1-e686-4970-8f9a-46f92da2158f" Apr 20 07:03:11.935808 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:11.935502 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:03:11.936044 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:11.935830 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7cnrw" podUID="d3c0457e-6ada-43f2-92ce-2786a4e2c17b" Apr 20 07:03:12.179131 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.179103 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-100.ec2.internal" event="NodeReady" Apr 20 07:03:12.179396 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.179246 2566 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 07:03:12.233927 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.233839 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4pmwr"] Apr 20 07:03:12.261190 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.261092 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh"] Apr 20 07:03:12.261364 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.261273 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4pmwr" Apr 20 07:03:12.265202 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.264843 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 20 07:03:12.265202 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.264988 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-hnp4h\"" Apr 20 07:03:12.265202 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.264844 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:03:12.273202 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.273179 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-jlbv6"] Apr 20 07:03:12.273409 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.273388 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh" Apr 20 07:03:12.277265 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.277242 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 20 07:03:12.277380 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.277242 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:03:12.277601 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.277585 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 20 07:03:12.277777 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.277763 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-j66ld\"" Apr 20 07:03:12.300480 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.300450 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9"] Apr 20 07:03:12.300654 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.300620 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.305346 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.305322 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 20 07:03:12.305828 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.305808 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 20 07:03:12.306080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.306046 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-c4psp\"" Apr 20 07:03:12.306171 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.306147 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 07:03:12.306277 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.306229 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 07:03:12.311172 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.311150 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 20 07:03:12.316172 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.316144 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p"] Apr 20 07:03:12.337272 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.337213 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-747586d489-dx7j9"] Apr 20 07:03:12.337493 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.337374 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p" Apr 20 07:03:12.337758 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.337696 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9" Apr 20 07:03:12.341419 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.341394 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 07:03:12.341542 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.341423 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-2ptdz\"" Apr 20 07:03:12.341842 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.341818 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 20 07:03:12.341946 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.341822 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-g9whx\"" Apr 20 07:03:12.342027 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.342009 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 20 07:03:12.343711 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.343313 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 20 07:03:12.343711 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.343343 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 20 07:03:12.343711 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.343410 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 07:03:12.343711 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.343434 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:03:12.343711 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.343588 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 20 07:03:12.364159 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.364117 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2tj5v"] Apr 20 07:03:12.364311 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.364271 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.376705 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.376681 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 07:03:12.376852 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.376680 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 07:03:12.376852 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.376689 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-w2xpz\"" Apr 20 07:03:12.379562 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.379537 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 07:03:12.381957 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.381938 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 07:03:12.388405 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.388378 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4"] Apr 20 07:03:12.388545 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.388520 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" Apr 20 07:03:12.394707 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.394387 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 20 07:03:12.394707 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.394452 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:03:12.394707 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.394702 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 20 07:03:12.394939 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.394819 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 20 07:03:12.396633 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.396614 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-fshwh\"" Apr 20 07:03:12.403150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.403111 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4pmwr"] Apr 20 07:03:12.403150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.403153 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-jlbv6"] Apr 20 07:03:12.403316 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.403169 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4"] Apr 20 07:03:12.403316 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.403184 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-747586d489-dx7j9"] Apr 20 07:03:12.403316 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.403199 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p"] Apr 20 07:03:12.403316 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.403215 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh"] Apr 20 07:03:12.403316 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.403225 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2tj5v"] Apr 20 07:03:12.403316 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.403234 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tfcdm"] Apr 20 07:03:12.403316 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.403266 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4" Apr 20 07:03:12.409746 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.409722 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-qlv8g\"" Apr 20 07:03:12.409966 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.409950 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 20 07:03:12.410129 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.410115 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 20 07:03:12.410183 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.410142 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 20 07:03:12.410860 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.410845 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:03:12.411446 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.411429 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 20 07:03:12.423797 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.423765 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/31731af1-46d3-416a-98dc-3db15ceaab73-snapshots\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: \"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.423925 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.423810 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghggl\" (UniqueName: \"kubernetes.io/projected/31731af1-46d3-416a-98dc-3db15ceaab73-kube-api-access-ghggl\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: \"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.423925 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.423867 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4txh\" (UniqueName: \"kubernetes.io/projected/de3d8741-a673-4d3c-9bd2-323788b79b5a-kube-api-access-p4txh\") pod \"kube-storage-version-migrator-operator-6769c5d45-ttgx9\" (UID: \"de3d8741-a673-4d3c-9bd2-323788b79b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9" Apr 20 07:03:12.423925 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.423906 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/31731af1-46d3-416a-98dc-3db15ceaab73-tmp\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: \"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.424130 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.423935 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/afe6f307-ed98-475b-8fa0-8a49de31999c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-9pj7p\" (UID: \"afe6f307-ed98-475b-8fa0-8a49de31999c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p" Apr 20 07:03:12.424130 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.423954 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9pj7p\" (UID: \"afe6f307-ed98-475b-8fa0-8a49de31999c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p" Apr 20 07:03:12.424130 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.424002 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31731af1-46d3-416a-98dc-3db15ceaab73-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: \"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.424130 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.424050 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k62d4\" (UniqueName: \"kubernetes.io/projected/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-kube-api-access-k62d4\") pod \"cluster-samples-operator-6dc5bdb6b4-psqkh\" (UID: \"cbcdf682-b8ce-4e9c-a9db-021ed93dfda7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh" Apr 20 07:03:12.424130 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.424098 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qg6n\" (UniqueName: \"kubernetes.io/projected/afe6f307-ed98-475b-8fa0-8a49de31999c-kube-api-access-8qg6n\") pod \"cluster-monitoring-operator-75587bd455-9pj7p\" (UID: \"afe6f307-ed98-475b-8fa0-8a49de31999c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p" Apr 20 07:03:12.424130 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.424129 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/de3d8741-a673-4d3c-9bd2-323788b79b5a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-ttgx9\" (UID: \"de3d8741-a673-4d3c-9bd2-323788b79b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9" Apr 20 07:03:12.424317 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.424154 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31731af1-46d3-416a-98dc-3db15ceaab73-service-ca-bundle\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: \"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.424317 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.424208 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-psqkh\" (UID: \"cbcdf682-b8ce-4e9c-a9db-021ed93dfda7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh" Apr 20 07:03:12.424317 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.424222 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fbft2"] Apr 20 07:03:12.424317 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.424241 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31731af1-46d3-416a-98dc-3db15ceaab73-serving-cert\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: \"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.424317 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.424259 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh4t4\" (UniqueName: \"kubernetes.io/projected/b328de84-b319-40a6-968c-43ec62190177-kube-api-access-hh4t4\") pod \"volume-data-source-validator-7c6cbb6c87-4pmwr\" (UID: \"b328de84-b319-40a6-968c-43ec62190177\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4pmwr" Apr 20 07:03:12.424317 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.424289 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de3d8741-a673-4d3c-9bd2-323788b79b5a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-ttgx9\" (UID: \"de3d8741-a673-4d3c-9bd2-323788b79b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9" Apr 20 07:03:12.424486 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.424417 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-tfcdm" Apr 20 07:03:12.428801 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.428776 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 07:03:12.429116 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.429097 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 07:03:12.429412 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.429389 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vhcpq\"" Apr 20 07:03:12.439169 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.439145 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7kxww"] Apr 20 07:03:12.439328 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.439311 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fbft2" Apr 20 07:03:12.443316 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.443294 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 20 07:03:12.443429 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.443364 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-fcl8n\"" Apr 20 07:03:12.443429 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.443403 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 20 07:03:12.462387 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.462362 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-zx2wb"] Apr 20 07:03:12.462547 
ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.462526 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7kxww" Apr 20 07:03:12.466045 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.466022 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-75ldp\"" Apr 20 07:03:12.466164 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.466090 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 07:03:12.466261 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.466022 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 07:03:12.466340 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.466318 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 07:03:12.489743 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.489670 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-69d77dd9d6-68vvp"] Apr 20 07:03:12.489871 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.489834 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-zx2wb" Apr 20 07:03:12.496407 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.496384 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 07:03:12.497452 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.497292 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-g8bk2\"" Apr 20 07:03:12.497452 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.497345 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 07:03:12.505004 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.504978 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9"] Apr 20 07:03:12.505131 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.505012 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7kxww"] Apr 20 07:03:12.505131 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.505028 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fbft2"] Apr 20 07:03:12.505131 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.505040 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-zx2wb"] Apr 20 07:03:12.505131 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.505052 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tfcdm"] Apr 20 07:03:12.505131 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.505104 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress/router-default-69d77dd9d6-68vvp"] Apr 20 07:03:12.505131 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.505133 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:12.508895 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.508875 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 20 07:03:12.509234 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.509185 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 20 07:03:12.509234 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.509213 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 07:03:12.509234 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.509221 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 07:03:12.509428 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.509249 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-6rk52\"" Apr 20 07:03:12.509428 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.509255 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 20 07:03:12.509428 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.509213 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 07:03:12.525633 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525596 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/42020703-9cbf-4ede-8527-45cfaca47cf6-ca-trust-extracted\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.525766 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525640 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42020703-9cbf-4ede-8527-45cfaca47cf6-trusted-ca\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.525766 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525682 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghggl\" (UniqueName: \"kubernetes.io/projected/31731af1-46d3-416a-98dc-3db15ceaab73-kube-api-access-ghggl\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: \"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.525766 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525708 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/42020703-9cbf-4ede-8527-45cfaca47cf6-installation-pull-secrets\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.525766 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525742 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/31731af1-46d3-416a-98dc-3db15ceaab73-tmp\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: \"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 
20 07:03:12.525766 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525759 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/afe6f307-ed98-475b-8fa0-8a49de31999c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-9pj7p\" (UID: \"afe6f307-ed98-475b-8fa0-8a49de31999c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p" Apr 20 07:03:12.526021 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525776 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84355a5f-ecc3-4e60-b0d3-c5dc15f059c6-config\") pod \"service-ca-operator-d6fc45fc5-bhwn4\" (UID: \"84355a5f-ecc3-4e60-b0d3-c5dc15f059c6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4" Apr 20 07:03:12.526021 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525797 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fbft2\" (UID: \"58ffde8a-6f80-4470-a18a-15c64f4d0ce1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fbft2" Apr 20 07:03:12.526021 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525828 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbxfn\" (UniqueName: \"kubernetes.io/projected/0134f121-e1b5-45c9-9b45-fc3777f00742-kube-api-access-qbxfn\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " pod="openshift-dns/dns-default-tfcdm" Apr 20 07:03:12.526021 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525845 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcdtc\" 
(UniqueName: \"kubernetes.io/projected/20140ecd-e12f-48c7-b496-5b64ba19279d-kube-api-access-pcdtc\") pod \"console-operator-9d4b6777b-2tj5v\" (UID: \"20140ecd-e12f-48c7-b496-5b64ba19279d\") " pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" Apr 20 07:03:12.526021 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525865 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31731af1-46d3-416a-98dc-3db15ceaab73-service-ca-bundle\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: \"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.526021 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525885 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-certificates\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.526021 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525901 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxjtf\" (UniqueName: \"kubernetes.io/projected/84355a5f-ecc3-4e60-b0d3-c5dc15f059c6-kube-api-access-hxjtf\") pod \"service-ca-operator-d6fc45fc5-bhwn4\" (UID: \"84355a5f-ecc3-4e60-b0d3-c5dc15f059c6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4" Apr 20 07:03:12.526021 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525916 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0134f121-e1b5-45c9-9b45-fc3777f00742-tmp-dir\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " 
pod="openshift-dns/dns-default-tfcdm" Apr 20 07:03:12.526021 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525935 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-psqkh\" (UID: \"cbcdf682-b8ce-4e9c-a9db-021ed93dfda7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh" Apr 20 07:03:12.526021 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525949 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " pod="openshift-dns/dns-default-tfcdm" Apr 20 07:03:12.526021 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525970 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31731af1-46d3-416a-98dc-3db15ceaab73-serving-cert\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: \"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.526021 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.525986 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hh4t4\" (UniqueName: \"kubernetes.io/projected/b328de84-b319-40a6-968c-43ec62190177-kube-api-access-hh4t4\") pod \"volume-data-source-validator-7c6cbb6c87-4pmwr\" (UID: \"b328de84-b319-40a6-968c-43ec62190177\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4pmwr" Apr 20 07:03:12.526021 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526008 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fbft2\" (UID: \"58ffde8a-6f80-4470-a18a-15c64f4d0ce1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fbft2" Apr 20 07:03:12.526532 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526036 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20140ecd-e12f-48c7-b496-5b64ba19279d-serving-cert\") pod \"console-operator-9d4b6777b-2tj5v\" (UID: \"20140ecd-e12f-48c7-b496-5b64ba19279d\") " pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" Apr 20 07:03:12.526532 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526083 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de3d8741-a673-4d3c-9bd2-323788b79b5a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-ttgx9\" (UID: \"de3d8741-a673-4d3c-9bd2-323788b79b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9" Apr 20 07:03:12.526532 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526110 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-bound-sa-token\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.526532 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526136 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84355a5f-ecc3-4e60-b0d3-c5dc15f059c6-serving-cert\") pod 
\"service-ca-operator-d6fc45fc5-bhwn4\" (UID: \"84355a5f-ecc3-4e60-b0d3-c5dc15f059c6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4" Apr 20 07:03:12.526532 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526159 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zm6q\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-kube-api-access-7zm6q\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.526532 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526196 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/31731af1-46d3-416a-98dc-3db15ceaab73-snapshots\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: \"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.526532 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526226 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4txh\" (UniqueName: \"kubernetes.io/projected/de3d8741-a673-4d3c-9bd2-323788b79b5a-kube-api-access-p4txh\") pod \"kube-storage-version-migrator-operator-6769c5d45-ttgx9\" (UID: \"de3d8741-a673-4d3c-9bd2-323788b79b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9" Apr 20 07:03:12.526532 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526244 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9pj7p\" (UID: \"afe6f307-ed98-475b-8fa0-8a49de31999c\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p" Apr 20 07:03:12.526532 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526285 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31731af1-46d3-416a-98dc-3db15ceaab73-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: \"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.526532 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526312 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k62d4\" (UniqueName: \"kubernetes.io/projected/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-kube-api-access-k62d4\") pod \"cluster-samples-operator-6dc5bdb6b4-psqkh\" (UID: \"cbcdf682-b8ce-4e9c-a9db-021ed93dfda7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh" Apr 20 07:03:12.526532 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526345 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qg6n\" (UniqueName: \"kubernetes.io/projected/afe6f307-ed98-475b-8fa0-8a49de31999c-kube-api-access-8qg6n\") pod \"cluster-monitoring-operator-75587bd455-9pj7p\" (UID: \"afe6f307-ed98-475b-8fa0-8a49de31999c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p" Apr 20 07:03:12.526532 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526364 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.526532 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526391 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/42020703-9cbf-4ede-8527-45cfaca47cf6-image-registry-private-configuration\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.526532 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526407 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20140ecd-e12f-48c7-b496-5b64ba19279d-config\") pod \"console-operator-9d4b6777b-2tj5v\" (UID: \"20140ecd-e12f-48c7-b496-5b64ba19279d\") " pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" Apr 20 07:03:12.526532 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526429 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0134f121-e1b5-45c9-9b45-fc3777f00742-config-volume\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " pod="openshift-dns/dns-default-tfcdm" Apr 20 07:03:12.527283 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526453 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de3d8741-a673-4d3c-9bd2-323788b79b5a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-ttgx9\" (UID: \"de3d8741-a673-4d3c-9bd2-323788b79b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9" Apr 20 07:03:12.527283 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.526480 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/20140ecd-e12f-48c7-b496-5b64ba19279d-trusted-ca\") pod \"console-operator-9d4b6777b-2tj5v\" (UID: \"20140ecd-e12f-48c7-b496-5b64ba19279d\") " pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" Apr 20 07:03:12.527283 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:12.527084 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 07:03:12.527283 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:12.527146 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls podName:cbcdf682-b8ce-4e9c-a9db-021ed93dfda7 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:13.027131574 +0000 UTC m=+35.670440703 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-psqkh" (UID: "cbcdf682-b8ce-4e9c-a9db-021ed93dfda7") : secret "samples-operator-tls" not found Apr 20 07:03:12.527283 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.527277 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de3d8741-a673-4d3c-9bd2-323788b79b5a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-ttgx9\" (UID: \"de3d8741-a673-4d3c-9bd2-323788b79b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9" Apr 20 07:03:12.527733 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:12.527712 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 07:03:12.527811 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:12.527781 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls podName:afe6f307-ed98-475b-8fa0-8a49de31999c nodeName:}" failed. No retries permitted until 2026-04-20 07:03:13.027761617 +0000 UTC m=+35.671070764 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9pj7p" (UID: "afe6f307-ed98-475b-8fa0-8a49de31999c") : secret "cluster-monitoring-operator-tls" not found Apr 20 07:03:12.527868 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.527859 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31731af1-46d3-416a-98dc-3db15ceaab73-service-ca-bundle\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: \"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.528009 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.527923 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/31731af1-46d3-416a-98dc-3db15ceaab73-tmp\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: \"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.528009 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.527950 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/afe6f307-ed98-475b-8fa0-8a49de31999c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-9pj7p\" (UID: \"afe6f307-ed98-475b-8fa0-8a49de31999c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p" Apr 20 07:03:12.528281 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.528263 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/31731af1-46d3-416a-98dc-3db15ceaab73-snapshots\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: \"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.531540 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.531516 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31731af1-46d3-416a-98dc-3db15ceaab73-serving-cert\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: \"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.541081 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.541037 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31731af1-46d3-416a-98dc-3db15ceaab73-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: \"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.541238 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.541217 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de3d8741-a673-4d3c-9bd2-323788b79b5a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-ttgx9\" (UID: \"de3d8741-a673-4d3c-9bd2-323788b79b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9" Apr 20 07:03:12.549248 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.549217 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghggl\" (UniqueName: \"kubernetes.io/projected/31731af1-46d3-416a-98dc-3db15ceaab73-kube-api-access-ghggl\") pod \"insights-operator-585dfdc468-jlbv6\" (UID: 
\"31731af1-46d3-416a-98dc-3db15ceaab73\") " pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.549701 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.549682 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k62d4\" (UniqueName: \"kubernetes.io/projected/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-kube-api-access-k62d4\") pod \"cluster-samples-operator-6dc5bdb6b4-psqkh\" (UID: \"cbcdf682-b8ce-4e9c-a9db-021ed93dfda7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh" Apr 20 07:03:12.549996 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.549980 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4txh\" (UniqueName: \"kubernetes.io/projected/de3d8741-a673-4d3c-9bd2-323788b79b5a-kube-api-access-p4txh\") pod \"kube-storage-version-migrator-operator-6769c5d45-ttgx9\" (UID: \"de3d8741-a673-4d3c-9bd2-323788b79b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9" Apr 20 07:03:12.550605 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.550572 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qg6n\" (UniqueName: \"kubernetes.io/projected/afe6f307-ed98-475b-8fa0-8a49de31999c-kube-api-access-8qg6n\") pod \"cluster-monitoring-operator-75587bd455-9pj7p\" (UID: \"afe6f307-ed98-475b-8fa0-8a49de31999c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p" Apr 20 07:03:12.565552 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.565524 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh4t4\" (UniqueName: \"kubernetes.io/projected/b328de84-b319-40a6-968c-43ec62190177-kube-api-access-hh4t4\") pod \"volume-data-source-validator-7c6cbb6c87-4pmwr\" (UID: \"b328de84-b319-40a6-968c-43ec62190177\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4pmwr" Apr 20 07:03:12.571819 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.571792 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4pmwr" Apr 20 07:03:12.612386 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.612346 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-jlbv6" Apr 20 07:03:12.630739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.627545 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/42020703-9cbf-4ede-8527-45cfaca47cf6-image-registry-private-configuration\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.630739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.627590 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20140ecd-e12f-48c7-b496-5b64ba19279d-config\") pod \"console-operator-9d4b6777b-2tj5v\" (UID: \"20140ecd-e12f-48c7-b496-5b64ba19279d\") " pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" Apr 20 07:03:12.630739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.627618 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj7fh\" (UniqueName: \"kubernetes.io/projected/7d3acfea-9550-4513-83ec-6e37b2e131a5-kube-api-access-wj7fh\") pod \"ingress-canary-7kxww\" (UID: \"7d3acfea-9550-4513-83ec-6e37b2e131a5\") " pod="openshift-ingress-canary/ingress-canary-7kxww" Apr 20 07:03:12.630739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.627646 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0134f121-e1b5-45c9-9b45-fc3777f00742-config-volume\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " pod="openshift-dns/dns-default-tfcdm" Apr 20 07:03:12.630739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.627676 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-default-certificate\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:12.630739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.627704 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20140ecd-e12f-48c7-b496-5b64ba19279d-trusted-ca\") pod \"console-operator-9d4b6777b-2tj5v\" (UID: \"20140ecd-e12f-48c7-b496-5b64ba19279d\") " pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" Apr 20 07:03:12.630739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.627731 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7f7r\" (UniqueName: \"kubernetes.io/projected/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-kube-api-access-s7f7r\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:12.630739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.627762 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/42020703-9cbf-4ede-8527-45cfaca47cf6-ca-trust-extracted\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " 
pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.630739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.627787 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42020703-9cbf-4ede-8527-45cfaca47cf6-trusted-ca\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.630739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.627821 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-stats-auth\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:12.630739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.627850 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:12.630739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.627886 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/42020703-9cbf-4ede-8527-45cfaca47cf6-installation-pull-secrets\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.630739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.627967 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/84355a5f-ecc3-4e60-b0d3-c5dc15f059c6-config\") pod \"service-ca-operator-d6fc45fc5-bhwn4\" (UID: \"84355a5f-ecc3-4e60-b0d3-c5dc15f059c6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4" Apr 20 07:03:12.630739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.628005 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fbft2\" (UID: \"58ffde8a-6f80-4470-a18a-15c64f4d0ce1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fbft2" Apr 20 07:03:12.630739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.628391 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/42020703-9cbf-4ede-8527-45cfaca47cf6-ca-trust-extracted\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.630739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.628453 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20140ecd-e12f-48c7-b496-5b64ba19279d-config\") pod \"console-operator-9d4b6777b-2tj5v\" (UID: \"20140ecd-e12f-48c7-b496-5b64ba19279d\") " pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" Apr 20 07:03:12.632080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.628497 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0134f121-e1b5-45c9-9b45-fc3777f00742-config-volume\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " pod="openshift-dns/dns-default-tfcdm" Apr 20 07:03:12.632080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629118 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fbft2\" (UID: \"58ffde8a-6f80-4470-a18a-15c64f4d0ce1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fbft2" Apr 20 07:03:12.632080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629168 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l77pv\" (UniqueName: \"kubernetes.io/projected/cf743209-cbbe-4541-9005-3e25b48ff848-kube-api-access-l77pv\") pod \"network-check-source-8894fc9bd-zx2wb\" (UID: \"cf743209-cbbe-4541-9005-3e25b48ff848\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-zx2wb" Apr 20 07:03:12.632080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629225 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbxfn\" (UniqueName: \"kubernetes.io/projected/0134f121-e1b5-45c9-9b45-fc3777f00742-kube-api-access-qbxfn\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " pod="openshift-dns/dns-default-tfcdm" Apr 20 07:03:12.632080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629335 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42020703-9cbf-4ede-8527-45cfaca47cf6-trusted-ca\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.632080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629373 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20140ecd-e12f-48c7-b496-5b64ba19279d-trusted-ca\") pod \"console-operator-9d4b6777b-2tj5v\" (UID: \"20140ecd-e12f-48c7-b496-5b64ba19279d\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" Apr 20 07:03:12.632080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629457 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcdtc\" (UniqueName: \"kubernetes.io/projected/20140ecd-e12f-48c7-b496-5b64ba19279d-kube-api-access-pcdtc\") pod \"console-operator-9d4b6777b-2tj5v\" (UID: \"20140ecd-e12f-48c7-b496-5b64ba19279d\") " pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" Apr 20 07:03:12.632080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629546 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-certificates\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.632080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629576 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxjtf\" (UniqueName: \"kubernetes.io/projected/84355a5f-ecc3-4e60-b0d3-c5dc15f059c6-kube-api-access-hxjtf\") pod \"service-ca-operator-d6fc45fc5-bhwn4\" (UID: \"84355a5f-ecc3-4e60-b0d3-c5dc15f059c6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4" Apr 20 07:03:12.632080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629601 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0134f121-e1b5-45c9-9b45-fc3777f00742-tmp-dir\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " pod="openshift-dns/dns-default-tfcdm" Apr 20 07:03:12.632080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629628 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert\") pod \"ingress-canary-7kxww\" (UID: \"7d3acfea-9550-4513-83ec-6e37b2e131a5\") " pod="openshift-ingress-canary/ingress-canary-7kxww" Apr 20 07:03:12.632080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629672 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " pod="openshift-dns/dns-default-tfcdm" Apr 20 07:03:12.632080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629712 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fbft2\" (UID: \"58ffde8a-6f80-4470-a18a-15c64f4d0ce1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fbft2" Apr 20 07:03:12.632080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629735 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20140ecd-e12f-48c7-b496-5b64ba19279d-serving-cert\") pod \"console-operator-9d4b6777b-2tj5v\" (UID: \"20140ecd-e12f-48c7-b496-5b64ba19279d\") " pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" Apr 20 07:03:12.632080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629760 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-bound-sa-token\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.632080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629781 
2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84355a5f-ecc3-4e60-b0d3-c5dc15f059c6-serving-cert\") pod \"service-ca-operator-d6fc45fc5-bhwn4\" (UID: \"84355a5f-ecc3-4e60-b0d3-c5dc15f059c6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4" Apr 20 07:03:12.632868 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629804 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zm6q\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-kube-api-access-7zm6q\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.632868 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629802 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84355a5f-ecc3-4e60-b0d3-c5dc15f059c6-config\") pod \"service-ca-operator-d6fc45fc5-bhwn4\" (UID: \"84355a5f-ecc3-4e60-b0d3-c5dc15f059c6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4" Apr 20 07:03:12.632868 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629868 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.632868 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629903 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle\") pod \"router-default-69d77dd9d6-68vvp\" (UID: 
\"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:12.632868 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.629935 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0134f121-e1b5-45c9-9b45-fc3777f00742-tmp-dir\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " pod="openshift-dns/dns-default-tfcdm" Apr 20 07:03:12.632868 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:12.630026 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 07:03:12.632868 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:12.630040 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747586d489-dx7j9: secret "image-registry-tls" not found Apr 20 07:03:12.632868 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:12.630128 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls podName:42020703-9cbf-4ede-8527-45cfaca47cf6 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:13.130109489 +0000 UTC m=+35.773418636 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls") pod "image-registry-747586d489-dx7j9" (UID: "42020703-9cbf-4ede-8527-45cfaca47cf6") : secret "image-registry-tls" not found Apr 20 07:03:12.632868 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:12.630140 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:03:12.632868 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:12.630199 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls podName:0134f121-e1b5-45c9-9b45-fc3777f00742 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:13.130182652 +0000 UTC m=+35.773491782 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls") pod "dns-default-tfcdm" (UID: "0134f121-e1b5-45c9-9b45-fc3777f00742") : secret "dns-default-metrics-tls" not found Apr 20 07:03:12.632868 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:12.630253 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 07:03:12.632868 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:12.630287 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert podName:58ffde8a-6f80-4470-a18a-15c64f4d0ce1 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:13.130276444 +0000 UTC m=+35.773585574 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fbft2" (UID: "58ffde8a-6f80-4470-a18a-15c64f4d0ce1") : secret "networking-console-plugin-cert" not found Apr 20 07:03:12.632868 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.630401 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-certificates\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.632868 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.630977 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/42020703-9cbf-4ede-8527-45cfaca47cf6-image-registry-private-configuration\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.632868 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.631415 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/42020703-9cbf-4ede-8527-45cfaca47cf6-installation-pull-secrets\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.633685 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.632997 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84355a5f-ecc3-4e60-b0d3-c5dc15f059c6-serving-cert\") pod \"service-ca-operator-d6fc45fc5-bhwn4\" (UID: 
\"84355a5f-ecc3-4e60-b0d3-c5dc15f059c6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4" Apr 20 07:03:12.635510 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.635486 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20140ecd-e12f-48c7-b496-5b64ba19279d-serving-cert\") pod \"console-operator-9d4b6777b-2tj5v\" (UID: \"20140ecd-e12f-48c7-b496-5b64ba19279d\") " pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" Apr 20 07:03:12.654090 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.653969 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zm6q\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-kube-api-access-7zm6q\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.657204 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.656600 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9" Apr 20 07:03:12.657204 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.656955 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-bound-sa-token\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:12.658686 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.658641 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxjtf\" (UniqueName: \"kubernetes.io/projected/84355a5f-ecc3-4e60-b0d3-c5dc15f059c6-kube-api-access-hxjtf\") pod \"service-ca-operator-d6fc45fc5-bhwn4\" (UID: \"84355a5f-ecc3-4e60-b0d3-c5dc15f059c6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4" Apr 20 07:03:12.659570 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.659506 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcdtc\" (UniqueName: \"kubernetes.io/projected/20140ecd-e12f-48c7-b496-5b64ba19279d-kube-api-access-pcdtc\") pod \"console-operator-9d4b6777b-2tj5v\" (UID: \"20140ecd-e12f-48c7-b496-5b64ba19279d\") " pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" Apr 20 07:03:12.664811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.664761 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbxfn\" (UniqueName: \"kubernetes.io/projected/0134f121-e1b5-45c9-9b45-fc3777f00742-kube-api-access-qbxfn\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " pod="openshift-dns/dns-default-tfcdm" Apr 20 07:03:12.702135 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.699356 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" Apr 20 07:03:12.713885 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.713204 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4" Apr 20 07:03:12.731758 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.731459 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l77pv\" (UniqueName: \"kubernetes.io/projected/cf743209-cbbe-4541-9005-3e25b48ff848-kube-api-access-l77pv\") pod \"network-check-source-8894fc9bd-zx2wb\" (UID: \"cf743209-cbbe-4541-9005-3e25b48ff848\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-zx2wb" Apr 20 07:03:12.731758 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.731517 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert\") pod \"ingress-canary-7kxww\" (UID: \"7d3acfea-9550-4513-83ec-6e37b2e131a5\") " pod="openshift-ingress-canary/ingress-canary-7kxww" Apr 20 07:03:12.731758 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.731631 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:12.731758 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:12.731762 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle podName:1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff nodeName:}" failed. No retries permitted until 2026-04-20 07:03:13.231741849 +0000 UTC m=+35.875050991 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle") pod "router-default-69d77dd9d6-68vvp" (UID: "1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff") : configmap references non-existent config key: service-ca.crt Apr 20 07:03:12.732241 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.731945 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wj7fh\" (UniqueName: \"kubernetes.io/projected/7d3acfea-9550-4513-83ec-6e37b2e131a5-kube-api-access-wj7fh\") pod \"ingress-canary-7kxww\" (UID: \"7d3acfea-9550-4513-83ec-6e37b2e131a5\") " pod="openshift-ingress-canary/ingress-canary-7kxww" Apr 20 07:03:12.732241 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:12.731738 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:03:12.732241 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.731986 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-default-certificate\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:12.732241 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.732013 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7f7r\" (UniqueName: \"kubernetes.io/projected/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-kube-api-access-s7f7r\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:12.732241 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:12.732035 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert podName:7d3acfea-9550-4513-83ec-6e37b2e131a5 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:13.232015198 +0000 UTC m=+35.875324329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert") pod "ingress-canary-7kxww" (UID: "7d3acfea-9550-4513-83ec-6e37b2e131a5") : secret "canary-serving-cert" not found Apr 20 07:03:12.734878 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.733371 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-stats-auth\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:12.740636 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:12.735323 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 07:03:12.740636 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:12.735378 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs podName:1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff nodeName:}" failed. No retries permitted until 2026-04-20 07:03:13.235360435 +0000 UTC m=+35.878669582 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs") pod "router-default-69d77dd9d6-68vvp" (UID: "1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff") : secret "router-metrics-certs-default" not found Apr 20 07:03:12.740636 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.739301 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-default-certificate\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:12.740636 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.739464 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-stats-auth\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:12.742951 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.733416 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:12.750839 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.747605 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l77pv\" (UniqueName: \"kubernetes.io/projected/cf743209-cbbe-4541-9005-3e25b48ff848-kube-api-access-l77pv\") pod \"network-check-source-8894fc9bd-zx2wb\" (UID: \"cf743209-cbbe-4541-9005-3e25b48ff848\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-zx2wb" Apr 20 07:03:12.772168 
ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.768815 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7f7r\" (UniqueName: \"kubernetes.io/projected/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-kube-api-access-s7f7r\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:12.774638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.774577 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj7fh\" (UniqueName: \"kubernetes.io/projected/7d3acfea-9550-4513-83ec-6e37b2e131a5-kube-api-access-wj7fh\") pod \"ingress-canary-7kxww\" (UID: \"7d3acfea-9550-4513-83ec-6e37b2e131a5\") " pod="openshift-ingress-canary/ingress-canary-7kxww" Apr 20 07:03:12.800514 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.800031 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-zx2wb" Apr 20 07:03:12.820975 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.818301 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4pmwr"] Apr 20 07:03:12.829881 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:12.822512 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb328de84_b319_40a6_968c_43ec62190177.slice/crio-bde7367c0379688145294ee5fb3e204aec7eda69b72f83356607608d277848bb WatchSource:0}: Error finding container bde7367c0379688145294ee5fb3e204aec7eda69b72f83356607608d277848bb: Status 404 returned error can't find the container with id bde7367c0379688145294ee5fb3e204aec7eda69b72f83356607608d277848bb Apr 20 07:03:12.831946 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.831916 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-operator-585dfdc468-jlbv6"] Apr 20 07:03:12.886859 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.886004 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9"] Apr 20 07:03:12.891115 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:12.890692 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde3d8741_a673_4d3c_9bd2_323788b79b5a.slice/crio-adc1e58186260318358752f231ea002f5c7cdba1988c0c30c0735613a47505cf WatchSource:0}: Error finding container adc1e58186260318358752f231ea002f5c7cdba1988c0c30c0735613a47505cf: Status 404 returned error can't find the container with id adc1e58186260318358752f231ea002f5c7cdba1988c0c30c0735613a47505cf Apr 20 07:03:12.916049 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.916020 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2tj5v"] Apr 20 07:03:12.920327 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:12.920183 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20140ecd_e12f_48c7_b496_5b64ba19279d.slice/crio-fe242b8b93bb76b87bfd284c39dd5a157072369f37910953ce2a044ae2436bd3 WatchSource:0}: Error finding container fe242b8b93bb76b87bfd284c39dd5a157072369f37910953ce2a044ae2436bd3: Status 404 returned error can't find the container with id fe242b8b93bb76b87bfd284c39dd5a157072369f37910953ce2a044ae2436bd3 Apr 20 07:03:12.925557 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.924863 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4"] Apr 20 07:03:12.929341 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:12.929317 2566 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84355a5f_ecc3_4e60_b0d3_c5dc15f059c6.slice/crio-32267f9306304c0a6024d1829969dc3998282d124d32dfb1b2d0661d957faecd WatchSource:0}: Error finding container 32267f9306304c0a6024d1829969dc3998282d124d32dfb1b2d0661d957faecd: Status 404 returned error can't find the container with id 32267f9306304c0a6024d1829969dc3998282d124d32dfb1b2d0661d957faecd Apr 20 07:03:12.969446 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:12.969417 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-zx2wb"] Apr 20 07:03:12.972341 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:12.972313 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf743209_cbbe_4541_9005_3e25b48ff848.slice/crio-af583ff7c0d820ddf4d91fb9922c77a8d68803197987dd3c72170f1da721fdde WatchSource:0}: Error finding container af583ff7c0d820ddf4d91fb9922c77a8d68803197987dd3c72170f1da721fdde: Status 404 returned error can't find the container with id af583ff7c0d820ddf4d91fb9922c77a8d68803197987dd3c72170f1da721fdde Apr 20 07:03:13.044666 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.044630 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-psqkh\" (UID: \"cbcdf682-b8ce-4e9c-a9db-021ed93dfda7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh" Apr 20 07:03:13.044816 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.044705 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-9pj7p\" (UID: \"afe6f307-ed98-475b-8fa0-8a49de31999c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p" Apr 20 07:03:13.044816 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:13.044783 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 07:03:13.044816 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:13.044787 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 07:03:13.044917 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:13.044861 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls podName:afe6f307-ed98-475b-8fa0-8a49de31999c nodeName:}" failed. No retries permitted until 2026-04-20 07:03:14.044842248 +0000 UTC m=+36.688151376 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9pj7p" (UID: "afe6f307-ed98-475b-8fa0-8a49de31999c") : secret "cluster-monitoring-operator-tls" not found Apr 20 07:03:13.044917 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:13.044881 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls podName:cbcdf682-b8ce-4e9c-a9db-021ed93dfda7 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:14.044871862 +0000 UTC m=+36.688180990 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-psqkh" (UID: "cbcdf682-b8ce-4e9c-a9db-021ed93dfda7") : secret "samples-operator-tls" not found Apr 20 07:03:13.074843 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.074806 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4" event={"ID":"84355a5f-ecc3-4e60-b0d3-c5dc15f059c6","Type":"ContainerStarted","Data":"32267f9306304c0a6024d1829969dc3998282d124d32dfb1b2d0661d957faecd"} Apr 20 07:03:13.076130 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.076101 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9" event={"ID":"de3d8741-a673-4d3c-9bd2-323788b79b5a","Type":"ContainerStarted","Data":"adc1e58186260318358752f231ea002f5c7cdba1988c0c30c0735613a47505cf"} Apr 20 07:03:13.077259 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.077234 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-jlbv6" event={"ID":"31731af1-46d3-416a-98dc-3db15ceaab73","Type":"ContainerStarted","Data":"77902a243c991d52056f0b41f6f7cc1eeede44eb296dbf3f7db5ddbf46ee27df"} Apr 20 07:03:13.078361 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.078341 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4pmwr" event={"ID":"b328de84-b319-40a6-968c-43ec62190177","Type":"ContainerStarted","Data":"bde7367c0379688145294ee5fb3e204aec7eda69b72f83356607608d277848bb"} Apr 20 07:03:13.079409 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.079389 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-zx2wb" 
event={"ID":"cf743209-cbbe-4541-9005-3e25b48ff848","Type":"ContainerStarted","Data":"af583ff7c0d820ddf4d91fb9922c77a8d68803197987dd3c72170f1da721fdde"} Apr 20 07:03:13.080482 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.080458 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" event={"ID":"20140ecd-e12f-48c7-b496-5b64ba19279d","Type":"ContainerStarted","Data":"fe242b8b93bb76b87bfd284c39dd5a157072369f37910953ce2a044ae2436bd3"} Apr 20 07:03:13.148119 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.145208 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " pod="openshift-dns/dns-default-tfcdm" Apr 20 07:03:13.148119 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.145275 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fbft2\" (UID: \"58ffde8a-6f80-4470-a18a-15c64f4d0ce1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fbft2" Apr 20 07:03:13.148119 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.145332 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:13.148119 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:13.145562 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found 
Apr 20 07:03:13.148119 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:13.145579 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747586d489-dx7j9: secret "image-registry-tls" not found Apr 20 07:03:13.148119 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:13.145639 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls podName:42020703-9cbf-4ede-8527-45cfaca47cf6 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:14.145619641 +0000 UTC m=+36.788928772 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls") pod "image-registry-747586d489-dx7j9" (UID: "42020703-9cbf-4ede-8527-45cfaca47cf6") : secret "image-registry-tls" not found Apr 20 07:03:13.148119 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:13.146088 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:03:13.148119 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:13.146137 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls podName:0134f121-e1b5-45c9-9b45-fc3777f00742 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:14.146123136 +0000 UTC m=+36.789432267 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls") pod "dns-default-tfcdm" (UID: "0134f121-e1b5-45c9-9b45-fc3777f00742") : secret "dns-default-metrics-tls" not found Apr 20 07:03:13.148119 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:13.146196 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 07:03:13.148119 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:13.146230 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert podName:58ffde8a-6f80-4470-a18a-15c64f4d0ce1 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:14.146218261 +0000 UTC m=+36.789527394 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fbft2" (UID: "58ffde8a-6f80-4470-a18a-15c64f4d0ce1") : secret "networking-console-plugin-cert" not found Apr 20 07:03:13.246610 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.246576 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:13.246804 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.246659 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert\") pod \"ingress-canary-7kxww\" (UID: \"7d3acfea-9550-4513-83ec-6e37b2e131a5\") " 
pod="openshift-ingress-canary/ingress-canary-7kxww" Apr 20 07:03:13.246804 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:13.246726 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 07:03:13.246804 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.246764 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:13.246804 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:13.246794 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs podName:1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff nodeName:}" failed. No retries permitted until 2026-04-20 07:03:14.24677339 +0000 UTC m=+36.890082538 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs") pod "router-default-69d77dd9d6-68vvp" (UID: "1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff") : secret "router-metrics-certs-default" not found Apr 20 07:03:13.247015 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:13.246835 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle podName:1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff nodeName:}" failed. No retries permitted until 2026-04-20 07:03:14.246825248 +0000 UTC m=+36.890134392 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle") pod "router-default-69d77dd9d6-68vvp" (UID: "1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff") : configmap references non-existent config key: service-ca.crt Apr 20 07:03:13.247015 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:13.246923 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:03:13.247015 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:13.246952 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert podName:7d3acfea-9550-4513-83ec-6e37b2e131a5 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:14.246942395 +0000 UTC m=+36.890251537 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert") pod "ingress-canary-7kxww" (UID: "7d3acfea-9550-4513-83ec-6e37b2e131a5") : secret "canary-serving-cert" not found Apr 20 07:03:13.935460 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.935428 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gxm4n" Apr 20 07:03:13.935920 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.935428 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:03:13.936050 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.935428 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:03:13.941331 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.941307 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 07:03:13.942201 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.942179 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 07:03:13.942404 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.942386 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-g4l9z\"" Apr 20 07:03:13.943453 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:13.943435 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mf5sv\"" Apr 20 07:03:14.055073 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:14.055018 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-psqkh\" (UID: \"cbcdf682-b8ce-4e9c-a9db-021ed93dfda7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh" Apr 20 07:03:14.055257 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:14.055135 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9pj7p\" (UID: \"afe6f307-ed98-475b-8fa0-8a49de31999c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p" Apr 20 07:03:14.055257 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:14.055185 2566 secret.go:189] Couldn't get 
secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 07:03:14.055363 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:14.055274 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls podName:cbcdf682-b8ce-4e9c-a9db-021ed93dfda7 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:16.055255926 +0000 UTC m=+38.698565053 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-psqkh" (UID: "cbcdf682-b8ce-4e9c-a9db-021ed93dfda7") : secret "samples-operator-tls" not found Apr 20 07:03:14.055363 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:14.055290 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 07:03:14.055363 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:14.055348 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls podName:afe6f307-ed98-475b-8fa0-8a49de31999c nodeName:}" failed. No retries permitted until 2026-04-20 07:03:16.055321998 +0000 UTC m=+38.698631126 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9pj7p" (UID: "afe6f307-ed98-475b-8fa0-8a49de31999c") : secret "cluster-monitoring-operator-tls" not found Apr 20 07:03:14.156319 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:14.156286 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:14.156483 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:14.156433 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 07:03:14.156483 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:14.156442 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " pod="openshift-dns/dns-default-tfcdm" Apr 20 07:03:14.156567 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:14.156489 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fbft2\" (UID: \"58ffde8a-6f80-4470-a18a-15c64f4d0ce1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fbft2" Apr 20 07:03:14.156567 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:14.156450 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-747586d489-dx7j9: secret "image-registry-tls" not found
Apr 20 07:03:14.156638 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:14.156570 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 07:03:14.156638 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:14.156509 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 07:03:14.156638 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:14.156616 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls podName:42020703-9cbf-4ede-8527-45cfaca47cf6 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:16.15657017 +0000 UTC m=+38.799879311 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls") pod "image-registry-747586d489-dx7j9" (UID: "42020703-9cbf-4ede-8527-45cfaca47cf6") : secret "image-registry-tls" not found
Apr 20 07:03:14.156741 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:14.156644 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls podName:0134f121-e1b5-45c9-9b45-fc3777f00742 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:16.156626969 +0000 UTC m=+38.799936101 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls") pod "dns-default-tfcdm" (UID: "0134f121-e1b5-45c9-9b45-fc3777f00742") : secret "dns-default-metrics-tls" not found
Apr 20 07:03:14.156741 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:14.156661 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert podName:58ffde8a-6f80-4470-a18a-15c64f4d0ce1 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:16.156654683 +0000 UTC m=+38.799963810 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fbft2" (UID: "58ffde8a-6f80-4470-a18a-15c64f4d0ce1") : secret "networking-console-plugin-cert" not found
Apr 20 07:03:14.257195 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:14.257159 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp"
Apr 20 07:03:14.257377 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:14.257230 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp"
Apr 20 07:03:14.257377 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:14.257316 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert\") pod \"ingress-canary-7kxww\" (UID: \"7d3acfea-9550-4513-83ec-6e37b2e131a5\") " pod="openshift-ingress-canary/ingress-canary-7kxww"
Apr 20 07:03:14.257377 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:14.257358 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle podName:1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff nodeName:}" failed. No retries permitted until 2026-04-20 07:03:16.257332861 +0000 UTC m=+38.900642001 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle") pod "router-default-69d77dd9d6-68vvp" (UID: "1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff") : configmap references non-existent config key: service-ca.crt
Apr 20 07:03:14.257510 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:14.257409 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 07:03:14.257510 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:14.257419 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 07:03:14.257510 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:14.257476 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert podName:7d3acfea-9550-4513-83ec-6e37b2e131a5 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:16.257444779 +0000 UTC m=+38.900753907 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert") pod "ingress-canary-7kxww" (UID: "7d3acfea-9550-4513-83ec-6e37b2e131a5") : secret "canary-serving-cert" not found
Apr 20 07:03:14.257510 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:14.257494 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs podName:1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff nodeName:}" failed. No retries permitted until 2026-04-20 07:03:16.257483994 +0000 UTC m=+38.900793121 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs") pod "router-default-69d77dd9d6-68vvp" (UID: "1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff") : secret "router-metrics-certs-default" not found
Apr 20 07:03:16.077477 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:16.077248 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-psqkh\" (UID: \"cbcdf682-b8ce-4e9c-a9db-021ed93dfda7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh"
Apr 20 07:03:16.077942 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:16.077538 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9pj7p\" (UID: \"afe6f307-ed98-475b-8fa0-8a49de31999c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p"
Apr 20 07:03:16.077942 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:16.077426 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 07:03:16.077942 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:16.077754 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls podName:cbcdf682-b8ce-4e9c-a9db-021ed93dfda7 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:20.077734011 +0000 UTC m=+42.721043144 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-psqkh" (UID: "cbcdf682-b8ce-4e9c-a9db-021ed93dfda7") : secret "samples-operator-tls" not found
Apr 20 07:03:16.078244 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:16.077681 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 07:03:16.078244 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:16.078158 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls podName:afe6f307-ed98-475b-8fa0-8a49de31999c nodeName:}" failed. No retries permitted until 2026-04-20 07:03:20.07814186 +0000 UTC m=+42.721450990 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9pj7p" (UID: "afe6f307-ed98-475b-8fa0-8a49de31999c") : secret "cluster-monitoring-operator-tls" not found
Apr 20 07:03:16.092842 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:16.092112 2566 generic.go:358] "Generic (PLEG): container finished" podID="ade86a8b-7468-4238-b3ff-728e885f0d78" containerID="3945edee58584205870ddb444c8ebcbf396a2af46c61af9044ef27e1cab9fdb8" exitCode=0
Apr 20 07:03:16.092842 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:16.092169 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtmbx" event={"ID":"ade86a8b-7468-4238-b3ff-728e885f0d78","Type":"ContainerDied","Data":"3945edee58584205870ddb444c8ebcbf396a2af46c61af9044ef27e1cab9fdb8"}
Apr 20 07:03:16.178376 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:16.178332 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9"
Apr 20 07:03:16.178670 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:16.178619 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " pod="openshift-dns/dns-default-tfcdm"
Apr 20 07:03:16.178770 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:16.178696 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fbft2\" (UID: \"58ffde8a-6f80-4470-a18a-15c64f4d0ce1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fbft2"
Apr 20 07:03:16.179313 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:16.179286 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 07:03:16.179313 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:16.179316 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747586d489-dx7j9: secret "image-registry-tls" not found
Apr 20 07:03:16.179449 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:16.179379 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls podName:42020703-9cbf-4ede-8527-45cfaca47cf6 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:20.179360633 +0000 UTC m=+42.822669775 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls") pod "image-registry-747586d489-dx7j9" (UID: "42020703-9cbf-4ede-8527-45cfaca47cf6") : secret "image-registry-tls" not found
Apr 20 07:03:16.179521 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:16.179459 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 07:03:16.179521 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:16.179495 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls podName:0134f121-e1b5-45c9-9b45-fc3777f00742 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:20.179483199 +0000 UTC m=+42.822792328 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls") pod "dns-default-tfcdm" (UID: "0134f121-e1b5-45c9-9b45-fc3777f00742") : secret "dns-default-metrics-tls" not found
Apr 20 07:03:16.179622 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:16.179554 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 07:03:16.179622 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:16.179590 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert podName:58ffde8a-6f80-4470-a18a-15c64f4d0ce1 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:20.179577385 +0000 UTC m=+42.822886513 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fbft2" (UID: "58ffde8a-6f80-4470-a18a-15c64f4d0ce1") : secret "networking-console-plugin-cert" not found
Apr 20 07:03:16.280165 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:16.280131 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert\") pod \"ingress-canary-7kxww\" (UID: \"7d3acfea-9550-4513-83ec-6e37b2e131a5\") " pod="openshift-ingress-canary/ingress-canary-7kxww"
Apr 20 07:03:16.280387 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:16.280347 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp"
Apr 20 07:03:16.280462 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:16.280420 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp"
Apr 20 07:03:16.280597 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:16.280581 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 07:03:16.280667 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:16.280651 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs podName:1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff nodeName:}" failed. No retries permitted until 2026-04-20 07:03:20.280631452 +0000 UTC m=+42.923940585 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs") pod "router-default-69d77dd9d6-68vvp" (UID: "1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff") : secret "router-metrics-certs-default" not found
Apr 20 07:03:16.281136 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:16.281117 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 07:03:16.281231 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:16.281175 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert podName:7d3acfea-9550-4513-83ec-6e37b2e131a5 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:20.281158221 +0000 UTC m=+42.924467350 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert") pod "ingress-canary-7kxww" (UID: "7d3acfea-9550-4513-83ec-6e37b2e131a5") : secret "canary-serving-cert" not found
Apr 20 07:03:16.281296 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:16.281262 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle podName:1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff nodeName:}" failed. No retries permitted until 2026-04-20 07:03:20.281251167 +0000 UTC m=+42.924560298 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle") pod "router-default-69d77dd9d6-68vvp" (UID: "1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff") : configmap references non-existent config key: service-ca.crt
Apr 20 07:03:17.099183 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:17.099141 2566 generic.go:358] "Generic (PLEG): container finished" podID="ade86a8b-7468-4238-b3ff-728e885f0d78" containerID="5c9c10ac199ab9463ccce8285fb8287d3a46a180bc85bb7ed3c5174b6130f4ed" exitCode=0
Apr 20 07:03:17.099628 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:17.099237 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtmbx" event={"ID":"ade86a8b-7468-4238-b3ff-728e885f0d78","Type":"ContainerDied","Data":"5c9c10ac199ab9463ccce8285fb8287d3a46a180bc85bb7ed3c5174b6130f4ed"}
Apr 20 07:03:17.796928 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:17.796890 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret\") pod \"global-pull-secret-syncer-gxm4n\" (UID: \"972f83a1-e686-4970-8f9a-46f92da2158f\") " pod="kube-system/global-pull-secret-syncer-gxm4n"
Apr 20 07:03:17.800610 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:17.800583 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/972f83a1-e686-4970-8f9a-46f92da2158f-original-pull-secret\") pod \"global-pull-secret-syncer-gxm4n\" (UID: \"972f83a1-e686-4970-8f9a-46f92da2158f\") " pod="kube-system/global-pull-secret-syncer-gxm4n"
Apr 20 07:03:17.848641 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:17.848601 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gxm4n"
Apr 20 07:03:20.119856 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:20.119815 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-psqkh\" (UID: \"cbcdf682-b8ce-4e9c-a9db-021ed93dfda7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh"
Apr 20 07:03:20.120344 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:20.119899 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9pj7p\" (UID: \"afe6f307-ed98-475b-8fa0-8a49de31999c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p"
Apr 20 07:03:20.120344 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:20.119989 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 07:03:20.120344 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:20.120076 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 07:03:20.120344 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:20.120098 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls podName:cbcdf682-b8ce-4e9c-a9db-021ed93dfda7 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:28.120055475 +0000 UTC m=+50.763364604 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-psqkh" (UID: "cbcdf682-b8ce-4e9c-a9db-021ed93dfda7") : secret "samples-operator-tls" not found
Apr 20 07:03:20.120344 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:20.120140 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls podName:afe6f307-ed98-475b-8fa0-8a49de31999c nodeName:}" failed. No retries permitted until 2026-04-20 07:03:28.120127075 +0000 UTC m=+50.763436204 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9pj7p" (UID: "afe6f307-ed98-475b-8fa0-8a49de31999c") : secret "cluster-monitoring-operator-tls" not found
Apr 20 07:03:20.221154 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:20.221106 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " pod="openshift-dns/dns-default-tfcdm"
Apr 20 07:03:20.221332 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:20.221178 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fbft2\" (UID: \"58ffde8a-6f80-4470-a18a-15c64f4d0ce1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fbft2"
Apr 20 07:03:20.221332 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:20.221224 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9"
Apr 20 07:03:20.221332 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:20.221248 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 07:03:20.221332 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:20.221305 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls podName:0134f121-e1b5-45c9-9b45-fc3777f00742 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:28.221291143 +0000 UTC m=+50.864600271 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls") pod "dns-default-tfcdm" (UID: "0134f121-e1b5-45c9-9b45-fc3777f00742") : secret "dns-default-metrics-tls" not found
Apr 20 07:03:20.221473 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:20.221337 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 07:03:20.221473 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:20.221348 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747586d489-dx7j9: secret "image-registry-tls" not found
Apr 20 07:03:20.221473 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:20.221394 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls podName:42020703-9cbf-4ede-8527-45cfaca47cf6 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:28.221382766 +0000 UTC m=+50.864691894 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls") pod "image-registry-747586d489-dx7j9" (UID: "42020703-9cbf-4ede-8527-45cfaca47cf6") : secret "image-registry-tls" not found
Apr 20 07:03:20.221473 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:20.221434 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 07:03:20.221473 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:20.221455 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert podName:58ffde8a-6f80-4470-a18a-15c64f4d0ce1 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:28.221449147 +0000 UTC m=+50.864758275 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fbft2" (UID: "58ffde8a-6f80-4470-a18a-15c64f4d0ce1") : secret "networking-console-plugin-cert" not found
Apr 20 07:03:20.322075 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:20.322031 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp"
Apr 20 07:03:20.322197 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:20.322127 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp"
Apr 20 07:03:20.322197 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:20.322177 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle podName:1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff nodeName:}" failed. No retries permitted until 2026-04-20 07:03:28.322161264 +0000 UTC m=+50.965470391 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle") pod "router-default-69d77dd9d6-68vvp" (UID: "1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff") : configmap references non-existent config key: service-ca.crt
Apr 20 07:03:20.322310 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:20.322224 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert\") pod \"ingress-canary-7kxww\" (UID: \"7d3acfea-9550-4513-83ec-6e37b2e131a5\") " pod="openshift-ingress-canary/ingress-canary-7kxww"
Apr 20 07:03:20.322310 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:20.322247 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 07:03:20.322310 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:20.322301 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 07:03:20.322427 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:20.322320 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs podName:1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff nodeName:}" failed. No retries permitted until 2026-04-20 07:03:28.322303193 +0000 UTC m=+50.965612326 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs") pod "router-default-69d77dd9d6-68vvp" (UID: "1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff") : secret "router-metrics-certs-default" not found
Apr 20 07:03:20.322427 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:20.322341 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert podName:7d3acfea-9550-4513-83ec-6e37b2e131a5 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:28.322330757 +0000 UTC m=+50.965639889 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert") pod "ingress-canary-7kxww" (UID: "7d3acfea-9550-4513-83ec-6e37b2e131a5") : secret "canary-serving-cert" not found
Apr 20 07:03:20.707988 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:20.707960 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gxm4n"]
Apr 20 07:03:20.711796 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:20.711738 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod972f83a1_e686_4970_8f9a_46f92da2158f.slice/crio-fabc1ba9195acf773021edbc0a35a268044ae833d07c001e03bd685dc8c59169 WatchSource:0}: Error finding container fabc1ba9195acf773021edbc0a35a268044ae833d07c001e03bd685dc8c59169: Status 404 returned error can't find the container with id fabc1ba9195acf773021edbc0a35a268044ae833d07c001e03bd685dc8c59169
Apr 20 07:03:21.109999 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:21.109912 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-jlbv6" event={"ID":"31731af1-46d3-416a-98dc-3db15ceaab73","Type":"ContainerStarted","Data":"13fe7a614cba52118a56acbe8d22c60353e92444603772930ddb5d3ae479ac46"}
Apr 20 07:03:21.112008 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:21.111984 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4pmwr" event={"ID":"b328de84-b319-40a6-968c-43ec62190177","Type":"ContainerStarted","Data":"09b9a5e8c4bc3285cf653b128ff67e4f3a09e7023ab557bdfc15617193522e9f"}
Apr 20 07:03:21.113605 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:21.113553 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gxm4n" event={"ID":"972f83a1-e686-4970-8f9a-46f92da2158f","Type":"ContainerStarted","Data":"fabc1ba9195acf773021edbc0a35a268044ae833d07c001e03bd685dc8c59169"}
Apr 20 07:03:21.115038 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:21.115004 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-zx2wb" event={"ID":"cf743209-cbbe-4541-9005-3e25b48ff848","Type":"ContainerStarted","Data":"e70958d18d1fbcc66848cbd28d8b85d44b411cf1383d3ea2a7db4bcb4bb997b3"}
Apr 20 07:03:21.116477 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:21.116462 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/0.log"
Apr 20 07:03:21.116585 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:21.116492 2566 generic.go:358] "Generic (PLEG): container finished" podID="20140ecd-e12f-48c7-b496-5b64ba19279d" containerID="10e0e3cc87e641be533aa208816eee02a84b8b11e4cd0d7c12ebe9fc908d54ee" exitCode=255
Apr 20 07:03:21.116585 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:21.116554 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" event={"ID":"20140ecd-e12f-48c7-b496-5b64ba19279d","Type":"ContainerDied","Data":"10e0e3cc87e641be533aa208816eee02a84b8b11e4cd0d7c12ebe9fc908d54ee"}
Apr 20 07:03:21.116960 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:21.116943 2566 scope.go:117] "RemoveContainer" containerID="10e0e3cc87e641be533aa208816eee02a84b8b11e4cd0d7c12ebe9fc908d54ee"
Apr 20 07:03:21.120763 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:21.120743 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtmbx" event={"ID":"ade86a8b-7468-4238-b3ff-728e885f0d78","Type":"ContainerStarted","Data":"3e5422405fe79b145b13ca9a78b90e9593f5d79061c6db1fd769149bf97cdbe6"}
Apr 20 07:03:21.122996 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:21.122616 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4" event={"ID":"84355a5f-ecc3-4e60-b0d3-c5dc15f059c6","Type":"ContainerStarted","Data":"2ce5a1bfffcc2b7441ec37c2d1f0a2f36d2a9abafa3319679ddb751525fca817"}
Apr 20 07:03:21.130303 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:21.129714 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9" event={"ID":"de3d8741-a673-4d3c-9bd2-323788b79b5a","Type":"ContainerStarted","Data":"a566252125bfb0086ec3f16ee828a58c8510288d0b33880a4a254fc7b03796cf"}
Apr 20 07:03:21.142037 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:21.141996 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-jlbv6" podStartSLOduration=23.434138679 podStartE2EDuration="31.141985113s" podCreationTimestamp="2026-04-20 07:02:50 +0000 UTC" firstStartedPulling="2026-04-20 07:03:12.833870546 +0000 UTC m=+35.477179678" lastFinishedPulling="2026-04-20 07:03:20.541716969 +0000 UTC m=+43.185026112" observedRunningTime="2026-04-20 07:03:21.1411022 +0000 UTC m=+43.784411349" watchObservedRunningTime="2026-04-20 07:03:21.141985113 +0000 UTC m=+43.785294262"
Apr 20 07:03:21.219380 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:21.218668 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4" podStartSLOduration=23.608204799 podStartE2EDuration="31.21865003s" podCreationTimestamp="2026-04-20 07:02:50 +0000 UTC" firstStartedPulling="2026-04-20 07:03:12.931279278 +0000 UTC m=+35.574588409" lastFinishedPulling="2026-04-20 07:03:20.541724499 +0000 UTC m=+43.185033640" observedRunningTime="2026-04-20 07:03:21.217327129 +0000 UTC m=+43.860636282" watchObservedRunningTime="2026-04-20 07:03:21.21865003 +0000 UTC m=+43.861959182"
Apr 20 07:03:21.219380 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:21.219123 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9" podStartSLOduration=23.571121798 podStartE2EDuration="31.219112431s" podCreationTimestamp="2026-04-20 07:02:50 +0000 UTC" firstStartedPulling="2026-04-20 07:03:12.893749823 +0000 UTC m=+35.537058950" lastFinishedPulling="2026-04-20 07:03:20.541740454 +0000 UTC m=+43.185049583" observedRunningTime="2026-04-20 07:03:21.17526577 +0000 UTC m=+43.818574918" watchObservedRunningTime="2026-04-20 07:03:21.219112431 +0000 UTC m=+43.862421585"
Apr 20 07:03:21.281187 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:21.278115 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-zx2wb" podStartSLOduration=23.693366366 podStartE2EDuration="31.278095408s" podCreationTimestamp="2026-04-20 07:02:50 +0000 UTC" firstStartedPulling="2026-04-20 07:03:12.974186638 +0000 UTC m=+35.617495767" lastFinishedPulling="2026-04-20 07:03:20.558915668 +0000 UTC m=+43.202224809" observedRunningTime="2026-04-20 07:03:21.27723993 +0000 UTC m=+43.920549081" watchObservedRunningTime="2026-04-20 07:03:21.278095408 +0000 UTC m=+43.921404560"
Apr 20 07:03:21.584540 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:21.584469 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vtmbx" podStartSLOduration=9.016193435 podStartE2EDuration="43.584455041s" podCreationTimestamp="2026-04-20 07:02:38 +0000 UTC" firstStartedPulling="2026-04-20 07:02:40.687694141 +0000 UTC m=+3.331003268" lastFinishedPulling="2026-04-20 07:03:15.255955745 +0000 UTC m=+37.899264874" observedRunningTime="2026-04-20 07:03:21.517958517 +0000 UTC m=+44.161267668" watchObservedRunningTime="2026-04-20 07:03:21.584455041 +0000 UTC m=+44.227764190"
Apr 20 07:03:21.585199 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:21.585164 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-4pmwr" podStartSLOduration=23.868009546 podStartE2EDuration="31.58515383s" podCreationTimestamp="2026-04-20 07:02:50 +0000 UTC" firstStartedPulling="2026-04-20 07:03:12.825114613 +0000 UTC m=+35.468423745" lastFinishedPulling="2026-04-20 07:03:20.542258894 +0000 UTC m=+43.185568029" observedRunningTime="2026-04-20 07:03:21.584089072 +0000 UTC m=+44.227398223" watchObservedRunningTime="2026-04-20 07:03:21.58515383 +0000 UTC m=+44.228462979"
Apr 20 07:03:22.136189 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:22.136100 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log"
Apr 20 07:03:22.139416 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:22.137145 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/0.log"
Apr 20 07:03:22.139416 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:22.137181 2566 generic.go:358] "Generic (PLEG): container finished" podID="20140ecd-e12f-48c7-b496-5b64ba19279d" containerID="26a3cd2e11d072d4c06eff6628831180b9c2029c59234ddff2d949935cb60d91" exitCode=255
Apr 20 07:03:22.139416 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:22.138033 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" event={"ID":"20140ecd-e12f-48c7-b496-5b64ba19279d","Type":"ContainerDied","Data":"26a3cd2e11d072d4c06eff6628831180b9c2029c59234ddff2d949935cb60d91"}
Apr 20 07:03:22.139416 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:22.138098 2566 scope.go:117] "RemoveContainer" containerID="10e0e3cc87e641be533aa208816eee02a84b8b11e4cd0d7c12ebe9fc908d54ee"
Apr 20 07:03:22.139416 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:22.138625 2566 scope.go:117] "RemoveContainer" containerID="26a3cd2e11d072d4c06eff6628831180b9c2029c59234ddff2d949935cb60d91"
Apr 20 07:03:22.139416 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:22.138801 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2tj5v_openshift-console-operator(20140ecd-e12f-48c7-b496-5b64ba19279d)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" podUID="20140ecd-e12f-48c7-b496-5b64ba19279d"
Apr 20 07:03:22.699582 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:22.699534 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v"
Apr 20 07:03:22.699582 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:22.699591 2566 kubelet.go:2658]
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" Apr 20 07:03:23.142231 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:23.142202 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log" Apr 20 07:03:23.142700 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:23.142681 2566 scope.go:117] "RemoveContainer" containerID="26a3cd2e11d072d4c06eff6628831180b9c2029c59234ddff2d949935cb60d91" Apr 20 07:03:23.142932 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:23.142886 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2tj5v_openshift-console-operator(20140ecd-e12f-48c7-b496-5b64ba19279d)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" podUID="20140ecd-e12f-48c7-b496-5b64ba19279d" Apr 20 07:03:24.145974 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:24.145942 2566 scope.go:117] "RemoveContainer" containerID="26a3cd2e11d072d4c06eff6628831180b9c2029c59234ddff2d949935cb60d91" Apr 20 07:03:24.146396 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:24.146180 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2tj5v_openshift-console-operator(20140ecd-e12f-48c7-b496-5b64ba19279d)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" podUID="20140ecd-e12f-48c7-b496-5b64ba19279d" Apr 20 07:03:24.218641 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:24.218613 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-znmbh_13612f1f-99a2-46b0-abc6-8e800feca8e9/dns-node-resolver/0.log" Apr 20 07:03:25.153817 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:25.153417 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gxm4n" event={"ID":"972f83a1-e686-4970-8f9a-46f92da2158f","Type":"ContainerStarted","Data":"27dad21f997aafeb0201b3a4014c03a71c710913e682b873236c4fea93deb778"} Apr 20 07:03:25.202353 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:25.202303 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gxm4n" podStartSLOduration=35.951962602 podStartE2EDuration="40.202290225s" podCreationTimestamp="2026-04-20 07:02:45 +0000 UTC" firstStartedPulling="2026-04-20 07:03:20.714007981 +0000 UTC m=+43.357317109" lastFinishedPulling="2026-04-20 07:03:24.964335601 +0000 UTC m=+47.607644732" observedRunningTime="2026-04-20 07:03:25.194991479 +0000 UTC m=+47.838300628" watchObservedRunningTime="2026-04-20 07:03:25.202290225 +0000 UTC m=+47.845599376" Apr 20 07:03:25.289843 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:25.289810 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pqdp9_4151df80-e101-4d3f-a89d-f0d2c0217a27/node-ca/0.log" Apr 20 07:03:28.200074 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:28.200039 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-psqkh\" (UID: \"cbcdf682-b8ce-4e9c-a9db-021ed93dfda7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh" Apr 20 07:03:28.200528 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:28.200130 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9pj7p\" (UID: \"afe6f307-ed98-475b-8fa0-8a49de31999c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p" Apr 20 07:03:28.200528 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:28.200195 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 07:03:28.200528 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:28.200259 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 07:03:28.200528 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:28.200272 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls podName:cbcdf682-b8ce-4e9c-a9db-021ed93dfda7 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:44.200252239 +0000 UTC m=+66.843561384 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-psqkh" (UID: "cbcdf682-b8ce-4e9c-a9db-021ed93dfda7") : secret "samples-operator-tls" not found Apr 20 07:03:28.200528 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:28.200309 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls podName:afe6f307-ed98-475b-8fa0-8a49de31999c nodeName:}" failed. No retries permitted until 2026-04-20 07:03:44.200298127 +0000 UTC m=+66.843607256 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9pj7p" (UID: "afe6f307-ed98-475b-8fa0-8a49de31999c") : secret "cluster-monitoring-operator-tls" not found Apr 20 07:03:28.300796 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:28.300767 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:03:28.300966 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:28.300869 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " pod="openshift-dns/dns-default-tfcdm" Apr 20 07:03:28.300966 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:28.300895 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fbft2\" (UID: \"58ffde8a-6f80-4470-a18a-15c64f4d0ce1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fbft2" Apr 20 07:03:28.300966 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:28.300935 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 07:03:28.300966 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:28.300956 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-747586d489-dx7j9: secret "image-registry-tls" not found Apr 20 07:03:28.301168 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:28.300984 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 07:03:28.301168 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:28.300997 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:03:28.301168 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:28.301027 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls podName:42020703-9cbf-4ede-8527-45cfaca47cf6 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:44.301006408 +0000 UTC m=+66.944315552 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls") pod "image-registry-747586d489-dx7j9" (UID: "42020703-9cbf-4ede-8527-45cfaca47cf6") : secret "image-registry-tls" not found Apr 20 07:03:28.301168 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:28.301048 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls podName:0134f121-e1b5-45c9-9b45-fc3777f00742 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:44.301035126 +0000 UTC m=+66.944344254 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls") pod "dns-default-tfcdm" (UID: "0134f121-e1b5-45c9-9b45-fc3777f00742") : secret "dns-default-metrics-tls" not found Apr 20 07:03:28.301168 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:28.301083 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert podName:58ffde8a-6f80-4470-a18a-15c64f4d0ce1 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:44.301055531 +0000 UTC m=+66.944364659 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fbft2" (UID: "58ffde8a-6f80-4470-a18a-15c64f4d0ce1") : secret "networking-console-plugin-cert" not found Apr 20 07:03:28.401342 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:28.401310 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:28.401498 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:28.401379 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert\") pod \"ingress-canary-7kxww\" (UID: \"7d3acfea-9550-4513-83ec-6e37b2e131a5\") " pod="openshift-ingress-canary/ingress-canary-7kxww" Apr 20 07:03:28.401498 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:28.401462 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret 
"router-metrics-certs-default" not found Apr 20 07:03:28.401498 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:28.401489 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp" Apr 20 07:03:28.401618 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:28.401515 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:03:28.401618 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:28.401524 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs podName:1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff nodeName:}" failed. No retries permitted until 2026-04-20 07:03:44.401506357 +0000 UTC m=+67.044815502 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs") pod "router-default-69d77dd9d6-68vvp" (UID: "1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff") : secret "router-metrics-certs-default" not found Apr 20 07:03:28.401618 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:28.401576 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle podName:1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff nodeName:}" failed. No retries permitted until 2026-04-20 07:03:44.401562976 +0000 UTC m=+67.044872104 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle") pod "router-default-69d77dd9d6-68vvp" (UID: "1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff") : configmap references non-existent config key: service-ca.crt Apr 20 07:03:28.401618 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:28.401588 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert podName:7d3acfea-9550-4513-83ec-6e37b2e131a5 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:44.401582167 +0000 UTC m=+67.044891295 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert") pod "ingress-canary-7kxww" (UID: "7d3acfea-9550-4513-83ec-6e37b2e131a5") : secret "canary-serving-cert" not found Apr 20 07:03:37.075565 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:37.075536 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vz4hl" Apr 20 07:03:38.935550 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:38.935510 2566 scope.go:117] "RemoveContainer" containerID="26a3cd2e11d072d4c06eff6628831180b9c2029c59234ddff2d949935cb60d91" Apr 20 07:03:39.194721 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:39.194651 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log" Apr 20 07:03:39.194721 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:39.194711 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" event={"ID":"20140ecd-e12f-48c7-b496-5b64ba19279d","Type":"ContainerStarted","Data":"d2ee4f203ce46be8c64bc3e9c1abbad8359ed05c8d0bf0bebc37711a59b6eafc"} Apr 20 07:03:39.194988 ip-10-0-142-100 
kubenswrapper[2566]: I0420 07:03:39.194958 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" Apr 20 07:03:39.218496 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:39.218438 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" podStartSLOduration=41.597056686 podStartE2EDuration="49.218421434s" podCreationTimestamp="2026-04-20 07:02:50 +0000 UTC" firstStartedPulling="2026-04-20 07:03:12.922170703 +0000 UTC m=+35.565479832" lastFinishedPulling="2026-04-20 07:03:20.543535441 +0000 UTC m=+43.186844580" observedRunningTime="2026-04-20 07:03:39.217500904 +0000 UTC m=+61.860810054" watchObservedRunningTime="2026-04-20 07:03:39.218421434 +0000 UTC m=+61.861730585" Apr 20 07:03:39.469605 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:39.469533 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-2tj5v" Apr 20 07:03:43.634181 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:43.634145 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs\") pod \"network-metrics-daemon-h8v9g\" (UID: \"fc07195a-cdd1-494d-b741-97e9b77b3f6d\") " pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:03:43.634536 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:43.634255 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9646\" (UniqueName: \"kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646\") pod \"network-check-target-7cnrw\" (UID: \"d3c0457e-6ada-43f2-92ce-2786a4e2c17b\") " pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:03:43.636784 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:43.636766 
2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9646\" (UniqueName: \"kubernetes.io/projected/d3c0457e-6ada-43f2-92ce-2786a4e2c17b-kube-api-access-s9646\") pod \"network-check-target-7cnrw\" (UID: \"d3c0457e-6ada-43f2-92ce-2786a4e2c17b\") " pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:03:43.637054 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:43.637040 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 07:03:43.646919 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:43.646898 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc07195a-cdd1-494d-b741-97e9b77b3f6d-metrics-certs\") pod \"network-metrics-daemon-h8v9g\" (UID: \"fc07195a-cdd1-494d-b741-97e9b77b3f6d\") " pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:03:43.659701 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:43.659681 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-g4l9z\"" Apr 20 07:03:43.664617 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:43.664601 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mf5sv\"" Apr 20 07:03:43.667924 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:43.667911 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:03:43.672548 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:43.672528 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8v9g" Apr 20 07:03:43.792541 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:43.792510 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7cnrw"] Apr 20 07:03:43.796263 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:43.796233 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3c0457e_6ada_43f2_92ce_2786a4e2c17b.slice/crio-b7583d802299246ad973a1adf028931e53a1ad3b093069698b9450af7b2aed64 WatchSource:0}: Error finding container b7583d802299246ad973a1adf028931e53a1ad3b093069698b9450af7b2aed64: Status 404 returned error can't find the container with id b7583d802299246ad973a1adf028931e53a1ad3b093069698b9450af7b2aed64 Apr 20 07:03:43.808649 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:43.808624 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h8v9g"] Apr 20 07:03:43.811189 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:43.811167 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc07195a_cdd1_494d_b741_97e9b77b3f6d.slice/crio-39de435864eaee30f51edf1311df71755315befc93b1c65926fc2c976fc5cf39 WatchSource:0}: Error finding container 39de435864eaee30f51edf1311df71755315befc93b1c65926fc2c976fc5cf39: Status 404 returned error can't find the container with id 39de435864eaee30f51edf1311df71755315befc93b1c65926fc2c976fc5cf39 Apr 20 07:03:44.208107 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.207996 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h8v9g" event={"ID":"fc07195a-cdd1-494d-b741-97e9b77b3f6d","Type":"ContainerStarted","Data":"39de435864eaee30f51edf1311df71755315befc93b1c65926fc2c976fc5cf39"} Apr 20 07:03:44.209197 ip-10-0-142-100 kubenswrapper[2566]: I0420 
07:03:44.209169 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7cnrw" event={"ID":"d3c0457e-6ada-43f2-92ce-2786a4e2c17b","Type":"ContainerStarted","Data":"2cc96836a0c2d22071722fedb43cd5bdd64f63f8d2768bcd0bed31c071203084"} Apr 20 07:03:44.209308 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.209203 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7cnrw" event={"ID":"d3c0457e-6ada-43f2-92ce-2786a4e2c17b","Type":"ContainerStarted","Data":"b7583d802299246ad973a1adf028931e53a1ad3b093069698b9450af7b2aed64"} Apr 20 07:03:44.209346 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.209304 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-7cnrw" Apr 20 07:03:44.229498 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.229459 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-7cnrw" podStartSLOduration=66.229446373 podStartE2EDuration="1m6.229446373s" podCreationTimestamp="2026-04-20 07:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:03:44.2289208 +0000 UTC m=+66.872229947" watchObservedRunningTime="2026-04-20 07:03:44.229446373 +0000 UTC m=+66.872755517" Apr 20 07:03:44.240195 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.240167 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-psqkh\" (UID: \"cbcdf682-b8ce-4e9c-a9db-021ed93dfda7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh" Apr 20 07:03:44.240325 ip-10-0-142-100 kubenswrapper[2566]: I0420 
07:03:44.240217 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9pj7p\" (UID: \"afe6f307-ed98-475b-8fa0-8a49de31999c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p" Apr 20 07:03:44.242781 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.242753 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/afe6f307-ed98-475b-8fa0-8a49de31999c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9pj7p\" (UID: \"afe6f307-ed98-475b-8fa0-8a49de31999c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p" Apr 20 07:03:44.242869 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.242753 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbcdf682-b8ce-4e9c-a9db-021ed93dfda7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-psqkh\" (UID: \"cbcdf682-b8ce-4e9c-a9db-021ed93dfda7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh" Apr 20 07:03:44.341653 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.341617 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " pod="openshift-dns/dns-default-tfcdm" Apr 20 07:03:44.341653 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.341651 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fbft2\" (UID: \"58ffde8a-6f80-4470-a18a-15c64f4d0ce1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fbft2"
Apr 20 07:03:44.341878 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.341676 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9"
Apr 20 07:03:44.344090 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.344045 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls\") pod \"image-registry-747586d489-dx7j9\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") " pod="openshift-image-registry/image-registry-747586d489-dx7j9"
Apr 20 07:03:44.344090 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.344082 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0134f121-e1b5-45c9-9b45-fc3777f00742-metrics-tls\") pod \"dns-default-tfcdm\" (UID: \"0134f121-e1b5-45c9-9b45-fc3777f00742\") " pod="openshift-dns/dns-default-tfcdm"
Apr 20 07:03:44.344233 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.344125 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/58ffde8a-6f80-4470-a18a-15c64f4d0ce1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fbft2\" (UID: \"58ffde8a-6f80-4470-a18a-15c64f4d0ce1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fbft2"
Apr 20 07:03:44.387237 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.387215 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-j66ld\""
Apr 20 07:03:44.394675 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.394652 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh"
Apr 20 07:03:44.442709 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.442436 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp"
Apr 20 07:03:44.442709 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.442493 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert\") pod \"ingress-canary-7kxww\" (UID: \"7d3acfea-9550-4513-83ec-6e37b2e131a5\") " pod="openshift-ingress-canary/ingress-canary-7kxww"
Apr 20 07:03:44.442709 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.442528 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp"
Apr 20 07:03:44.444293 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.443613 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-service-ca-bundle\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp"
Apr 20 07:03:44.451985 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.450481 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff-metrics-certs\") pod \"router-default-69d77dd9d6-68vvp\" (UID: \"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff\") " pod="openshift-ingress/router-default-69d77dd9d6-68vvp"
Apr 20 07:03:44.452762 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.452727 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d3acfea-9550-4513-83ec-6e37b2e131a5-cert\") pod \"ingress-canary-7kxww\" (UID: \"7d3acfea-9550-4513-83ec-6e37b2e131a5\") " pod="openshift-ingress-canary/ingress-canary-7kxww"
Apr 20 07:03:44.457257 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.457225 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-2ptdz\""
Apr 20 07:03:44.461635 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.461522 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p"
Apr 20 07:03:44.484929 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.484461 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-w2xpz\""
Apr 20 07:03:44.485088 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.485005 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-747586d489-dx7j9"
Apr 20 07:03:44.542648 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.542435 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vhcpq\""
Apr 20 07:03:44.549168 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.548767 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tfcdm"
Apr 20 07:03:44.554527 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.554042 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh"]
Apr 20 07:03:44.558435 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.558225 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-fcl8n\""
Apr 20 07:03:44.560649 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.560607 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fbft2"
Apr 20 07:03:44.581277 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.580330 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-75ldp\""
Apr 20 07:03:44.590937 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.590129 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7kxww"
Apr 20 07:03:44.623174 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.620690 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-6rk52\""
Apr 20 07:03:44.629674 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.628218 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-69d77dd9d6-68vvp"
Apr 20 07:03:44.670709 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.670645 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p"]
Apr 20 07:03:44.731384 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.731300 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-747586d489-dx7j9"]
Apr 20 07:03:44.746008 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:44.745456 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42020703_9cbf_4ede_8527_45cfaca47cf6.slice/crio-2351609407b3e9467baacf8750b896aebcc707284e8b5e5b764d77cdf0e0e89a WatchSource:0}: Error finding container 2351609407b3e9467baacf8750b896aebcc707284e8b5e5b764d77cdf0e0e89a: Status 404 returned error can't find the container with id 2351609407b3e9467baacf8750b896aebcc707284e8b5e5b764d77cdf0e0e89a
Apr 20 07:03:44.780041 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.779991 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fbft2"]
Apr 20 07:03:44.785184 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:44.785145 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58ffde8a_6f80_4470_a18a_15c64f4d0ce1.slice/crio-258e2f1d5e9b9dc6158ada0ffd2a36899bfeb81705cd8c31c131c86b2fe0f150 WatchSource:0}: Error finding container 258e2f1d5e9b9dc6158ada0ffd2a36899bfeb81705cd8c31c131c86b2fe0f150: Status 404 returned error can't find the container with id 258e2f1d5e9b9dc6158ada0ffd2a36899bfeb81705cd8c31c131c86b2fe0f150
Apr 20 07:03:44.806807 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:44.806518 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tfcdm"]
Apr 20 07:03:44.810302 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:44.810272 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0134f121_e1b5_45c9_9b45_fc3777f00742.slice/crio-18968a3d95ea706613f3c1b37cd3fd92c93424636cf12f4dcab4a53a260b9175 WatchSource:0}: Error finding container 18968a3d95ea706613f3c1b37cd3fd92c93424636cf12f4dcab4a53a260b9175: Status 404 returned error can't find the container with id 18968a3d95ea706613f3c1b37cd3fd92c93424636cf12f4dcab4a53a260b9175
Apr 20 07:03:45.044027 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:45.043992 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7kxww"]
Apr 20 07:03:45.045144 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:45.045118 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-69d77dd9d6-68vvp"]
Apr 20 07:03:45.214461 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:45.214421 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tfcdm" event={"ID":"0134f121-e1b5-45c9-9b45-fc3777f00742","Type":"ContainerStarted","Data":"18968a3d95ea706613f3c1b37cd3fd92c93424636cf12f4dcab4a53a260b9175"}
Apr 20 07:03:45.215786 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:45.215722 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh" event={"ID":"cbcdf682-b8ce-4e9c-a9db-021ed93dfda7","Type":"ContainerStarted","Data":"85df6cd840f0008998cdb12f8eb061dfaa4104c545f409ee8142ef451de10ef5"}
Apr 20 07:03:45.217088 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:45.217018 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fbft2" event={"ID":"58ffde8a-6f80-4470-a18a-15c64f4d0ce1","Type":"ContainerStarted","Data":"258e2f1d5e9b9dc6158ada0ffd2a36899bfeb81705cd8c31c131c86b2fe0f150"}
Apr 20 07:03:45.218608 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:45.218578 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747586d489-dx7j9" event={"ID":"42020703-9cbf-4ede-8527-45cfaca47cf6","Type":"ContainerStarted","Data":"f39f9ea81583aea12b9880a8356057198a27749f41b3d8ee7b14133c66e95b2e"}
Apr 20 07:03:45.218716 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:45.218616 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747586d489-dx7j9" event={"ID":"42020703-9cbf-4ede-8527-45cfaca47cf6","Type":"ContainerStarted","Data":"2351609407b3e9467baacf8750b896aebcc707284e8b5e5b764d77cdf0e0e89a"}
Apr 20 07:03:45.218716 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:45.218690 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-747586d489-dx7j9"
Apr 20 07:03:45.219792 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:45.219769 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p" event={"ID":"afe6f307-ed98-475b-8fa0-8a49de31999c","Type":"ContainerStarted","Data":"37a7742f677dfe2ca291fde94c994d02e3da4b623e450c5f3cb30b342f484d6e"}
Apr 20 07:03:45.233848 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:45.233826 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d3acfea_9550_4513_83ec_6e37b2e131a5.slice/crio-d7316afb464d7379580f1d1f614572a5aaf54e9643a9f0b556dc45ea08dfe81c WatchSource:0}: Error finding container d7316afb464d7379580f1d1f614572a5aaf54e9643a9f0b556dc45ea08dfe81c: Status 404 returned error can't find the container with id d7316afb464d7379580f1d1f614572a5aaf54e9643a9f0b556dc45ea08dfe81c
Apr 20 07:03:45.235023 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:45.235001 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dbfa2d1_c6ce_45c6_9b1d_cbd35a6ffaff.slice/crio-0bd252723760cc8f9911a3ba58d1a54f467e1d9325ff447d785d96e404ea3b6a WatchSource:0}: Error finding container 0bd252723760cc8f9911a3ba58d1a54f467e1d9325ff447d785d96e404ea3b6a: Status 404 returned error can't find the container with id 0bd252723760cc8f9911a3ba58d1a54f467e1d9325ff447d785d96e404ea3b6a
Apr 20 07:03:45.245499 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:45.245449 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-747586d489-dx7j9" podStartSLOduration=67.245430686 podStartE2EDuration="1m7.245430686s" podCreationTimestamp="2026-04-20 07:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:03:45.244140782 +0000 UTC m=+67.887449933" watchObservedRunningTime="2026-04-20 07:03:45.245430686 +0000 UTC m=+67.888739836"
Apr 20 07:03:46.226117 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:46.226026 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7kxww" event={"ID":"7d3acfea-9550-4513-83ec-6e37b2e131a5","Type":"ContainerStarted","Data":"d7316afb464d7379580f1d1f614572a5aaf54e9643a9f0b556dc45ea08dfe81c"}
Apr 20 07:03:46.229311 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:46.229043 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-69d77dd9d6-68vvp" event={"ID":"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff","Type":"ContainerStarted","Data":"b611a42339efa21f521c31e54850be58847db18fa2cd37b137493d799e808c2b"}
Apr 20 07:03:46.229311 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:46.229098 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-69d77dd9d6-68vvp" event={"ID":"1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff","Type":"ContainerStarted","Data":"0bd252723760cc8f9911a3ba58d1a54f467e1d9325ff447d785d96e404ea3b6a"}
Apr 20 07:03:46.233782 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:46.233671 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h8v9g" event={"ID":"fc07195a-cdd1-494d-b741-97e9b77b3f6d","Type":"ContainerStarted","Data":"f3b669cdfe910dfd3af48e5cd7fb261e1222cef79aff003444d6d827c0e1f5d3"}
Apr 20 07:03:46.233782 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:46.233700 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h8v9g" event={"ID":"fc07195a-cdd1-494d-b741-97e9b77b3f6d","Type":"ContainerStarted","Data":"a566029db299740924e2b643daaafbe2f5e4e726d5dc91b8c5be75adfb53fe98"}
Apr 20 07:03:46.260479 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:46.259517 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-69d77dd9d6-68vvp" podStartSLOduration=56.259498676 podStartE2EDuration="56.259498676s" podCreationTimestamp="2026-04-20 07:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:03:46.259215583 +0000 UTC m=+68.902524734" watchObservedRunningTime="2026-04-20 07:03:46.259498676 +0000 UTC m=+68.902807829"
Apr 20 07:03:46.278677 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:46.278284 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-h8v9g" podStartSLOduration=66.806159462 podStartE2EDuration="1m8.278266641s" podCreationTimestamp="2026-04-20 07:02:38 +0000 UTC" firstStartedPulling="2026-04-20 07:03:43.812859941 +0000 UTC m=+66.456169070" lastFinishedPulling="2026-04-20 07:03:45.284967115 +0000 UTC m=+67.928276249" observedRunningTime="2026-04-20 07:03:46.276673372 +0000 UTC m=+68.919982524" watchObservedRunningTime="2026-04-20 07:03:46.278266641 +0000 UTC m=+68.921575792"
Apr 20 07:03:46.629534 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:46.629389 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-69d77dd9d6-68vvp"
Apr 20 07:03:46.632913 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:46.632876 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-69d77dd9d6-68vvp"
Apr 20 07:03:47.237466 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:47.237418 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fbft2" event={"ID":"58ffde8a-6f80-4470-a18a-15c64f4d0ce1","Type":"ContainerStarted","Data":"2d4b64449afbcd41ad37a7a2864a92c2da08a55969eaf42b737c0dd165fd4eff"}
Apr 20 07:03:47.237896 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:47.237868 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-69d77dd9d6-68vvp"
Apr 20 07:03:47.239383 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:47.239344 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-69d77dd9d6-68vvp"
Apr 20 07:03:47.263473 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:47.263423 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fbft2" podStartSLOduration=39.734512721 podStartE2EDuration="41.263405537s" podCreationTimestamp="2026-04-20 07:03:06 +0000 UTC" firstStartedPulling="2026-04-20 07:03:44.789104194 +0000 UTC m=+67.432413336" lastFinishedPulling="2026-04-20 07:03:46.317997018 +0000 UTC m=+68.961306152" observedRunningTime="2026-04-20 07:03:47.260964582 +0000 UTC m=+69.904273732" watchObservedRunningTime="2026-04-20 07:03:47.263405537 +0000 UTC m=+69.906714688"
Apr 20 07:03:49.244265 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.244225 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p" event={"ID":"afe6f307-ed98-475b-8fa0-8a49de31999c","Type":"ContainerStarted","Data":"6043dc70be6d7ae31c5670c01f397644e459259bbbf7cca81001000f2fa5b13e"}
Apr 20 07:03:49.245784 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.245758 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tfcdm" event={"ID":"0134f121-e1b5-45c9-9b45-fc3777f00742","Type":"ContainerStarted","Data":"fc0c3955a513fe687b9ce03e6c5851e0262e43abb7692fa31f526672b368772f"}
Apr 20 07:03:49.245893 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.245792 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tfcdm" event={"ID":"0134f121-e1b5-45c9-9b45-fc3777f00742","Type":"ContainerStarted","Data":"9d474322c2c62fe135c8eb877234c96ef02cb311c2222118cf435e4477ff4817"}
Apr 20 07:03:49.245893 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.245826 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-tfcdm"
Apr 20 07:03:49.247397 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.247379 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh" event={"ID":"cbcdf682-b8ce-4e9c-a9db-021ed93dfda7","Type":"ContainerStarted","Data":"21bc3137b80692dd2210e15ecd240463adc4ee8b94c86af1caa598e46d0f6f9d"}
Apr 20 07:03:49.247509 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.247400 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh" event={"ID":"cbcdf682-b8ce-4e9c-a9db-021ed93dfda7","Type":"ContainerStarted","Data":"7d32076b1a82012bba456a33d7939fb48afdf951cd47d6270d1b6bc26e5cd58e"}
Apr 20 07:03:49.248674 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.248656 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7kxww" event={"ID":"7d3acfea-9550-4513-83ec-6e37b2e131a5","Type":"ContainerStarted","Data":"d64b4a69971f3eeece514b574108f93ee937e91e00cd094b97c8a9b31781d015"}
Apr 20 07:03:49.268260 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.268217 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9pj7p" podStartSLOduration=55.660576908 podStartE2EDuration="59.268207303s" podCreationTimestamp="2026-04-20 07:02:50 +0000 UTC" firstStartedPulling="2026-04-20 07:03:44.679853011 +0000 UTC m=+67.323162154" lastFinishedPulling="2026-04-20 07:03:48.287483407 +0000 UTC m=+70.930792549" observedRunningTime="2026-04-20 07:03:49.268040998 +0000 UTC m=+71.911350148" watchObservedRunningTime="2026-04-20 07:03:49.268207303 +0000 UTC m=+71.911516473"
Apr 20 07:03:49.287742 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.287702 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-psqkh" podStartSLOduration=55.653318326 podStartE2EDuration="59.287688399s" podCreationTimestamp="2026-04-20 07:02:50 +0000 UTC" firstStartedPulling="2026-04-20 07:03:44.650320232 +0000 UTC m=+67.293629375" lastFinishedPulling="2026-04-20 07:03:48.284690318 +0000 UTC m=+70.927999448" observedRunningTime="2026-04-20 07:03:49.287046685 +0000 UTC m=+71.930355832" watchObservedRunningTime="2026-04-20 07:03:49.287688399 +0000 UTC m=+71.930997550"
Apr 20 07:03:49.309433 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.309387 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tfcdm" podStartSLOduration=33.83921162 podStartE2EDuration="37.309375327s" podCreationTimestamp="2026-04-20 07:03:12 +0000 UTC" firstStartedPulling="2026-04-20 07:03:44.812868116 +0000 UTC m=+67.456177246" lastFinishedPulling="2026-04-20 07:03:48.283031825 +0000 UTC m=+70.926340953" observedRunningTime="2026-04-20 07:03:49.307878125 +0000 UTC m=+71.951187277" watchObservedRunningTime="2026-04-20 07:03:49.309375327 +0000 UTC m=+71.952684501"
Apr 20 07:03:49.910687 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.910637 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7kxww" podStartSLOduration=34.862672198 podStartE2EDuration="37.910621686s" podCreationTimestamp="2026-04-20 07:03:12 +0000 UTC" firstStartedPulling="2026-04-20 07:03:45.235654077 +0000 UTC m=+67.878963217" lastFinishedPulling="2026-04-20 07:03:48.283603562 +0000 UTC m=+70.926912705" observedRunningTime="2026-04-20 07:03:49.330715864 +0000 UTC m=+71.974025013" watchObservedRunningTime="2026-04-20 07:03:49.910621686 +0000 UTC m=+72.553930880"
Apr 20 07:03:49.911394 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.911369 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8"]
Apr 20 07:03:49.914579 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.914560 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8"
Apr 20 07:03:49.922654 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.922621 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 20 07:03:49.922765 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.922651 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 20 07:03:49.922765 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.922635 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-hm9rg\""
Apr 20 07:03:49.922886 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.922868 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 20 07:03:49.922997 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.922981 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 20 07:03:49.988046 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.988013 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bda80094-239a-4b9f-9b2e-b7b02c3ffbae-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8\" (UID: \"bda80094-239a-4b9f-9b2e-b7b02c3ffbae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8"
Apr 20 07:03:49.988218 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:49.988086 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z9t4\" (UniqueName: \"kubernetes.io/projected/bda80094-239a-4b9f-9b2e-b7b02c3ffbae-kube-api-access-4z9t4\") pod \"managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8\" (UID: \"bda80094-239a-4b9f-9b2e-b7b02c3ffbae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8"
Apr 20 07:03:50.009278 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.009237 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8"]
Apr 20 07:03:50.088989 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.088952 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4z9t4\" (UniqueName: \"kubernetes.io/projected/bda80094-239a-4b9f-9b2e-b7b02c3ffbae-kube-api-access-4z9t4\") pod \"managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8\" (UID: \"bda80094-239a-4b9f-9b2e-b7b02c3ffbae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8"
Apr 20 07:03:50.089151 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.089016 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bda80094-239a-4b9f-9b2e-b7b02c3ffbae-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8\" (UID: \"bda80094-239a-4b9f-9b2e-b7b02c3ffbae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8"
Apr 20 07:03:50.091529 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.091506 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bda80094-239a-4b9f-9b2e-b7b02c3ffbae-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8\" (UID: \"bda80094-239a-4b9f-9b2e-b7b02c3ffbae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8"
Apr 20 07:03:50.193536 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.193464 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z9t4\" (UniqueName: \"kubernetes.io/projected/bda80094-239a-4b9f-9b2e-b7b02c3ffbae-kube-api-access-4z9t4\") pod \"managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8\" (UID: \"bda80094-239a-4b9f-9b2e-b7b02c3ffbae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8"
Apr 20 07:03:50.207273 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.207245 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-mwk27"]
Apr 20 07:03:50.211348 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.211328 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-mwk27"
Apr 20 07:03:50.229393 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.229373 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 07:03:50.229485 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.229376 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 07:03:50.229548 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.229530 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-gg5bk\""
Apr 20 07:03:50.237611 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.237592 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8"
Apr 20 07:03:50.290003 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.289966 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-995dh\" (UniqueName: \"kubernetes.io/projected/7cc62a1d-19e0-48f5-b52d-9344c373a4eb-kube-api-access-995dh\") pod \"downloads-6bcc868b7-mwk27\" (UID: \"7cc62a1d-19e0-48f5-b52d-9344c373a4eb\") " pod="openshift-console/downloads-6bcc868b7-mwk27"
Apr 20 07:03:50.302609 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.302578 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x2822"]
Apr 20 07:03:50.307338 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.307315 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x2822"
Apr 20 07:03:50.312135 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.312115 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-r22c7\""
Apr 20 07:03:50.316246 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.316226 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 20 07:03:50.352255 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.352227 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fscsr"]
Apr 20 07:03:50.355613 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.355591 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-mwk27"]
Apr 20 07:03:50.355727 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.355711 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fscsr"
Apr 20 07:03:50.380785 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:50.380746 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" type="*v1.ConfigMap"
Apr 20 07:03:50.380785 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:50.380753 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"insights-runtime-extractor-sa-dockercfg-nfwjb\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-nfwjb\"" type="*v1.Secret"
Apr 20 07:03:50.380972 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:50.380828 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"insights-runtime-extractor-tls\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" type="*v1.Secret"
Apr 20 07:03:50.391303 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.391279 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ad660d5f-22ed-44cf-8958-0db2957fe5fb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-x2822\" (UID: \"ad660d5f-22ed-44cf-8958-0db2957fe5fb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x2822"
Apr 20 07:03:50.391497 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.391478 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-995dh\" (UniqueName: \"kubernetes.io/projected/7cc62a1d-19e0-48f5-b52d-9344c373a4eb-kube-api-access-995dh\") pod \"downloads-6bcc868b7-mwk27\" (UID: \"7cc62a1d-19e0-48f5-b52d-9344c373a4eb\") " pod="openshift-console/downloads-6bcc868b7-mwk27"
Apr 20 07:03:50.416578 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.416541 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x2822"]
Apr 20 07:03:50.436954 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.436929 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-995dh\" (UniqueName: \"kubernetes.io/projected/7cc62a1d-19e0-48f5-b52d-9344c373a4eb-kube-api-access-995dh\") pod \"downloads-6bcc868b7-mwk27\" (UID: \"7cc62a1d-19e0-48f5-b52d-9344c373a4eb\") " pod="openshift-console/downloads-6bcc868b7-mwk27"
Apr 20 07:03:50.454228 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.454162 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fscsr"]
Apr 20 07:03:50.492917 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.492890 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/eae346ec-981a-4e25-9e27-86b42cb2f0f1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fscsr\" (UID: \"eae346ec-981a-4e25-9e27-86b42cb2f0f1\") " pod="openshift-insights/insights-runtime-extractor-fscsr"
Apr 20 07:03:50.493047 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.492939 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/eae346ec-981a-4e25-9e27-86b42cb2f0f1-crio-socket\") pod \"insights-runtime-extractor-fscsr\" (UID: \"eae346ec-981a-4e25-9e27-86b42cb2f0f1\") " pod="openshift-insights/insights-runtime-extractor-fscsr"
Apr 20 07:03:50.493047 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.492997 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/eae346ec-981a-4e25-9e27-86b42cb2f0f1-data-volume\") pod \"insights-runtime-extractor-fscsr\" (UID: \"eae346ec-981a-4e25-9e27-86b42cb2f0f1\") " pod="openshift-insights/insights-runtime-extractor-fscsr"
Apr 20 07:03:50.493047 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.493000 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8"]
Apr 20 07:03:50.493047 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.493029 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6vhw\" (UniqueName: \"kubernetes.io/projected/eae346ec-981a-4e25-9e27-86b42cb2f0f1-kube-api-access-h6vhw\") pod \"insights-runtime-extractor-fscsr\" (UID: \"eae346ec-981a-4e25-9e27-86b42cb2f0f1\") " pod="openshift-insights/insights-runtime-extractor-fscsr"
Apr 20 07:03:50.493282 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.493128 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ad660d5f-22ed-44cf-8958-0db2957fe5fb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-x2822\" (UID: \"ad660d5f-22ed-44cf-8958-0db2957fe5fb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x2822"
Apr 20 07:03:50.493282 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.493186 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/eae346ec-981a-4e25-9e27-86b42cb2f0f1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fscsr\" (UID: \"eae346ec-981a-4e25-9e27-86b42cb2f0f1\") " pod="openshift-insights/insights-runtime-extractor-fscsr"
Apr 20 07:03:50.494425 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:50.494400 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbda80094_239a_4b9f_9b2e_b7b02c3ffbae.slice/crio-0102c3d1f22b1a4bab32fcfb76c79de1c09473ab6e4be0f7f8a178fcfc4542c8 WatchSource:0}: Error finding container 0102c3d1f22b1a4bab32fcfb76c79de1c09473ab6e4be0f7f8a178fcfc4542c8: Status 404 returned error can't find the container with id 0102c3d1f22b1a4bab32fcfb76c79de1c09473ab6e4be0f7f8a178fcfc4542c8
Apr 20 07:03:50.495976 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.495956 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ad660d5f-22ed-44cf-8958-0db2957fe5fb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-x2822\" (UID: \"ad660d5f-22ed-44cf-8958-0db2957fe5fb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x2822"
Apr 20 07:03:50.520717 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.520690 2566 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-mwk27" Apr 20 07:03:50.594338 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.594295 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/eae346ec-981a-4e25-9e27-86b42cb2f0f1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fscsr\" (UID: \"eae346ec-981a-4e25-9e27-86b42cb2f0f1\") " pod="openshift-insights/insights-runtime-extractor-fscsr" Apr 20 07:03:50.594516 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.594355 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/eae346ec-981a-4e25-9e27-86b42cb2f0f1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fscsr\" (UID: \"eae346ec-981a-4e25-9e27-86b42cb2f0f1\") " pod="openshift-insights/insights-runtime-extractor-fscsr" Apr 20 07:03:50.594516 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.594390 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/eae346ec-981a-4e25-9e27-86b42cb2f0f1-crio-socket\") pod \"insights-runtime-extractor-fscsr\" (UID: \"eae346ec-981a-4e25-9e27-86b42cb2f0f1\") " pod="openshift-insights/insights-runtime-extractor-fscsr" Apr 20 07:03:50.594516 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.594418 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/eae346ec-981a-4e25-9e27-86b42cb2f0f1-data-volume\") pod \"insights-runtime-extractor-fscsr\" (UID: \"eae346ec-981a-4e25-9e27-86b42cb2f0f1\") " pod="openshift-insights/insights-runtime-extractor-fscsr" Apr 20 07:03:50.594516 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.594450 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6vhw\" 
(UniqueName: \"kubernetes.io/projected/eae346ec-981a-4e25-9e27-86b42cb2f0f1-kube-api-access-h6vhw\") pod \"insights-runtime-extractor-fscsr\" (UID: \"eae346ec-981a-4e25-9e27-86b42cb2f0f1\") " pod="openshift-insights/insights-runtime-extractor-fscsr" Apr 20 07:03:50.594679 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.594506 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/eae346ec-981a-4e25-9e27-86b42cb2f0f1-crio-socket\") pod \"insights-runtime-extractor-fscsr\" (UID: \"eae346ec-981a-4e25-9e27-86b42cb2f0f1\") " pod="openshift-insights/insights-runtime-extractor-fscsr" Apr 20 07:03:50.594818 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.594796 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/eae346ec-981a-4e25-9e27-86b42cb2f0f1-data-volume\") pod \"insights-runtime-extractor-fscsr\" (UID: \"eae346ec-981a-4e25-9e27-86b42cb2f0f1\") " pod="openshift-insights/insights-runtime-extractor-fscsr" Apr 20 07:03:50.617248 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.617162 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x2822" Apr 20 07:03:50.680207 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.680161 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6vhw\" (UniqueName: \"kubernetes.io/projected/eae346ec-981a-4e25-9e27-86b42cb2f0f1-kube-api-access-h6vhw\") pod \"insights-runtime-extractor-fscsr\" (UID: \"eae346ec-981a-4e25-9e27-86b42cb2f0f1\") " pod="openshift-insights/insights-runtime-extractor-fscsr" Apr 20 07:03:50.803174 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:50.803129 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cc62a1d_19e0_48f5_b52d_9344c373a4eb.slice/crio-659e256af135603c2f64afbdd8262aa74e6b81bdb2d3d65c5f9efadeadf0adb8 WatchSource:0}: Error finding container 659e256af135603c2f64afbdd8262aa74e6b81bdb2d3d65c5f9efadeadf0adb8: Status 404 returned error can't find the container with id 659e256af135603c2f64afbdd8262aa74e6b81bdb2d3d65c5f9efadeadf0adb8 Apr 20 07:03:50.851977 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.851942 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66865cf58f-tpps7"] Apr 20 07:03:50.855232 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.855209 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-mwk27"] Apr 20 07:03:50.855333 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.855311 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:50.889226 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.889191 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 07:03:50.889414 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.889248 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 07:03:50.889414 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.889308 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 07:03:50.889414 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.889337 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 07:03:50.889717 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.889703 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-n4xjx\"" Apr 20 07:03:50.889756 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.889721 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 07:03:50.920586 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.920552 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66865cf58f-tpps7"] Apr 20 07:03:50.928639 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.928604 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x2822"] Apr 20 07:03:50.930441 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:50.930416 2566 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad660d5f_22ed_44cf_8958_0db2957fe5fb.slice/crio-7afd65695225dfa3145610bc5fbcbb71baad8a0d0143118ddd44f63b21b34c9d WatchSource:0}: Error finding container 7afd65695225dfa3145610bc5fbcbb71baad8a0d0143118ddd44f63b21b34c9d: Status 404 returned error can't find the container with id 7afd65695225dfa3145610bc5fbcbb71baad8a0d0143118ddd44f63b21b34c9d Apr 20 07:03:50.998398 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.998363 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-service-ca\") pod \"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:50.998535 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.998414 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-oauth-serving-cert\") pod \"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:50.998535 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.998445 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-config\") pod \"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:50.998535 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.998523 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-serving-cert\") pod 
\"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:50.998646 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.998551 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-oauth-config\") pod \"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:50.998646 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:50.998570 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqm7t\" (UniqueName: \"kubernetes.io/projected/9a192bbd-08cd-46e9-a6b0-103feb11b910-kube-api-access-rqm7t\") pod \"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:51.099264 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.099173 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-serving-cert\") pod \"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:51.099264 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.099222 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-oauth-config\") pod \"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:51.099438 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.099390 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-rqm7t\" (UniqueName: \"kubernetes.io/projected/9a192bbd-08cd-46e9-a6b0-103feb11b910-kube-api-access-rqm7t\") pod \"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:51.099488 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.099473 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-service-ca\") pod \"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:51.099551 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.099538 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-oauth-serving-cert\") pod \"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:51.099597 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.099581 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-config\") pod \"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:51.100278 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.100254 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-oauth-serving-cert\") pod \"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:51.100521 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.100497 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-service-ca\") pod \"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:51.100616 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.100594 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-config\") pod \"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:51.101941 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.101921 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-oauth-config\") pod \"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:51.102038 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.102019 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-serving-cert\") pod \"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:51.116721 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.116698 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqm7t\" (UniqueName: \"kubernetes.io/projected/9a192bbd-08cd-46e9-a6b0-103feb11b910-kube-api-access-rqm7t\") pod \"console-66865cf58f-tpps7\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") " pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:51.164341 ip-10-0-142-100 kubenswrapper[2566]: 
I0420 07:03:51.164296 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66865cf58f-tpps7" Apr 20 07:03:51.258829 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.258761 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-mwk27" event={"ID":"7cc62a1d-19e0-48f5-b52d-9344c373a4eb","Type":"ContainerStarted","Data":"659e256af135603c2f64afbdd8262aa74e6b81bdb2d3d65c5f9efadeadf0adb8"} Apr 20 07:03:51.260930 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.260895 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8" event={"ID":"bda80094-239a-4b9f-9b2e-b7b02c3ffbae","Type":"ContainerStarted","Data":"0102c3d1f22b1a4bab32fcfb76c79de1c09473ab6e4be0f7f8a178fcfc4542c8"} Apr 20 07:03:51.262474 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.262437 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x2822" event={"ID":"ad660d5f-22ed-44cf-8958-0db2957fe5fb","Type":"ContainerStarted","Data":"7afd65695225dfa3145610bc5fbcbb71baad8a0d0143118ddd44f63b21b34c9d"} Apr 20 07:03:51.392432 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.392315 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66865cf58f-tpps7"] Apr 20 07:03:51.398827 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:51.398784 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a192bbd_08cd_46e9_a6b0_103feb11b910.slice/crio-872d922d928395fe956d40e79de3a2d30b794f45f82198fce9eb642f303e3484 WatchSource:0}: Error finding container 872d922d928395fe956d40e79de3a2d30b794f45f82198fce9eb642f303e3484: Status 404 returned error can't find the container with id 872d922d928395fe956d40e79de3a2d30b794f45f82198fce9eb642f303e3484 Apr 20 
07:03:51.549196 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.548909 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-nfwjb\"" Apr 20 07:03:51.595548 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:51.595513 2566 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: failed to sync secret cache: timed out waiting for the condition Apr 20 07:03:51.595693 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:51.595612 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eae346ec-981a-4e25-9e27-86b42cb2f0f1-insights-runtime-extractor-tls podName:eae346ec-981a-4e25-9e27-86b42cb2f0f1 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:52.095587116 +0000 UTC m=+74.738896263 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/eae346ec-981a-4e25-9e27-86b42cb2f0f1-insights-runtime-extractor-tls") pod "insights-runtime-extractor-fscsr" (UID: "eae346ec-981a-4e25-9e27-86b42cb2f0f1") : failed to sync secret cache: timed out waiting for the condition Apr 20 07:03:51.595981 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:51.595871 2566 configmap.go:193] Couldn't get configMap openshift-insights/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Apr 20 07:03:51.595981 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:03:51.595949 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eae346ec-981a-4e25-9e27-86b42cb2f0f1-kube-rbac-proxy-cm podName:eae346ec-981a-4e25-9e27-86b42cb2f0f1 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:52.095929678 +0000 UTC m=+74.739238809 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-rbac-proxy-cm" (UniqueName: "kubernetes.io/configmap/eae346ec-981a-4e25-9e27-86b42cb2f0f1-kube-rbac-proxy-cm") pod "insights-runtime-extractor-fscsr" (UID: "eae346ec-981a-4e25-9e27-86b42cb2f0f1") : failed to sync configmap cache: timed out waiting for the condition Apr 20 07:03:51.904715 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.904681 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 07:03:51.972484 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:51.972447 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 07:03:52.107454 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:52.107413 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/eae346ec-981a-4e25-9e27-86b42cb2f0f1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fscsr\" (UID: \"eae346ec-981a-4e25-9e27-86b42cb2f0f1\") " pod="openshift-insights/insights-runtime-extractor-fscsr" Apr 20 07:03:52.107611 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:52.107477 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/eae346ec-981a-4e25-9e27-86b42cb2f0f1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fscsr\" (UID: \"eae346ec-981a-4e25-9e27-86b42cb2f0f1\") " pod="openshift-insights/insights-runtime-extractor-fscsr" Apr 20 07:03:52.108030 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:52.108010 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/eae346ec-981a-4e25-9e27-86b42cb2f0f1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fscsr\" (UID: \"eae346ec-981a-4e25-9e27-86b42cb2f0f1\") " 
pod="openshift-insights/insights-runtime-extractor-fscsr" Apr 20 07:03:52.110349 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:52.110317 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/eae346ec-981a-4e25-9e27-86b42cb2f0f1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fscsr\" (UID: \"eae346ec-981a-4e25-9e27-86b42cb2f0f1\") " pod="openshift-insights/insights-runtime-extractor-fscsr" Apr 20 07:03:52.166230 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:52.166133 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fscsr" Apr 20 07:03:52.268314 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:52.268265 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66865cf58f-tpps7" event={"ID":"9a192bbd-08cd-46e9-a6b0-103feb11b910","Type":"ContainerStarted","Data":"872d922d928395fe956d40e79de3a2d30b794f45f82198fce9eb642f303e3484"} Apr 20 07:03:55.084637 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:55.084427 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fscsr"] Apr 20 07:03:55.088129 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:03:55.088098 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeae346ec_981a_4e25_9e27_86b42cb2f0f1.slice/crio-10ea546803df8e11a71b87d66eea279e5450e4dc439eb0e0306cc9360db65d89 WatchSource:0}: Error finding container 10ea546803df8e11a71b87d66eea279e5450e4dc439eb0e0306cc9360db65d89: Status 404 returned error can't find the container with id 10ea546803df8e11a71b87d66eea279e5450e4dc439eb0e0306cc9360db65d89 Apr 20 07:03:55.284299 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:55.284261 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8" event={"ID":"bda80094-239a-4b9f-9b2e-b7b02c3ffbae","Type":"ContainerStarted","Data":"d7e4035e8b648897ced14f85c41c0f17eeb74653b341ac3543f499277bfb01bb"} Apr 20 07:03:55.285826 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:55.285791 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66865cf58f-tpps7" event={"ID":"9a192bbd-08cd-46e9-a6b0-103feb11b910","Type":"ContainerStarted","Data":"99e6395e1174c70758b4d637b5934bba480a2cbd65a7ea567b039c2045b3b8b7"} Apr 20 07:03:55.287203 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:55.287177 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x2822" event={"ID":"ad660d5f-22ed-44cf-8958-0db2957fe5fb","Type":"ContainerStarted","Data":"def9df218001f46a7561d130d847148085a67226f0960d217ac69eb4df525f11"} Apr 20 07:03:55.287414 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:55.287378 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x2822" Apr 20 07:03:55.288884 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:55.288850 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fscsr" event={"ID":"eae346ec-981a-4e25-9e27-86b42cb2f0f1","Type":"ContainerStarted","Data":"d477377865c0cadc1f9c860316a6b1d0dc6ed0a07db9df7460e8561f8af6ff32"} Apr 20 07:03:55.288993 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:55.288888 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fscsr" event={"ID":"eae346ec-981a-4e25-9e27-86b42cb2f0f1","Type":"ContainerStarted","Data":"10ea546803df8e11a71b87d66eea279e5450e4dc439eb0e0306cc9360db65d89"} Apr 20 07:03:55.293465 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:55.293438 2566 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x2822"
Apr 20 07:03:55.369887 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:55.369824 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bd8b68cf7-bztq8" podStartSLOduration=1.969604686 podStartE2EDuration="6.369808815s" podCreationTimestamp="2026-04-20 07:03:49 +0000 UTC" firstStartedPulling="2026-04-20 07:03:50.496399678 +0000 UTC m=+73.139708821" lastFinishedPulling="2026-04-20 07:03:54.896603803 +0000 UTC m=+77.539912950" observedRunningTime="2026-04-20 07:03:55.325078536 +0000 UTC m=+77.968387684" watchObservedRunningTime="2026-04-20 07:03:55.369808815 +0000 UTC m=+78.013117978"
Apr 20 07:03:55.370109 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:55.370052 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x2822" podStartSLOduration=1.410408441 podStartE2EDuration="5.370044657s" podCreationTimestamp="2026-04-20 07:03:50 +0000 UTC" firstStartedPulling="2026-04-20 07:03:50.932527335 +0000 UTC m=+73.575836466" lastFinishedPulling="2026-04-20 07:03:54.892163535 +0000 UTC m=+77.535472682" observedRunningTime="2026-04-20 07:03:55.367925522 +0000 UTC m=+78.011234674" watchObservedRunningTime="2026-04-20 07:03:55.370044657 +0000 UTC m=+78.013353838"
Apr 20 07:03:57.298495 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:57.298445 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fscsr" event={"ID":"eae346ec-981a-4e25-9e27-86b42cb2f0f1","Type":"ContainerStarted","Data":"65dfe84cea96bd9cfaa606980192eec687a77ae11ffa3852118306123984f04d"}
Apr 20 07:03:57.972548 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:57.972485 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66865cf58f-tpps7" podStartSLOduration=4.472307039 podStartE2EDuration="7.972465467s" podCreationTimestamp="2026-04-20 07:03:50 +0000 UTC" firstStartedPulling="2026-04-20 07:03:51.402387378 +0000 UTC m=+74.045696510" lastFinishedPulling="2026-04-20 07:03:54.902545804 +0000 UTC m=+77.545854938" observedRunningTime="2026-04-20 07:03:55.420556139 +0000 UTC m=+78.063865301" watchObservedRunningTime="2026-04-20 07:03:57.972465467 +0000 UTC m=+80.615774619"
Apr 20 07:03:58.304303 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:58.304260 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fscsr" event={"ID":"eae346ec-981a-4e25-9e27-86b42cb2f0f1","Type":"ContainerStarted","Data":"0c3e50d3fc2f85398811f669061b05adb987299e4410bfa7ffec8a3df487c5eb"}
Apr 20 07:03:59.255857 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:59.255816 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tfcdm"
Apr 20 07:03:59.335821 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:03:59.335766 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fscsr" podStartSLOduration=6.679737604 podStartE2EDuration="9.335752135s" podCreationTimestamp="2026-04-20 07:03:50 +0000 UTC" firstStartedPulling="2026-04-20 07:03:55.21419129 +0000 UTC m=+77.857500418" lastFinishedPulling="2026-04-20 07:03:57.870205808 +0000 UTC m=+80.513514949" observedRunningTime="2026-04-20 07:03:58.346317149 +0000 UTC m=+80.989626298" watchObservedRunningTime="2026-04-20 07:03:59.335752135 +0000 UTC m=+81.979061284"
Apr 20 07:04:01.164627 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:01.164589 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66865cf58f-tpps7"
Apr 20 07:04:01.165195 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:01.164639 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66865cf58f-tpps7"
Apr 20 07:04:01.170008 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:01.169987 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66865cf58f-tpps7"
Apr 20 07:04:01.322213 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:01.322181 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66865cf58f-tpps7"
Apr 20 07:04:02.777201 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:02.777159 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ph9n4"]
Apr 20 07:04:02.781930 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:02.781902 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:02.789728 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:02.789699 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 07:04:02.789892 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:02.789816 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 07:04:02.789987 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:02.789966 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 07:04:02.791504 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:02.791484 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-f8v22\""
Apr 20 07:04:02.792722 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:02.792705 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 07:04:02.904483 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:02.904441 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-metrics-client-ca\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:02.904683 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:02.904503 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-node-exporter-accelerators-collector-config\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:02.904683 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:02.904530 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-sys\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:02.904683 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:02.904611 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-node-exporter-tls\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:02.904683 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:02.904656 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-node-exporter-textfile\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:02.904885 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:02.904696 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-node-exporter-wtmp\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:02.904885 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:02.904771 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-root\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:02.904885 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:02.904826 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:02.905019 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:02.904918 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr7d8\" (UniqueName: \"kubernetes.io/projected/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-kube-api-access-fr7d8\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.006819 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.006772 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-node-exporter-tls\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.006819 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.006816 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-node-exporter-textfile\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.007053 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.006839 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-node-exporter-wtmp\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.007053 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.006870 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-root\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.007053 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.006934 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-root\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.007053 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.006941 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.007053 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.007001 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fr7d8\" (UniqueName: \"kubernetes.io/projected/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-kube-api-access-fr7d8\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.007053 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.007037 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-metrics-client-ca\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.007053 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.007040 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-node-exporter-wtmp\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.007388 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.007092 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-node-exporter-accelerators-collector-config\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.007388 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.007210 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-sys\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.007388 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.007272 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-sys\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.007623 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.007601 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-node-exporter-textfile\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.007837 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.007813 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-node-exporter-accelerators-collector-config\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.007942 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.007911 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-metrics-client-ca\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.009947 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.009922 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-node-exporter-tls\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.010089 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.009964 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.037035 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.036954 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr7d8\" (UniqueName: \"kubernetes.io/projected/83b47ba2-db8b-4e97-a3b0-1722dcd7d468-kube-api-access-fr7d8\") pod \"node-exporter-ph9n4\" (UID: \"83b47ba2-db8b-4e97-a3b0-1722dcd7d468\") " pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:03.094845 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:03.094809 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ph9n4"
Apr 20 07:04:04.350343 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:04.350304 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66865cf58f-tpps7"]
Apr 20 07:04:06.238946 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:06.238918 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-747586d489-dx7j9"
Apr 20 07:04:08.986879 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:04:08.986842 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83b47ba2_db8b_4e97_a3b0_1722dcd7d468.slice/crio-40ed374d2e253d967734d00013e6908fc0c90b1f25be195c1e9fe0256aa37ec7 WatchSource:0}: Error finding container 40ed374d2e253d967734d00013e6908fc0c90b1f25be195c1e9fe0256aa37ec7: Status 404 returned error can't find the container with id 40ed374d2e253d967734d00013e6908fc0c90b1f25be195c1e9fe0256aa37ec7
Apr 20 07:04:09.343790 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:09.343720 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ph9n4" event={"ID":"83b47ba2-db8b-4e97-a3b0-1722dcd7d468","Type":"ContainerStarted","Data":"40ed374d2e253d967734d00013e6908fc0c90b1f25be195c1e9fe0256aa37ec7"}
Apr 20 07:04:09.345329 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:09.345270 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-mwk27" event={"ID":"7cc62a1d-19e0-48f5-b52d-9344c373a4eb","Type":"ContainerStarted","Data":"6732bf2a129d2f25784928616f3401307e2f9b08e2be8931fd800d1210a47676"}
Apr 20 07:04:09.347901 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:09.347847 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-mwk27"
Apr 20 07:04:09.359246 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:09.359204 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-mwk27"
Apr 20 07:04:09.390419 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:09.389346 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-mwk27" podStartSLOduration=1.124050677 podStartE2EDuration="19.38932944s" podCreationTimestamp="2026-04-20 07:03:50 +0000 UTC" firstStartedPulling="2026-04-20 07:03:50.805825179 +0000 UTC m=+73.449134322" lastFinishedPulling="2026-04-20 07:04:09.071103944 +0000 UTC m=+91.714413085" observedRunningTime="2026-04-20 07:04:09.386949388 +0000 UTC m=+92.030258541" watchObservedRunningTime="2026-04-20 07:04:09.38932944 +0000 UTC m=+92.032638591"
Apr 20 07:04:10.350796 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:10.350755 2566 generic.go:358] "Generic (PLEG): container finished" podID="83b47ba2-db8b-4e97-a3b0-1722dcd7d468" containerID="03fb16d9af35b2633de9a30ff368b7ea51314d32120d5ff7667dc5af6dfeee34" exitCode=0
Apr 20 07:04:10.351757 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:10.351733 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ph9n4" event={"ID":"83b47ba2-db8b-4e97-a3b0-1722dcd7d468","Type":"ContainerDied","Data":"03fb16d9af35b2633de9a30ff368b7ea51314d32120d5ff7667dc5af6dfeee34"}
Apr 20 07:04:11.356146 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:11.356096 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ph9n4" event={"ID":"83b47ba2-db8b-4e97-a3b0-1722dcd7d468","Type":"ContainerStarted","Data":"f483516497ec27f0bdc231c86a89eb694366821a44d228e23d0bb1f3d571c6c9"}
Apr 20 07:04:11.356146 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:11.356151 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ph9n4" event={"ID":"83b47ba2-db8b-4e97-a3b0-1722dcd7d468","Type":"ContainerStarted","Data":"60b0938effa1bce591bfa92394b78d5a7b1738a89fe060f4e1932f557f4d4177"}
Apr 20 07:04:14.081951 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:14.081894 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ph9n4" podStartSLOduration=11.34177486 podStartE2EDuration="12.081875177s" podCreationTimestamp="2026-04-20 07:04:02 +0000 UTC" firstStartedPulling="2026-04-20 07:04:08.989005796 +0000 UTC m=+91.632314924" lastFinishedPulling="2026-04-20 07:04:09.729106109 +0000 UTC m=+92.372415241" observedRunningTime="2026-04-20 07:04:11.415241038 +0000 UTC m=+94.058550188" watchObservedRunningTime="2026-04-20 07:04:14.081875177 +0000 UTC m=+96.725184326"
Apr 20 07:04:14.083700 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:14.083673 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-747586d489-dx7j9"]
Apr 20 07:04:15.223039 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:15.223006 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-7cnrw"
Apr 20 07:04:29.377645 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.377581 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-66865cf58f-tpps7" podUID="9a192bbd-08cd-46e9-a6b0-103feb11b910" containerName="console" containerID="cri-o://99e6395e1174c70758b4d637b5934bba480a2cbd65a7ea567b039c2045b3b8b7" gracePeriod=15
Apr 20 07:04:29.644944 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.644918 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66865cf58f-tpps7_9a192bbd-08cd-46e9-a6b0-103feb11b910/console/0.log"
Apr 20 07:04:29.645149 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.645002 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66865cf58f-tpps7"
Apr 20 07:04:29.734104 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.734039 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-oauth-config\") pod \"9a192bbd-08cd-46e9-a6b0-103feb11b910\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") "
Apr 20 07:04:29.734290 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.734136 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-oauth-serving-cert\") pod \"9a192bbd-08cd-46e9-a6b0-103feb11b910\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") "
Apr 20 07:04:29.734290 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.734200 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-service-ca\") pod \"9a192bbd-08cd-46e9-a6b0-103feb11b910\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") "
Apr 20 07:04:29.734383 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.734342 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-config\") pod \"9a192bbd-08cd-46e9-a6b0-103feb11b910\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") "
Apr 20 07:04:29.734427 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.734417 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqm7t\" (UniqueName: \"kubernetes.io/projected/9a192bbd-08cd-46e9-a6b0-103feb11b910-kube-api-access-rqm7t\") pod \"9a192bbd-08cd-46e9-a6b0-103feb11b910\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") "
Apr 20 07:04:29.734520 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.734460 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-serving-cert\") pod \"9a192bbd-08cd-46e9-a6b0-103feb11b910\" (UID: \"9a192bbd-08cd-46e9-a6b0-103feb11b910\") "
Apr 20 07:04:29.734673 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.734645 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9a192bbd-08cd-46e9-a6b0-103feb11b910" (UID: "9a192bbd-08cd-46e9-a6b0-103feb11b910"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 07:04:29.734673 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.734651 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-service-ca" (OuterVolumeSpecName: "service-ca") pod "9a192bbd-08cd-46e9-a6b0-103feb11b910" (UID: "9a192bbd-08cd-46e9-a6b0-103feb11b910"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 07:04:29.734819 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.734714 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-config" (OuterVolumeSpecName: "console-config") pod "9a192bbd-08cd-46e9-a6b0-103feb11b910" (UID: "9a192bbd-08cd-46e9-a6b0-103feb11b910"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 07:04:29.736785 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.736737 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9a192bbd-08cd-46e9-a6b0-103feb11b910" (UID: "9a192bbd-08cd-46e9-a6b0-103feb11b910"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 07:04:29.736785 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.736763 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9a192bbd-08cd-46e9-a6b0-103feb11b910" (UID: "9a192bbd-08cd-46e9-a6b0-103feb11b910"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 07:04:29.736785 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.736736 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a192bbd-08cd-46e9-a6b0-103feb11b910-kube-api-access-rqm7t" (OuterVolumeSpecName: "kube-api-access-rqm7t") pod "9a192bbd-08cd-46e9-a6b0-103feb11b910" (UID: "9a192bbd-08cd-46e9-a6b0-103feb11b910"). InnerVolumeSpecName "kube-api-access-rqm7t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:04:29.835414 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.835379 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-oauth-config\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:04:29.835414 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.835410 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-oauth-serving-cert\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:04:29.835414 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.835421 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-service-ca\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:04:29.835661 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.835430 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-config\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:04:29.835661 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.835441 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rqm7t\" (UniqueName: \"kubernetes.io/projected/9a192bbd-08cd-46e9-a6b0-103feb11b910-kube-api-access-rqm7t\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:04:29.835661 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:29.835449 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a192bbd-08cd-46e9-a6b0-103feb11b910-console-serving-cert\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:04:30.412796 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:30.412766 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66865cf58f-tpps7_9a192bbd-08cd-46e9-a6b0-103feb11b910/console/0.log"
Apr 20 07:04:30.413362 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:30.412807 2566 generic.go:358] "Generic (PLEG): container finished" podID="9a192bbd-08cd-46e9-a6b0-103feb11b910" containerID="99e6395e1174c70758b4d637b5934bba480a2cbd65a7ea567b039c2045b3b8b7" exitCode=2
Apr 20 07:04:30.413362 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:30.412872 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66865cf58f-tpps7" event={"ID":"9a192bbd-08cd-46e9-a6b0-103feb11b910","Type":"ContainerDied","Data":"99e6395e1174c70758b4d637b5934bba480a2cbd65a7ea567b039c2045b3b8b7"}
Apr 20 07:04:30.413362 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:30.412876 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66865cf58f-tpps7"
Apr 20 07:04:30.413362 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:30.412902 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66865cf58f-tpps7" event={"ID":"9a192bbd-08cd-46e9-a6b0-103feb11b910","Type":"ContainerDied","Data":"872d922d928395fe956d40e79de3a2d30b794f45f82198fce9eb642f303e3484"}
Apr 20 07:04:30.413362 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:30.412931 2566 scope.go:117] "RemoveContainer" containerID="99e6395e1174c70758b4d637b5934bba480a2cbd65a7ea567b039c2045b3b8b7"
Apr 20 07:04:30.421046 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:30.421027 2566 scope.go:117] "RemoveContainer" containerID="99e6395e1174c70758b4d637b5934bba480a2cbd65a7ea567b039c2045b3b8b7"
Apr 20 07:04:30.421322 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:04:30.421301 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99e6395e1174c70758b4d637b5934bba480a2cbd65a7ea567b039c2045b3b8b7\": container with ID starting with 99e6395e1174c70758b4d637b5934bba480a2cbd65a7ea567b039c2045b3b8b7 not found: ID does not exist" containerID="99e6395e1174c70758b4d637b5934bba480a2cbd65a7ea567b039c2045b3b8b7"
Apr 20 07:04:30.421368 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:30.421331 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99e6395e1174c70758b4d637b5934bba480a2cbd65a7ea567b039c2045b3b8b7"} err="failed to get container status \"99e6395e1174c70758b4d637b5934bba480a2cbd65a7ea567b039c2045b3b8b7\": rpc error: code = NotFound desc = could not find container \"99e6395e1174c70758b4d637b5934bba480a2cbd65a7ea567b039c2045b3b8b7\": container with ID starting with 99e6395e1174c70758b4d637b5934bba480a2cbd65a7ea567b039c2045b3b8b7 not found: ID does not exist"
Apr 20 07:04:30.432561 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:30.432534 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66865cf58f-tpps7"]
Apr 20 07:04:30.442299 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:30.442274 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66865cf58f-tpps7"]
Apr 20 07:04:31.422104 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:31.421985 2566 generic.go:358] "Generic (PLEG): container finished" podID="de3d8741-a673-4d3c-9bd2-323788b79b5a" containerID="a566252125bfb0086ec3f16ee828a58c8510288d0b33880a4a254fc7b03796cf" exitCode=0
Apr 20 07:04:31.422667 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:31.422639 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9" event={"ID":"de3d8741-a673-4d3c-9bd2-323788b79b5a","Type":"ContainerDied","Data":"a566252125bfb0086ec3f16ee828a58c8510288d0b33880a4a254fc7b03796cf"}
Apr 20 07:04:31.423253 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:31.423226 2566 scope.go:117] "RemoveContainer" containerID="a566252125bfb0086ec3f16ee828a58c8510288d0b33880a4a254fc7b03796cf"
Apr 20 07:04:31.939907 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:31.939869 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a192bbd-08cd-46e9-a6b0-103feb11b910" path="/var/lib/kubelet/pods/9a192bbd-08cd-46e9-a6b0-103feb11b910/volumes"
Apr 20 07:04:32.429382 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:32.429344 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ttgx9" event={"ID":"de3d8741-a673-4d3c-9bd2-323788b79b5a","Type":"ContainerStarted","Data":"74b287a7532c20a0a5bab3afcf3304c2079f24732ef4a3bd3ba86a15c39070d8"}
Apr 20 07:04:39.105820 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.105753 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-747586d489-dx7j9" podUID="42020703-9cbf-4ede-8527-45cfaca47cf6" containerName="registry" containerID="cri-o://f39f9ea81583aea12b9880a8356057198a27749f41b3d8ee7b14133c66e95b2e" gracePeriod=30
Apr 20 07:04:39.368268 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.368238 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-747586d489-dx7j9"
Apr 20 07:04:39.413904 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.413860 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-bound-sa-token\") pod \"42020703-9cbf-4ede-8527-45cfaca47cf6\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") "
Apr 20 07:04:39.414054 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.413919 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-certificates\") pod \"42020703-9cbf-4ede-8527-45cfaca47cf6\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") "
Apr 20 07:04:39.414054 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.413968 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/42020703-9cbf-4ede-8527-45cfaca47cf6-installation-pull-secrets\") pod \"42020703-9cbf-4ede-8527-45cfaca47cf6\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") "
Apr 20 07:04:39.414054 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.414000 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/42020703-9cbf-4ede-8527-45cfaca47cf6-ca-trust-extracted\") pod \"42020703-9cbf-4ede-8527-45cfaca47cf6\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") "
Apr 20 07:04:39.414054 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.414026 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42020703-9cbf-4ede-8527-45cfaca47cf6-trusted-ca\") pod \"42020703-9cbf-4ede-8527-45cfaca47cf6\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") "
Apr 20 07:04:39.414054 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.414050 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zm6q\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-kube-api-access-7zm6q\") pod \"42020703-9cbf-4ede-8527-45cfaca47cf6\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") "
Apr 20 07:04:39.414330 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.414107 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/42020703-9cbf-4ede-8527-45cfaca47cf6-image-registry-private-configuration\") pod \"42020703-9cbf-4ede-8527-45cfaca47cf6\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") "
Apr 20 07:04:39.414330 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.414148 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls\") pod \"42020703-9cbf-4ede-8527-45cfaca47cf6\" (UID: \"42020703-9cbf-4ede-8527-45cfaca47cf6\") "
Apr 20 07:04:39.414484 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.414451 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "42020703-9cbf-4ede-8527-45cfaca47cf6" (UID: "42020703-9cbf-4ede-8527-45cfaca47cf6"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 07:04:39.414547 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.414525 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42020703-9cbf-4ede-8527-45cfaca47cf6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "42020703-9cbf-4ede-8527-45cfaca47cf6" (UID: "42020703-9cbf-4ede-8527-45cfaca47cf6"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 07:04:39.416835 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.416795 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "42020703-9cbf-4ede-8527-45cfaca47cf6" (UID: "42020703-9cbf-4ede-8527-45cfaca47cf6"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:04:39.417269 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.417244 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "42020703-9cbf-4ede-8527-45cfaca47cf6" (UID: "42020703-9cbf-4ede-8527-45cfaca47cf6"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:04:39.417269 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.417255 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-kube-api-access-7zm6q" (OuterVolumeSpecName: "kube-api-access-7zm6q") pod "42020703-9cbf-4ede-8527-45cfaca47cf6" (UID: "42020703-9cbf-4ede-8527-45cfaca47cf6"). InnerVolumeSpecName "kube-api-access-7zm6q".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:04:39.417418 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.417379 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42020703-9cbf-4ede-8527-45cfaca47cf6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "42020703-9cbf-4ede-8527-45cfaca47cf6" (UID: "42020703-9cbf-4ede-8527-45cfaca47cf6"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:04:39.417558 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.417533 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42020703-9cbf-4ede-8527-45cfaca47cf6-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "42020703-9cbf-4ede-8527-45cfaca47cf6" (UID: "42020703-9cbf-4ede-8527-45cfaca47cf6"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:04:39.424826 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.424801 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42020703-9cbf-4ede-8527-45cfaca47cf6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "42020703-9cbf-4ede-8527-45cfaca47cf6" (UID: "42020703-9cbf-4ede-8527-45cfaca47cf6"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:04:39.450139 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.450099 2566 generic.go:358] "Generic (PLEG): container finished" podID="42020703-9cbf-4ede-8527-45cfaca47cf6" containerID="f39f9ea81583aea12b9880a8356057198a27749f41b3d8ee7b14133c66e95b2e" exitCode=0 Apr 20 07:04:39.450311 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.450192 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747586d489-dx7j9" event={"ID":"42020703-9cbf-4ede-8527-45cfaca47cf6","Type":"ContainerDied","Data":"f39f9ea81583aea12b9880a8356057198a27749f41b3d8ee7b14133c66e95b2e"} Apr 20 07:04:39.450311 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.450230 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747586d489-dx7j9" event={"ID":"42020703-9cbf-4ede-8527-45cfaca47cf6","Type":"ContainerDied","Data":"2351609407b3e9467baacf8750b896aebcc707284e8b5e5b764d77cdf0e0e89a"} Apr 20 07:04:39.450311 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.450249 2566 scope.go:117] "RemoveContainer" containerID="f39f9ea81583aea12b9880a8356057198a27749f41b3d8ee7b14133c66e95b2e" Apr 20 07:04:39.450311 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.450198 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-747586d489-dx7j9" Apr 20 07:04:39.458861 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.458834 2566 scope.go:117] "RemoveContainer" containerID="f39f9ea81583aea12b9880a8356057198a27749f41b3d8ee7b14133c66e95b2e" Apr 20 07:04:39.459229 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:04:39.459206 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f39f9ea81583aea12b9880a8356057198a27749f41b3d8ee7b14133c66e95b2e\": container with ID starting with f39f9ea81583aea12b9880a8356057198a27749f41b3d8ee7b14133c66e95b2e not found: ID does not exist" containerID="f39f9ea81583aea12b9880a8356057198a27749f41b3d8ee7b14133c66e95b2e" Apr 20 07:04:39.459302 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.459239 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f39f9ea81583aea12b9880a8356057198a27749f41b3d8ee7b14133c66e95b2e"} err="failed to get container status \"f39f9ea81583aea12b9880a8356057198a27749f41b3d8ee7b14133c66e95b2e\": rpc error: code = NotFound desc = could not find container \"f39f9ea81583aea12b9880a8356057198a27749f41b3d8ee7b14133c66e95b2e\": container with ID starting with f39f9ea81583aea12b9880a8356057198a27749f41b3d8ee7b14133c66e95b2e not found: ID does not exist" Apr 20 07:04:39.511573 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.511541 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-747586d489-dx7j9"] Apr 20 07:04:39.515730 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.515697 2566 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-bound-sa-token\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:04:39.515877 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.515734 2566 
reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-certificates\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:04:39.515877 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.515753 2566 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/42020703-9cbf-4ede-8527-45cfaca47cf6-installation-pull-secrets\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:04:39.515877 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.515758 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-747586d489-dx7j9"] Apr 20 07:04:39.515877 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.515768 2566 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/42020703-9cbf-4ede-8527-45cfaca47cf6-ca-trust-extracted\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:04:39.515877 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.515784 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42020703-9cbf-4ede-8527-45cfaca47cf6-trusted-ca\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:04:39.515877 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.515798 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7zm6q\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-kube-api-access-7zm6q\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:04:39.515877 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.515813 2566 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/42020703-9cbf-4ede-8527-45cfaca47cf6-image-registry-private-configuration\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:04:39.515877 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.515826 2566 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42020703-9cbf-4ede-8527-45cfaca47cf6-registry-tls\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:04:39.939513 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:39.939478 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42020703-9cbf-4ede-8527-45cfaca47cf6" path="/var/lib/kubelet/pods/42020703-9cbf-4ede-8527-45cfaca47cf6/volumes" Apr 20 07:04:46.470258 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:46.470222 2566 generic.go:358] "Generic (PLEG): container finished" podID="31731af1-46d3-416a-98dc-3db15ceaab73" containerID="13fe7a614cba52118a56acbe8d22c60353e92444603772930ddb5d3ae479ac46" exitCode=0 Apr 20 07:04:46.470676 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:46.470291 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-jlbv6" event={"ID":"31731af1-46d3-416a-98dc-3db15ceaab73","Type":"ContainerDied","Data":"13fe7a614cba52118a56acbe8d22c60353e92444603772930ddb5d3ae479ac46"} Apr 20 07:04:46.470676 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:46.470606 2566 scope.go:117] "RemoveContainer" containerID="13fe7a614cba52118a56acbe8d22c60353e92444603772930ddb5d3ae479ac46" Apr 20 07:04:47.474853 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:47.474817 2566 generic.go:358] "Generic (PLEG): container finished" podID="84355a5f-ecc3-4e60-b0d3-c5dc15f059c6" containerID="2ce5a1bfffcc2b7441ec37c2d1f0a2f36d2a9abafa3319679ddb751525fca817" exitCode=0 Apr 20 07:04:47.475310 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:47.474898 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4" event={"ID":"84355a5f-ecc3-4e60-b0d3-c5dc15f059c6","Type":"ContainerDied","Data":"2ce5a1bfffcc2b7441ec37c2d1f0a2f36d2a9abafa3319679ddb751525fca817"} Apr 20 07:04:47.475380 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:47.475305 2566 scope.go:117] "RemoveContainer" containerID="2ce5a1bfffcc2b7441ec37c2d1f0a2f36d2a9abafa3319679ddb751525fca817" Apr 20 07:04:47.476814 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:47.476793 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-jlbv6" event={"ID":"31731af1-46d3-416a-98dc-3db15ceaab73","Type":"ContainerStarted","Data":"f35ad1e3bcb0af45f9760e4b55e055df7c69c76ba8b1f82909b56fbb556d6ffe"} Apr 20 07:04:48.481003 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:04:48.480966 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhwn4" event={"ID":"84355a5f-ecc3-4e60-b0d3-c5dc15f059c6","Type":"ContainerStarted","Data":"3bcb1a1fb9019f462461601b352ab5456114b92db4327ac5ed92a34a76b1d33c"} Apr 20 07:07:37.877545 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:07:37.877517 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log" Apr 20 07:07:37.877545 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:07:37.877548 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log" Apr 20 07:07:37.882091 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:07:37.882048 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log" Apr 20 07:07:37.882247 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:07:37.882165 2566 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log" Apr 20 07:07:37.887988 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:07:37.887965 2566 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 07:09:17.390090 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.390038 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6bccc5cd7b-qlpv4"] Apr 20 07:09:17.390612 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.390456 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42020703-9cbf-4ede-8527-45cfaca47cf6" containerName="registry" Apr 20 07:09:17.390612 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.390471 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="42020703-9cbf-4ede-8527-45cfaca47cf6" containerName="registry" Apr 20 07:09:17.390612 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.390480 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a192bbd-08cd-46e9-a6b0-103feb11b910" containerName="console" Apr 20 07:09:17.390612 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.390486 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a192bbd-08cd-46e9-a6b0-103feb11b910" containerName="console" Apr 20 07:09:17.390612 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.390547 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a192bbd-08cd-46e9-a6b0-103feb11b910" containerName="console" Apr 20 07:09:17.390612 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.390558 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="42020703-9cbf-4ede-8527-45cfaca47cf6" containerName="registry" Apr 20 07:09:17.393981 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.393955 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.397093 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.397037 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 07:09:17.397246 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.397040 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 07:09:17.398151 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.398129 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 07:09:17.398457 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.398432 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 07:09:17.398571 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.398439 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 07:09:17.398571 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.398446 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-n4xjx\"" Apr 20 07:09:17.402795 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.402762 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 07:09:17.405395 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.405368 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bccc5cd7b-qlpv4"] Apr 20 07:09:17.499292 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.499247 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-serving-cert\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.499292 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.499292 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-oauth-config\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.499540 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.499316 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-config\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.499540 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.499372 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-service-ca\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.499540 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.499449 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-trusted-ca-bundle\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.499540 ip-10-0-142-100 kubenswrapper[2566]: I0420 
07:09:17.499478 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-oauth-serving-cert\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.499540 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.499498 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9kzt\" (UniqueName: \"kubernetes.io/projected/aa7fdfa9-eb3b-42d4-a619-732f944c63af-kube-api-access-j9kzt\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.600206 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.600167 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-serving-cert\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.600206 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.600210 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-oauth-config\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.600460 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.600229 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-config\") pod \"console-6bccc5cd7b-qlpv4\" (UID: 
\"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.600460 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.600352 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-service-ca\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.600460 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.600404 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-trusted-ca-bundle\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.600460 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.600430 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-oauth-serving-cert\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.600460 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.600458 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9kzt\" (UniqueName: \"kubernetes.io/projected/aa7fdfa9-eb3b-42d4-a619-732f944c63af-kube-api-access-j9kzt\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.601229 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.601202 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-service-ca\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.601346 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.601328 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-trusted-ca-bundle\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.601408 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.601385 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-config\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.601583 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.601559 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-oauth-serving-cert\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.602912 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.602891 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-oauth-config\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.603108 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.603089 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-serving-cert\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.611499 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.611469 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9kzt\" (UniqueName: \"kubernetes.io/projected/aa7fdfa9-eb3b-42d4-a619-732f944c63af-kube-api-access-j9kzt\") pod \"console-6bccc5cd7b-qlpv4\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") " pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.706919 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.706823 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:17.836497 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.836457 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bccc5cd7b-qlpv4"] Apr 20 07:09:17.839266 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:09:17.839231 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa7fdfa9_eb3b_42d4_a619_732f944c63af.slice/crio-9fb3770b8ddde2742e400d07b067ec8932f9b68b30f4de4704ab7912007ad768 WatchSource:0}: Error finding container 9fb3770b8ddde2742e400d07b067ec8932f9b68b30f4de4704ab7912007ad768: Status 404 returned error can't find the container with id 9fb3770b8ddde2742e400d07b067ec8932f9b68b30f4de4704ab7912007ad768 Apr 20 07:09:17.841002 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:17.840985 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 07:09:18.225921 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:18.225882 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bccc5cd7b-qlpv4" 
event={"ID":"aa7fdfa9-eb3b-42d4-a619-732f944c63af","Type":"ContainerStarted","Data":"1f7658e6b6a02b1751fd285489205226cc739b3822af91770977afea4d128d0c"} Apr 20 07:09:18.225921 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:18.225923 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bccc5cd7b-qlpv4" event={"ID":"aa7fdfa9-eb3b-42d4-a619-732f944c63af","Type":"ContainerStarted","Data":"9fb3770b8ddde2742e400d07b067ec8932f9b68b30f4de4704ab7912007ad768"} Apr 20 07:09:18.250257 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:18.250199 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6bccc5cd7b-qlpv4" podStartSLOduration=1.2501815569999999 podStartE2EDuration="1.250181557s" podCreationTimestamp="2026-04-20 07:09:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:09:18.248808895 +0000 UTC m=+400.892118060" watchObservedRunningTime="2026-04-20 07:09:18.250181557 +0000 UTC m=+400.893490706" Apr 20 07:09:27.707392 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:27.707351 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:27.707392 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:27.707392 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:27.712160 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:27.712134 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:09:28.261740 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:09:28.261705 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6bccc5cd7b-qlpv4" Apr 20 07:10:29.314589 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.314544 
2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-d8f479686-jmvqh"]
Apr 20 07:10:29.317954 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.317927 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.335044 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.335006 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d8f479686-jmvqh"]
Apr 20 07:10:29.362235 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.362189 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-service-ca\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.362428 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.362280 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tx6g\" (UniqueName: \"kubernetes.io/projected/43716734-2ee4-4d8d-a59f-7acea7d37be7-kube-api-access-8tx6g\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.362428 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.362334 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-oauth-serving-cert\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.362428 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.362378 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-serving-cert\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.362428 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.362410 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-oauth-config\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.362428 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.362431 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-trusted-ca-bundle\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.362677 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.362480 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-config\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.463387 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.463346 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-oauth-serving-cert\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.463580 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.463402 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-serving-cert\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.463580 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.463418 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-oauth-config\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.463580 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.463441 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-trusted-ca-bundle\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.463580 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.463486 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-config\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.463580 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.463512 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-service-ca\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.463580 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.463549 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8tx6g\" (UniqueName: \"kubernetes.io/projected/43716734-2ee4-4d8d-a59f-7acea7d37be7-kube-api-access-8tx6g\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.464174 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.464149 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-oauth-serving-cert\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.464330 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.464307 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-config\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.464369 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.464341 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-service-ca\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.464848 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.464828 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-trusted-ca-bundle\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.466270 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.466235 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-oauth-config\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.466449 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.466430 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-serving-cert\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.489337 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.489303 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tx6g\" (UniqueName: \"kubernetes.io/projected/43716734-2ee4-4d8d-a59f-7acea7d37be7-kube-api-access-8tx6g\") pod \"console-d8f479686-jmvqh\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.630131 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.630006 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:29.767007 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:29.766982 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d8f479686-jmvqh"]
Apr 20 07:10:29.769428 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:10:29.769395 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43716734_2ee4_4d8d_a59f_7acea7d37be7.slice/crio-fb20eb047760424977c53976d5a20293f61b9cdb6dd91e993e99bcb4ad925ddc WatchSource:0}: Error finding container fb20eb047760424977c53976d5a20293f61b9cdb6dd91e993e99bcb4ad925ddc: Status 404 returned error can't find the container with id fb20eb047760424977c53976d5a20293f61b9cdb6dd91e993e99bcb4ad925ddc
Apr 20 07:10:30.434257 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:30.434218 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d8f479686-jmvqh" event={"ID":"43716734-2ee4-4d8d-a59f-7acea7d37be7","Type":"ContainerStarted","Data":"b9a423ba6890bb808db37da3c71623780e2503c6a52718b54ef791f538281d11"}
Apr 20 07:10:30.434257 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:30.434259 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d8f479686-jmvqh" event={"ID":"43716734-2ee4-4d8d-a59f-7acea7d37be7","Type":"ContainerStarted","Data":"fb20eb047760424977c53976d5a20293f61b9cdb6dd91e993e99bcb4ad925ddc"}
Apr 20 07:10:30.455501 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:30.455448 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d8f479686-jmvqh" podStartSLOduration=1.455433088 podStartE2EDuration="1.455433088s" podCreationTimestamp="2026-04-20 07:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:10:30.454506832 +0000 UTC m=+473.097815982" watchObservedRunningTime="2026-04-20 07:10:30.455433088 +0000 UTC m=+473.098742239"
Apr 20 07:10:39.630287 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:39.630245 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:39.630754 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:39.630413 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:39.635336 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:39.635311 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:40.470085 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:40.470028 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d8f479686-jmvqh"
Apr 20 07:10:40.525995 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:10:40.525956 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bccc5cd7b-qlpv4"]
Apr 20 07:11:05.550091 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.550020 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6bccc5cd7b-qlpv4" podUID="aa7fdfa9-eb3b-42d4-a619-732f944c63af" containerName="console" containerID="cri-o://1f7658e6b6a02b1751fd285489205226cc739b3822af91770977afea4d128d0c" gracePeriod=15
Apr 20 07:11:05.787932 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.787909 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bccc5cd7b-qlpv4_aa7fdfa9-eb3b-42d4-a619-732f944c63af/console/0.log"
Apr 20 07:11:05.788099 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.787970 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bccc5cd7b-qlpv4"
Apr 20 07:11:05.863303 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.863210 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-oauth-serving-cert\") pod \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") "
Apr 20 07:11:05.863303 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.863273 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-oauth-config\") pod \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") "
Apr 20 07:11:05.863303 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.863292 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9kzt\" (UniqueName: \"kubernetes.io/projected/aa7fdfa9-eb3b-42d4-a619-732f944c63af-kube-api-access-j9kzt\") pod \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") "
Apr 20 07:11:05.863565 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.863323 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-config\") pod \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") "
Apr 20 07:11:05.863565 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.863368 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-trusted-ca-bundle\") pod \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") "
Apr 20 07:11:05.863565 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.863452 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-serving-cert\") pod \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") "
Apr 20 07:11:05.863565 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.863506 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-service-ca\") pod \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\" (UID: \"aa7fdfa9-eb3b-42d4-a619-732f944c63af\") "
Apr 20 07:11:05.863807 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.863780 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "aa7fdfa9-eb3b-42d4-a619-732f944c63af" (UID: "aa7fdfa9-eb3b-42d4-a619-732f944c63af"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 07:11:05.863871 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.863841 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "aa7fdfa9-eb3b-42d4-a619-732f944c63af" (UID: "aa7fdfa9-eb3b-42d4-a619-732f944c63af"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 07:11:05.863966 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.863938 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-config" (OuterVolumeSpecName: "console-config") pod "aa7fdfa9-eb3b-42d4-a619-732f944c63af" (UID: "aa7fdfa9-eb3b-42d4-a619-732f944c63af"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 07:11:05.864017 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.864000 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-service-ca" (OuterVolumeSpecName: "service-ca") pod "aa7fdfa9-eb3b-42d4-a619-732f944c63af" (UID: "aa7fdfa9-eb3b-42d4-a619-732f944c63af"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 07:11:05.865839 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.865808 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "aa7fdfa9-eb3b-42d4-a619-732f944c63af" (UID: "aa7fdfa9-eb3b-42d4-a619-732f944c63af"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 07:11:05.866142 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.866119 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "aa7fdfa9-eb3b-42d4-a619-732f944c63af" (UID: "aa7fdfa9-eb3b-42d4-a619-732f944c63af"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 07:11:05.866215 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.866156 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa7fdfa9-eb3b-42d4-a619-732f944c63af-kube-api-access-j9kzt" (OuterVolumeSpecName: "kube-api-access-j9kzt") pod "aa7fdfa9-eb3b-42d4-a619-732f944c63af" (UID: "aa7fdfa9-eb3b-42d4-a619-732f944c63af"). InnerVolumeSpecName "kube-api-access-j9kzt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:11:05.964281 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.964238 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-oauth-config\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:11:05.964281 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.964273 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j9kzt\" (UniqueName: \"kubernetes.io/projected/aa7fdfa9-eb3b-42d4-a619-732f944c63af-kube-api-access-j9kzt\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:11:05.964281 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.964284 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-config\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:11:05.964281 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.964292 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-trusted-ca-bundle\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:11:05.964540 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.964302 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa7fdfa9-eb3b-42d4-a619-732f944c63af-console-serving-cert\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:11:05.964540 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.964311 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-service-ca\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:11:05.964540 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:05.964319 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa7fdfa9-eb3b-42d4-a619-732f944c63af-oauth-serving-cert\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:11:06.541741 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:06.541712 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bccc5cd7b-qlpv4_aa7fdfa9-eb3b-42d4-a619-732f944c63af/console/0.log"
Apr 20 07:11:06.541907 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:06.541753 2566 generic.go:358] "Generic (PLEG): container finished" podID="aa7fdfa9-eb3b-42d4-a619-732f944c63af" containerID="1f7658e6b6a02b1751fd285489205226cc739b3822af91770977afea4d128d0c" exitCode=2
Apr 20 07:11:06.541907 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:06.541789 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bccc5cd7b-qlpv4" event={"ID":"aa7fdfa9-eb3b-42d4-a619-732f944c63af","Type":"ContainerDied","Data":"1f7658e6b6a02b1751fd285489205226cc739b3822af91770977afea4d128d0c"}
Apr 20 07:11:06.541907 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:06.541810 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bccc5cd7b-qlpv4" event={"ID":"aa7fdfa9-eb3b-42d4-a619-732f944c63af","Type":"ContainerDied","Data":"9fb3770b8ddde2742e400d07b067ec8932f9b68b30f4de4704ab7912007ad768"}
Apr 20 07:11:06.541907 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:06.541823 2566 scope.go:117] "RemoveContainer" containerID="1f7658e6b6a02b1751fd285489205226cc739b3822af91770977afea4d128d0c"
Apr 20 07:11:06.541907 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:06.541821 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bccc5cd7b-qlpv4"
Apr 20 07:11:06.550106 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:06.550079 2566 scope.go:117] "RemoveContainer" containerID="1f7658e6b6a02b1751fd285489205226cc739b3822af91770977afea4d128d0c"
Apr 20 07:11:06.550456 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:11:06.550379 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f7658e6b6a02b1751fd285489205226cc739b3822af91770977afea4d128d0c\": container with ID starting with 1f7658e6b6a02b1751fd285489205226cc739b3822af91770977afea4d128d0c not found: ID does not exist" containerID="1f7658e6b6a02b1751fd285489205226cc739b3822af91770977afea4d128d0c"
Apr 20 07:11:06.550456 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:06.550407 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f7658e6b6a02b1751fd285489205226cc739b3822af91770977afea4d128d0c"} err="failed to get container status \"1f7658e6b6a02b1751fd285489205226cc739b3822af91770977afea4d128d0c\": rpc error: code = NotFound desc = could not find container \"1f7658e6b6a02b1751fd285489205226cc739b3822af91770977afea4d128d0c\": container with ID starting with 1f7658e6b6a02b1751fd285489205226cc739b3822af91770977afea4d128d0c not found: ID does not exist"
Apr 20 07:11:06.564301 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:06.564272 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bccc5cd7b-qlpv4"]
Apr 20 07:11:06.574199 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:06.574170 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6bccc5cd7b-qlpv4"]
Apr 20 07:11:07.943678 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:07.943640 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa7fdfa9-eb3b-42d4-a619-732f944c63af" path="/var/lib/kubelet/pods/aa7fdfa9-eb3b-42d4-a619-732f944c63af/volumes"
Apr 20 07:11:31.428010 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.427925 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2"]
Apr 20 07:11:31.428375 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.428227 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa7fdfa9-eb3b-42d4-a619-732f944c63af" containerName="console"
Apr 20 07:11:31.428375 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.428237 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa7fdfa9-eb3b-42d4-a619-732f944c63af" containerName="console"
Apr 20 07:11:31.428375 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.428295 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa7fdfa9-eb3b-42d4-a619-732f944c63af" containerName="console"
Apr 20 07:11:31.431011 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.430993 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2"
Apr 20 07:11:31.434801 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.434776 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 07:11:31.434964 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.434947 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lzfnq\""
Apr 20 07:11:31.435838 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.435824 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 07:11:31.444424 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.444404 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2"]
Apr 20 07:11:31.466834 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.466805 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2\" (UID: \"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2"
Apr 20 07:11:31.466969 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.466886 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2\" (UID: \"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2"
Apr 20 07:11:31.466969 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.466911 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbltw\" (UniqueName: \"kubernetes.io/projected/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-kube-api-access-nbltw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2\" (UID: \"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2"
Apr 20 07:11:31.567973 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.567939 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2\" (UID: \"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2"
Apr 20 07:11:31.568157 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.568026 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2\" (UID: \"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2"
Apr 20 07:11:31.568157 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.568050 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbltw\" (UniqueName: \"kubernetes.io/projected/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-kube-api-access-nbltw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2\" (UID: \"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2"
Apr 20 07:11:31.568363 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.568343 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2\" (UID: \"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2"
Apr 20 07:11:31.568412 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.568391 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2\" (UID: \"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2"
Apr 20 07:11:31.578756 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.578726 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbltw\" (UniqueName: \"kubernetes.io/projected/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-kube-api-access-nbltw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2\" (UID: \"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2"
Apr 20 07:11:31.739997 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.739895 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2"
Apr 20 07:11:31.865977 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:31.865933 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2"]
Apr 20 07:11:31.870129 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:11:31.870087 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdea5ccc_0aa3_4367_8d73_a331c70b3e3c.slice/crio-4efde427ccf43deba975610473c94533a6455f3fa514596da5297ad013b38b19 WatchSource:0}: Error finding container 4efde427ccf43deba975610473c94533a6455f3fa514596da5297ad013b38b19: Status 404 returned error can't find the container with id 4efde427ccf43deba975610473c94533a6455f3fa514596da5297ad013b38b19
Apr 20 07:11:32.619967 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:32.619922 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2" event={"ID":"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c","Type":"ContainerStarted","Data":"4efde427ccf43deba975610473c94533a6455f3fa514596da5297ad013b38b19"}
Apr 20 07:11:39.640572 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:39.640532 2566 generic.go:358] "Generic (PLEG): container finished" podID="bdea5ccc-0aa3-4367-8d73-a331c70b3e3c" containerID="0f8ee9e5d16f28bc65934f89ce5c29edd2ec526147754ecba8abd190516de10e" exitCode=0
Apr 20 07:11:39.641038 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:39.640621 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2" event={"ID":"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c","Type":"ContainerDied","Data":"0f8ee9e5d16f28bc65934f89ce5c29edd2ec526147754ecba8abd190516de10e"}
Apr 20 07:11:41.647688 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:41.647653 2566 generic.go:358] "Generic (PLEG): container finished" podID="bdea5ccc-0aa3-4367-8d73-a331c70b3e3c" containerID="7a1037521e7ee710ef6b72a13fabfee0a22f99ae997eb73ff398926bb0f85c2f" exitCode=0
Apr 20 07:11:41.648100 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:41.647739 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2" event={"ID":"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c","Type":"ContainerDied","Data":"7a1037521e7ee710ef6b72a13fabfee0a22f99ae997eb73ff398926bb0f85c2f"}
Apr 20 07:11:49.674976 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:49.674884 2566 generic.go:358] "Generic (PLEG): container finished" podID="bdea5ccc-0aa3-4367-8d73-a331c70b3e3c" containerID="80dcf3e874a9b3150a01b280d7ffb9d51866ac281bab2b621a8ddc71b9b7cead" exitCode=0
Apr 20 07:11:49.675383 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:49.674975 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2" event={"ID":"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c","Type":"ContainerDied","Data":"80dcf3e874a9b3150a01b280d7ffb9d51866ac281bab2b621a8ddc71b9b7cead"}
Apr 20 07:11:50.797681 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:50.797654 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2"
Apr 20 07:11:50.935266 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:50.935181 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-bundle\") pod \"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c\" (UID: \"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c\") "
Apr 20 07:11:50.935266 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:50.935243 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-util\") pod \"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c\" (UID: \"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c\") "
Apr 20 07:11:50.935266 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:50.935266 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbltw\" (UniqueName: \"kubernetes.io/projected/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-kube-api-access-nbltw\") pod \"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c\" (UID: \"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c\") "
Apr 20 07:11:50.935738 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:50.935701 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-bundle" (OuterVolumeSpecName: "bundle") pod "bdea5ccc-0aa3-4367-8d73-a331c70b3e3c" (UID: "bdea5ccc-0aa3-4367-8d73-a331c70b3e3c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:11:50.937661 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:50.937634 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-kube-api-access-nbltw" (OuterVolumeSpecName: "kube-api-access-nbltw") pod "bdea5ccc-0aa3-4367-8d73-a331c70b3e3c" (UID: "bdea5ccc-0aa3-4367-8d73-a331c70b3e3c"). InnerVolumeSpecName "kube-api-access-nbltw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:11:50.939904 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:50.939882 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-util" (OuterVolumeSpecName: "util") pod "bdea5ccc-0aa3-4367-8d73-a331c70b3e3c" (UID: "bdea5ccc-0aa3-4367-8d73-a331c70b3e3c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:11:51.036555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:51.036514 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-bundle\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:11:51.036555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:51.036546 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-util\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:11:51.036555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:51.036558 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nbltw\" (UniqueName: \"kubernetes.io/projected/bdea5ccc-0aa3-4367-8d73-a331c70b3e3c-kube-api-access-nbltw\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:11:51.681567 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:51.681530 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2" event={"ID":"bdea5ccc-0aa3-4367-8d73-a331c70b3e3c","Type":"ContainerDied","Data":"4efde427ccf43deba975610473c94533a6455f3fa514596da5297ad013b38b19"}
Apr 20 07:11:51.681567 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:51.681558 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rnmh2"
Apr 20 07:11:51.681773 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:11:51.681561 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4efde427ccf43deba975610473c94533a6455f3fa514596da5297ad013b38b19"
Apr 20 07:12:02.439199 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.439163 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-lvsm5"]
Apr 20 07:12:02.439578 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.439442 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdea5ccc-0aa3-4367-8d73-a331c70b3e3c" containerName="util"
Apr 20 07:12:02.439578 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.439453 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdea5ccc-0aa3-4367-8d73-a331c70b3e3c" containerName="util"
Apr 20 07:12:02.439578 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.439461 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdea5ccc-0aa3-4367-8d73-a331c70b3e3c" containerName="pull"
Apr 20 07:12:02.439578 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.439467 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdea5ccc-0aa3-4367-8d73-a331c70b3e3c" containerName="pull"
Apr 20 07:12:02.439578 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.439484 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container"
podUID="bdea5ccc-0aa3-4367-8d73-a331c70b3e3c" containerName="extract" Apr 20 07:12:02.439578 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.439490 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdea5ccc-0aa3-4367-8d73-a331c70b3e3c" containerName="extract" Apr 20 07:12:02.439578 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.439552 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdea5ccc-0aa3-4367-8d73-a331c70b3e3c" containerName="extract" Apr 20 07:12:02.442521 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.442501 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-lvsm5" Apr 20 07:12:02.445531 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.445506 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-j4hxq\"" Apr 20 07:12:02.445668 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.445646 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 20 07:12:02.445837 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.445815 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:12:02.454163 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.454136 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-lvsm5"] Apr 20 07:12:02.527707 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.527667 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/373262c1-708e-4e1e-9b4f-4541fe5a3e52-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-lvsm5\" (UID: 
\"373262c1-708e-4e1e-9b4f-4541fe5a3e52\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-lvsm5" Apr 20 07:12:02.527707 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.527706 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khqmz\" (UniqueName: \"kubernetes.io/projected/373262c1-708e-4e1e-9b4f-4541fe5a3e52-kube-api-access-khqmz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-lvsm5\" (UID: \"373262c1-708e-4e1e-9b4f-4541fe5a3e52\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-lvsm5" Apr 20 07:12:02.629004 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.628960 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/373262c1-708e-4e1e-9b4f-4541fe5a3e52-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-lvsm5\" (UID: \"373262c1-708e-4e1e-9b4f-4541fe5a3e52\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-lvsm5" Apr 20 07:12:02.629004 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.629007 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khqmz\" (UniqueName: \"kubernetes.io/projected/373262c1-708e-4e1e-9b4f-4541fe5a3e52-kube-api-access-khqmz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-lvsm5\" (UID: \"373262c1-708e-4e1e-9b4f-4541fe5a3e52\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-lvsm5" Apr 20 07:12:02.629386 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.629361 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/373262c1-708e-4e1e-9b4f-4541fe5a3e52-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-lvsm5\" (UID: \"373262c1-708e-4e1e-9b4f-4541fe5a3e52\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-lvsm5" Apr 20 07:12:02.639505 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.639462 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khqmz\" (UniqueName: \"kubernetes.io/projected/373262c1-708e-4e1e-9b4f-4541fe5a3e52-kube-api-access-khqmz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-lvsm5\" (UID: \"373262c1-708e-4e1e-9b4f-4541fe5a3e52\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-lvsm5" Apr 20 07:12:02.752571 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.752540 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-lvsm5" Apr 20 07:12:02.882986 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:02.882952 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-lvsm5"] Apr 20 07:12:02.885961 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:12:02.885929 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod373262c1_708e_4e1e_9b4f_4541fe5a3e52.slice/crio-062bd363af2035d8e517ac5368a3db285bed9ca9a54efa091008a8c311f89ff5 WatchSource:0}: Error finding container 062bd363af2035d8e517ac5368a3db285bed9ca9a54efa091008a8c311f89ff5: Status 404 returned error can't find the container with id 062bd363af2035d8e517ac5368a3db285bed9ca9a54efa091008a8c311f89ff5 Apr 20 07:12:03.718927 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:03.718889 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-lvsm5" event={"ID":"373262c1-708e-4e1e-9b4f-4541fe5a3e52","Type":"ContainerStarted","Data":"062bd363af2035d8e517ac5368a3db285bed9ca9a54efa091008a8c311f89ff5"} Apr 20 07:12:04.723795 
ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:04.723757 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-lvsm5" event={"ID":"373262c1-708e-4e1e-9b4f-4541fe5a3e52","Type":"ContainerStarted","Data":"25b8b081c76b2d6235fd2e28ad7280407ad6b44426fe07af62ba0a20605bccd9"} Apr 20 07:12:04.746397 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:04.746337 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-lvsm5" podStartSLOduration=1.10114234 podStartE2EDuration="2.746318442s" podCreationTimestamp="2026-04-20 07:12:02 +0000 UTC" firstStartedPulling="2026-04-20 07:12:02.888357773 +0000 UTC m=+565.531666901" lastFinishedPulling="2026-04-20 07:12:04.533533871 +0000 UTC m=+567.176843003" observedRunningTime="2026-04-20 07:12:04.744695336 +0000 UTC m=+567.388004500" watchObservedRunningTime="2026-04-20 07:12:04.746318442 +0000 UTC m=+567.389627594" Apr 20 07:12:10.791014 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:10.790975 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-5mh2t"] Apr 20 07:12:10.794380 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:10.794362 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-5mh2t" Apr 20 07:12:10.797726 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:10.797704 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 07:12:10.797841 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:10.797710 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-9mw64\"" Apr 20 07:12:10.798177 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:10.798158 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 07:12:10.802007 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:10.801985 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-5mh2t"] Apr 20 07:12:10.901762 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:10.901725 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/efaa92fe-1947-4881-914a-fe60fcf36167-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-5mh2t\" (UID: \"efaa92fe-1947-4881-914a-fe60fcf36167\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-5mh2t" Apr 20 07:12:10.901762 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:10.901767 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bpqb\" (UniqueName: \"kubernetes.io/projected/efaa92fe-1947-4881-914a-fe60fcf36167-kube-api-access-7bpqb\") pod \"cert-manager-cainjector-8966b78d4-5mh2t\" (UID: \"efaa92fe-1947-4881-914a-fe60fcf36167\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-5mh2t" Apr 20 07:12:11.002764 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:11.002721 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/efaa92fe-1947-4881-914a-fe60fcf36167-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-5mh2t\" (UID: \"efaa92fe-1947-4881-914a-fe60fcf36167\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-5mh2t" Apr 20 07:12:11.002764 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:11.002766 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bpqb\" (UniqueName: \"kubernetes.io/projected/efaa92fe-1947-4881-914a-fe60fcf36167-kube-api-access-7bpqb\") pod \"cert-manager-cainjector-8966b78d4-5mh2t\" (UID: \"efaa92fe-1947-4881-914a-fe60fcf36167\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-5mh2t" Apr 20 07:12:11.013456 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:11.013429 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/efaa92fe-1947-4881-914a-fe60fcf36167-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-5mh2t\" (UID: \"efaa92fe-1947-4881-914a-fe60fcf36167\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-5mh2t" Apr 20 07:12:11.013618 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:11.013597 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bpqb\" (UniqueName: \"kubernetes.io/projected/efaa92fe-1947-4881-914a-fe60fcf36167-kube-api-access-7bpqb\") pod \"cert-manager-cainjector-8966b78d4-5mh2t\" (UID: \"efaa92fe-1947-4881-914a-fe60fcf36167\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-5mh2t" Apr 20 07:12:11.103709 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:11.103625 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-5mh2t" Apr 20 07:12:11.251135 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:11.251101 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-5mh2t"] Apr 20 07:12:11.254075 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:12:11.254030 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefaa92fe_1947_4881_914a_fe60fcf36167.slice/crio-c979c174606ea6f3e0cc2dcc4cbc1408e30dfeb85b5a0fa5919d9b50a25abfd6 WatchSource:0}: Error finding container c979c174606ea6f3e0cc2dcc4cbc1408e30dfeb85b5a0fa5919d9b50a25abfd6: Status 404 returned error can't find the container with id c979c174606ea6f3e0cc2dcc4cbc1408e30dfeb85b5a0fa5919d9b50a25abfd6 Apr 20 07:12:11.746707 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:11.746668 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-5mh2t" event={"ID":"efaa92fe-1947-4881-914a-fe60fcf36167","Type":"ContainerStarted","Data":"c979c174606ea6f3e0cc2dcc4cbc1408e30dfeb85b5a0fa5919d9b50a25abfd6"} Apr 20 07:12:11.901833 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:11.901793 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q"] Apr 20 07:12:11.905396 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:11.905372 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" Apr 20 07:12:11.908233 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:11.908202 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 07:12:11.909260 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:11.909201 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 07:12:11.909260 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:11.909222 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lzfnq\"" Apr 20 07:12:11.913830 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:11.913804 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q"] Apr 20 07:12:12.010725 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:12.010688 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d329b384-f9ae-40c5-a7a6-0cbd176e613d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q\" (UID: \"d329b384-f9ae-40c5-a7a6-0cbd176e613d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" Apr 20 07:12:12.010725 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:12.010726 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdc5h\" (UniqueName: \"kubernetes.io/projected/d329b384-f9ae-40c5-a7a6-0cbd176e613d-kube-api-access-pdc5h\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q\" (UID: \"d329b384-f9ae-40c5-a7a6-0cbd176e613d\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" Apr 20 07:12:12.010954 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:12.010882 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d329b384-f9ae-40c5-a7a6-0cbd176e613d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q\" (UID: \"d329b384-f9ae-40c5-a7a6-0cbd176e613d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" Apr 20 07:12:12.111461 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:12.111414 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d329b384-f9ae-40c5-a7a6-0cbd176e613d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q\" (UID: \"d329b384-f9ae-40c5-a7a6-0cbd176e613d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" Apr 20 07:12:12.111461 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:12.111467 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d329b384-f9ae-40c5-a7a6-0cbd176e613d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q\" (UID: \"d329b384-f9ae-40c5-a7a6-0cbd176e613d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" Apr 20 07:12:12.111461 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:12.111494 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdc5h\" (UniqueName: \"kubernetes.io/projected/d329b384-f9ae-40c5-a7a6-0cbd176e613d-kube-api-access-pdc5h\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q\" (UID: \"d329b384-f9ae-40c5-a7a6-0cbd176e613d\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" Apr 20 07:12:12.111895 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:12.111874 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d329b384-f9ae-40c5-a7a6-0cbd176e613d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q\" (UID: \"d329b384-f9ae-40c5-a7a6-0cbd176e613d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" Apr 20 07:12:12.111968 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:12.111909 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d329b384-f9ae-40c5-a7a6-0cbd176e613d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q\" (UID: \"d329b384-f9ae-40c5-a7a6-0cbd176e613d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" Apr 20 07:12:12.119924 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:12.119898 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdc5h\" (UniqueName: \"kubernetes.io/projected/d329b384-f9ae-40c5-a7a6-0cbd176e613d-kube-api-access-pdc5h\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q\" (UID: \"d329b384-f9ae-40c5-a7a6-0cbd176e613d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" Apr 20 07:12:12.217864 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:12.217830 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" Apr 20 07:12:12.370163 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:12.370130 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q"] Apr 20 07:12:12.372809 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:12:12.372777 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd329b384_f9ae_40c5_a7a6_0cbd176e613d.slice/crio-f038597dabda7a5628ae3a56b0dfb39e3ce2170566c6a313c46c42f573c45018 WatchSource:0}: Error finding container f038597dabda7a5628ae3a56b0dfb39e3ce2170566c6a313c46c42f573c45018: Status 404 returned error can't find the container with id f038597dabda7a5628ae3a56b0dfb39e3ce2170566c6a313c46c42f573c45018 Apr 20 07:12:12.751739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:12.751701 2566 generic.go:358] "Generic (PLEG): container finished" podID="d329b384-f9ae-40c5-a7a6-0cbd176e613d" containerID="6348de2fdc9a38d3b18afafeb290e8fef15ae88acc91082e3f9bf5a77ec4c6fa" exitCode=0 Apr 20 07:12:12.751931 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:12.751743 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" event={"ID":"d329b384-f9ae-40c5-a7a6-0cbd176e613d","Type":"ContainerDied","Data":"6348de2fdc9a38d3b18afafeb290e8fef15ae88acc91082e3f9bf5a77ec4c6fa"} Apr 20 07:12:12.751931 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:12.751783 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" event={"ID":"d329b384-f9ae-40c5-a7a6-0cbd176e613d","Type":"ContainerStarted","Data":"f038597dabda7a5628ae3a56b0dfb39e3ce2170566c6a313c46c42f573c45018"} Apr 20 07:12:14.760652 ip-10-0-142-100 kubenswrapper[2566]: 
I0420 07:12:14.760613 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-5mh2t" event={"ID":"efaa92fe-1947-4881-914a-fe60fcf36167","Type":"ContainerStarted","Data":"8e5f0dde5e2ca47b3f537f15e7c6dc84439a05a6224947a064974b1e2b98c32f"} Apr 20 07:12:14.778838 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:14.778786 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-5mh2t" podStartSLOduration=2.1949503 podStartE2EDuration="4.77877246s" podCreationTimestamp="2026-04-20 07:12:10 +0000 UTC" firstStartedPulling="2026-04-20 07:12:11.25599229 +0000 UTC m=+573.899301417" lastFinishedPulling="2026-04-20 07:12:13.839814449 +0000 UTC m=+576.483123577" observedRunningTime="2026-04-20 07:12:14.776380519 +0000 UTC m=+577.419689669" watchObservedRunningTime="2026-04-20 07:12:14.77877246 +0000 UTC m=+577.422081611" Apr 20 07:12:15.764913 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:15.764869 2566 generic.go:358] "Generic (PLEG): container finished" podID="d329b384-f9ae-40c5-a7a6-0cbd176e613d" containerID="f3e3cab161e299111adcfb0a3aa027630d38ce076820947e843d9db4add8fd82" exitCode=0 Apr 20 07:12:15.765281 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:15.764956 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" event={"ID":"d329b384-f9ae-40c5-a7a6-0cbd176e613d","Type":"ContainerDied","Data":"f3e3cab161e299111adcfb0a3aa027630d38ce076820947e843d9db4add8fd82"} Apr 20 07:12:16.769871 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:16.769836 2566 generic.go:358] "Generic (PLEG): container finished" podID="d329b384-f9ae-40c5-a7a6-0cbd176e613d" containerID="c646c69db216e7d902166cd4708cbf4812f2ee8a84d9238bc462bbbe809ef9f4" exitCode=0 Apr 20 07:12:16.770248 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:16.769879 2566 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" event={"ID":"d329b384-f9ae-40c5-a7a6-0cbd176e613d","Type":"ContainerDied","Data":"c646c69db216e7d902166cd4708cbf4812f2ee8a84d9238bc462bbbe809ef9f4"} Apr 20 07:12:17.898988 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:17.898965 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" Apr 20 07:12:18.058809 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:18.058724 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d329b384-f9ae-40c5-a7a6-0cbd176e613d-bundle\") pod \"d329b384-f9ae-40c5-a7a6-0cbd176e613d\" (UID: \"d329b384-f9ae-40c5-a7a6-0cbd176e613d\") " Apr 20 07:12:18.058809 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:18.058763 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdc5h\" (UniqueName: \"kubernetes.io/projected/d329b384-f9ae-40c5-a7a6-0cbd176e613d-kube-api-access-pdc5h\") pod \"d329b384-f9ae-40c5-a7a6-0cbd176e613d\" (UID: \"d329b384-f9ae-40c5-a7a6-0cbd176e613d\") " Apr 20 07:12:18.058809 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:18.058787 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d329b384-f9ae-40c5-a7a6-0cbd176e613d-util\") pod \"d329b384-f9ae-40c5-a7a6-0cbd176e613d\" (UID: \"d329b384-f9ae-40c5-a7a6-0cbd176e613d\") " Apr 20 07:12:18.059166 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:18.059143 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d329b384-f9ae-40c5-a7a6-0cbd176e613d-bundle" (OuterVolumeSpecName: "bundle") pod "d329b384-f9ae-40c5-a7a6-0cbd176e613d" (UID: "d329b384-f9ae-40c5-a7a6-0cbd176e613d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:12:18.061035 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:18.061010 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d329b384-f9ae-40c5-a7a6-0cbd176e613d-kube-api-access-pdc5h" (OuterVolumeSpecName: "kube-api-access-pdc5h") pod "d329b384-f9ae-40c5-a7a6-0cbd176e613d" (UID: "d329b384-f9ae-40c5-a7a6-0cbd176e613d"). InnerVolumeSpecName "kube-api-access-pdc5h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:12:18.063024 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:18.063003 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d329b384-f9ae-40c5-a7a6-0cbd176e613d-util" (OuterVolumeSpecName: "util") pod "d329b384-f9ae-40c5-a7a6-0cbd176e613d" (UID: "d329b384-f9ae-40c5-a7a6-0cbd176e613d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:12:18.159699 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:18.159660 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d329b384-f9ae-40c5-a7a6-0cbd176e613d-bundle\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:12:18.159699 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:18.159695 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pdc5h\" (UniqueName: \"kubernetes.io/projected/d329b384-f9ae-40c5-a7a6-0cbd176e613d-kube-api-access-pdc5h\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:12:18.159699 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:18.159707 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d329b384-f9ae-40c5-a7a6-0cbd176e613d-util\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:12:18.778432 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:18.778397 2566 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" event={"ID":"d329b384-f9ae-40c5-a7a6-0cbd176e613d","Type":"ContainerDied","Data":"f038597dabda7a5628ae3a56b0dfb39e3ce2170566c6a313c46c42f573c45018"} Apr 20 07:12:18.778432 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:18.778437 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f038597dabda7a5628ae3a56b0dfb39e3ce2170566c6a313c46c42f573c45018" Apr 20 07:12:18.778672 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:18.778449 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg5s8q" Apr 20 07:12:29.942261 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:29.942218 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d"] Apr 20 07:12:29.942835 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:29.942630 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d329b384-f9ae-40c5-a7a6-0cbd176e613d" containerName="util" Apr 20 07:12:29.942835 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:29.942648 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="d329b384-f9ae-40c5-a7a6-0cbd176e613d" containerName="util" Apr 20 07:12:29.942835 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:29.942670 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d329b384-f9ae-40c5-a7a6-0cbd176e613d" containerName="extract" Apr 20 07:12:29.942835 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:29.942678 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="d329b384-f9ae-40c5-a7a6-0cbd176e613d" containerName="extract" Apr 20 07:12:29.942835 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:29.942692 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="d329b384-f9ae-40c5-a7a6-0cbd176e613d" containerName="pull" Apr 20 07:12:29.942835 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:29.942701 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="d329b384-f9ae-40c5-a7a6-0cbd176e613d" containerName="pull" Apr 20 07:12:29.942835 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:29.942786 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="d329b384-f9ae-40c5-a7a6-0cbd176e613d" containerName="extract" Apr 20 07:12:29.946134 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:29.946107 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" Apr 20 07:12:29.948625 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:29.948600 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 07:12:29.949748 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:29.949727 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 07:12:29.949831 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:29.949758 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lzfnq\"" Apr 20 07:12:29.954012 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:29.953974 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d"] Apr 20 07:12:30.044367 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:30.044335 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c15aa973-2118-47b4-b006-4f1f0d4a1121-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d\" (UID: 
\"c15aa973-2118-47b4-b006-4f1f0d4a1121\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" Apr 20 07:12:30.044550 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:30.044419 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c15aa973-2118-47b4-b006-4f1f0d4a1121-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d\" (UID: \"c15aa973-2118-47b4-b006-4f1f0d4a1121\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" Apr 20 07:12:30.044550 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:30.044480 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k5lg\" (UniqueName: \"kubernetes.io/projected/c15aa973-2118-47b4-b006-4f1f0d4a1121-kube-api-access-8k5lg\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d\" (UID: \"c15aa973-2118-47b4-b006-4f1f0d4a1121\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" Apr 20 07:12:30.145787 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:30.145747 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8k5lg\" (UniqueName: \"kubernetes.io/projected/c15aa973-2118-47b4-b006-4f1f0d4a1121-kube-api-access-8k5lg\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d\" (UID: \"c15aa973-2118-47b4-b006-4f1f0d4a1121\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" Apr 20 07:12:30.145976 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:30.145804 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c15aa973-2118-47b4-b006-4f1f0d4a1121-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d\" (UID: 
\"c15aa973-2118-47b4-b006-4f1f0d4a1121\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" Apr 20 07:12:30.145976 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:30.145856 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c15aa973-2118-47b4-b006-4f1f0d4a1121-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d\" (UID: \"c15aa973-2118-47b4-b006-4f1f0d4a1121\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" Apr 20 07:12:30.146235 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:30.146213 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c15aa973-2118-47b4-b006-4f1f0d4a1121-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d\" (UID: \"c15aa973-2118-47b4-b006-4f1f0d4a1121\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" Apr 20 07:12:30.146309 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:30.146290 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c15aa973-2118-47b4-b006-4f1f0d4a1121-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d\" (UID: \"c15aa973-2118-47b4-b006-4f1f0d4a1121\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" Apr 20 07:12:30.156032 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:30.156004 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k5lg\" (UniqueName: \"kubernetes.io/projected/c15aa973-2118-47b4-b006-4f1f0d4a1121-kube-api-access-8k5lg\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d\" (UID: \"c15aa973-2118-47b4-b006-4f1f0d4a1121\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" Apr 20 07:12:30.256452 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:30.256421 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" Apr 20 07:12:30.390561 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:30.390531 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d"] Apr 20 07:12:30.392775 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:12:30.392748 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc15aa973_2118_47b4_b006_4f1f0d4a1121.slice/crio-b302770b7b3d863f34780943b077f1ccc76b01a7509be4fa642d1c5e38d83384 WatchSource:0}: Error finding container b302770b7b3d863f34780943b077f1ccc76b01a7509be4fa642d1c5e38d83384: Status 404 returned error can't find the container with id b302770b7b3d863f34780943b077f1ccc76b01a7509be4fa642d1c5e38d83384 Apr 20 07:12:30.818540 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:30.818497 2566 generic.go:358] "Generic (PLEG): container finished" podID="c15aa973-2118-47b4-b006-4f1f0d4a1121" containerID="2dbff54c1e16e2291367cb12669e323c4ad9f1fea3bc3a328f86f7e1e053670c" exitCode=0 Apr 20 07:12:30.818709 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:30.818579 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" event={"ID":"c15aa973-2118-47b4-b006-4f1f0d4a1121","Type":"ContainerDied","Data":"2dbff54c1e16e2291367cb12669e323c4ad9f1fea3bc3a328f86f7e1e053670c"} Apr 20 07:12:30.818709 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:30.818620 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" event={"ID":"c15aa973-2118-47b4-b006-4f1f0d4a1121","Type":"ContainerStarted","Data":"b302770b7b3d863f34780943b077f1ccc76b01a7509be4fa642d1c5e38d83384"} Apr 20 07:12:31.823050 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:31.823015 2566 generic.go:358] "Generic (PLEG): container finished" podID="c15aa973-2118-47b4-b006-4f1f0d4a1121" containerID="a95ca2262ae088338ac376a6337ae33bc015a7591c5d2e1d1dd86899d0dc6d17" exitCode=0 Apr 20 07:12:31.823451 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:31.823101 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" event={"ID":"c15aa973-2118-47b4-b006-4f1f0d4a1121","Type":"ContainerDied","Data":"a95ca2262ae088338ac376a6337ae33bc015a7591c5d2e1d1dd86899d0dc6d17"} Apr 20 07:12:32.828634 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:32.828599 2566 generic.go:358] "Generic (PLEG): container finished" podID="c15aa973-2118-47b4-b006-4f1f0d4a1121" containerID="375cf2b313de7e26f8565e5c7e04eadadc4cb8a7472bdf5a345154c579d2d836" exitCode=0 Apr 20 07:12:32.829012 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:32.828688 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" event={"ID":"c15aa973-2118-47b4-b006-4f1f0d4a1121","Type":"ContainerDied","Data":"375cf2b313de7e26f8565e5c7e04eadadc4cb8a7472bdf5a345154c579d2d836"} Apr 20 07:12:33.958228 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:33.958204 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" Apr 20 07:12:33.977027 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:33.976993 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c15aa973-2118-47b4-b006-4f1f0d4a1121-bundle\") pod \"c15aa973-2118-47b4-b006-4f1f0d4a1121\" (UID: \"c15aa973-2118-47b4-b006-4f1f0d4a1121\") " Apr 20 07:12:33.977218 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:33.977041 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c15aa973-2118-47b4-b006-4f1f0d4a1121-util\") pod \"c15aa973-2118-47b4-b006-4f1f0d4a1121\" (UID: \"c15aa973-2118-47b4-b006-4f1f0d4a1121\") " Apr 20 07:12:33.977218 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:33.977095 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k5lg\" (UniqueName: \"kubernetes.io/projected/c15aa973-2118-47b4-b006-4f1f0d4a1121-kube-api-access-8k5lg\") pod \"c15aa973-2118-47b4-b006-4f1f0d4a1121\" (UID: \"c15aa973-2118-47b4-b006-4f1f0d4a1121\") " Apr 20 07:12:33.977938 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:33.977904 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c15aa973-2118-47b4-b006-4f1f0d4a1121-bundle" (OuterVolumeSpecName: "bundle") pod "c15aa973-2118-47b4-b006-4f1f0d4a1121" (UID: "c15aa973-2118-47b4-b006-4f1f0d4a1121"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:12:33.979498 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:33.979469 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c15aa973-2118-47b4-b006-4f1f0d4a1121-kube-api-access-8k5lg" (OuterVolumeSpecName: "kube-api-access-8k5lg") pod "c15aa973-2118-47b4-b006-4f1f0d4a1121" (UID: "c15aa973-2118-47b4-b006-4f1f0d4a1121"). InnerVolumeSpecName "kube-api-access-8k5lg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:12:33.983450 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:33.983416 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c15aa973-2118-47b4-b006-4f1f0d4a1121-util" (OuterVolumeSpecName: "util") pod "c15aa973-2118-47b4-b006-4f1f0d4a1121" (UID: "c15aa973-2118-47b4-b006-4f1f0d4a1121"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:12:34.077986 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:34.077942 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c15aa973-2118-47b4-b006-4f1f0d4a1121-util\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:12:34.077986 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:34.077977 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8k5lg\" (UniqueName: \"kubernetes.io/projected/c15aa973-2118-47b4-b006-4f1f0d4a1121-kube-api-access-8k5lg\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:12:34.077986 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:34.077989 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c15aa973-2118-47b4-b006-4f1f0d4a1121-bundle\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:12:34.838359 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:34.838331 2566 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" Apr 20 07:12:34.838529 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:34.838329 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dxt7d" event={"ID":"c15aa973-2118-47b4-b006-4f1f0d4a1121","Type":"ContainerDied","Data":"b302770b7b3d863f34780943b077f1ccc76b01a7509be4fa642d1c5e38d83384"} Apr 20 07:12:34.838529 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:34.838437 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b302770b7b3d863f34780943b077f1ccc76b01a7509be4fa642d1c5e38d83384" Apr 20 07:12:37.903027 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:37.903000 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log" Apr 20 07:12:37.903484 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:37.903374 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log" Apr 20 07:12:37.906630 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:37.906608 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log" Apr 20 07:12:37.906950 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:37.906933 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log" Apr 20 07:12:42.401818 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.401766 2566 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f"] Apr 20 07:12:42.402242 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.402076 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c15aa973-2118-47b4-b006-4f1f0d4a1121" containerName="extract" Apr 20 07:12:42.402242 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.402090 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="c15aa973-2118-47b4-b006-4f1f0d4a1121" containerName="extract" Apr 20 07:12:42.402242 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.402102 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c15aa973-2118-47b4-b006-4f1f0d4a1121" containerName="util" Apr 20 07:12:42.402242 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.402110 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="c15aa973-2118-47b4-b006-4f1f0d4a1121" containerName="util" Apr 20 07:12:42.402242 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.402125 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c15aa973-2118-47b4-b006-4f1f0d4a1121" containerName="pull" Apr 20 07:12:42.402242 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.402133 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="c15aa973-2118-47b4-b006-4f1f0d4a1121" containerName="pull" Apr 20 07:12:42.402242 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.402187 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="c15aa973-2118-47b4-b006-4f1f0d4a1121" containerName="extract" Apr 20 07:12:42.404676 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.404654 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f" Apr 20 07:12:42.407587 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.407569 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 07:12:42.407709 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.407626 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 07:12:42.407947 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.407931 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lzfnq\"" Apr 20 07:12:42.417249 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.417177 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f"] Apr 20 07:12:42.445184 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.445152 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqp5z\" (UniqueName: \"kubernetes.io/projected/6369b13c-a138-4ac6-9d0d-d934eecd618d-kube-api-access-xqp5z\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f\" (UID: \"6369b13c-a138-4ac6-9d0d-d934eecd618d\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f" Apr 20 07:12:42.445306 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.445190 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6369b13c-a138-4ac6-9d0d-d934eecd618d-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f\" (UID: \"6369b13c-a138-4ac6-9d0d-d934eecd618d\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f" Apr 20 07:12:42.445306 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.445258 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6369b13c-a138-4ac6-9d0d-d934eecd618d-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f\" (UID: \"6369b13c-a138-4ac6-9d0d-d934eecd618d\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f" Apr 20 07:12:42.546504 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.546449 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqp5z\" (UniqueName: \"kubernetes.io/projected/6369b13c-a138-4ac6-9d0d-d934eecd618d-kube-api-access-xqp5z\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f\" (UID: \"6369b13c-a138-4ac6-9d0d-d934eecd618d\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f" Apr 20 07:12:42.546504 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.546509 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6369b13c-a138-4ac6-9d0d-d934eecd618d-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f\" (UID: \"6369b13c-a138-4ac6-9d0d-d934eecd618d\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f" Apr 20 07:12:42.546718 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.546536 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6369b13c-a138-4ac6-9d0d-d934eecd618d-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f\" (UID: \"6369b13c-a138-4ac6-9d0d-d934eecd618d\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f" Apr 20 07:12:42.546845 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.546831 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6369b13c-a138-4ac6-9d0d-d934eecd618d-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f\" (UID: \"6369b13c-a138-4ac6-9d0d-d934eecd618d\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f" Apr 20 07:12:42.546916 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.546898 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6369b13c-a138-4ac6-9d0d-d934eecd618d-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f\" (UID: \"6369b13c-a138-4ac6-9d0d-d934eecd618d\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f" Apr 20 07:12:42.555797 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.555772 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqp5z\" (UniqueName: \"kubernetes.io/projected/6369b13c-a138-4ac6-9d0d-d934eecd618d-kube-api-access-xqp5z\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f\" (UID: \"6369b13c-a138-4ac6-9d0d-d934eecd618d\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f" Apr 20 07:12:42.714345 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.714257 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f" Apr 20 07:12:42.849710 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.849679 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f"] Apr 20 07:12:42.852200 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:12:42.852169 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6369b13c_a138_4ac6_9d0d_d934eecd618d.slice/crio-165e514952903e19528e689c354e43bc265adb8ef92f6783a37c3600a20c0b24 WatchSource:0}: Error finding container 165e514952903e19528e689c354e43bc265adb8ef92f6783a37c3600a20c0b24: Status 404 returned error can't find the container with id 165e514952903e19528e689c354e43bc265adb8ef92f6783a37c3600a20c0b24 Apr 20 07:12:42.873130 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:42.873090 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f" event={"ID":"6369b13c-a138-4ac6-9d0d-d934eecd618d","Type":"ContainerStarted","Data":"165e514952903e19528e689c354e43bc265adb8ef92f6783a37c3600a20c0b24"} Apr 20 07:12:43.879421 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:43.879321 2566 generic.go:358] "Generic (PLEG): container finished" podID="6369b13c-a138-4ac6-9d0d-d934eecd618d" containerID="7e080dcb62f0ea4130b991ebd5b0e8059117cae7548c6134e101477549e1ef29" exitCode=0 Apr 20 07:12:43.879421 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:43.879391 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f" event={"ID":"6369b13c-a138-4ac6-9d0d-d934eecd618d","Type":"ContainerDied","Data":"7e080dcb62f0ea4130b991ebd5b0e8059117cae7548c6134e101477549e1ef29"} Apr 20 07:12:43.950882 ip-10-0-142-100 kubenswrapper[2566]: 
I0420 07:12:43.950846 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb"] Apr 20 07:12:43.952950 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:43.952934 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb" Apr 20 07:12:43.957369 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:43.957340 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 07:12:43.957369 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:43.957351 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-tcmrx\"" Apr 20 07:12:43.957662 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:43.957458 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 07:12:43.957742 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:43.957727 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 07:12:43.957953 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:43.957936 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 07:12:43.970811 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:43.970785 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb"] Apr 20 07:12:44.060828 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:44.060793 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx44z\" (UniqueName: 
\"kubernetes.io/projected/da92fd91-eb57-4de3-811b-ae2d4ce3c365-kube-api-access-tx44z\") pod \"opendatahub-operator-controller-manager-6d65d76454-wtqxb\" (UID: \"da92fd91-eb57-4de3-811b-ae2d4ce3c365\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb" Apr 20 07:12:44.061000 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:44.060837 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/da92fd91-eb57-4de3-811b-ae2d4ce3c365-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6d65d76454-wtqxb\" (UID: \"da92fd91-eb57-4de3-811b-ae2d4ce3c365\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb" Apr 20 07:12:44.061000 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:44.060919 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/da92fd91-eb57-4de3-811b-ae2d4ce3c365-webhook-cert\") pod \"opendatahub-operator-controller-manager-6d65d76454-wtqxb\" (UID: \"da92fd91-eb57-4de3-811b-ae2d4ce3c365\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb" Apr 20 07:12:44.162262 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:44.162168 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx44z\" (UniqueName: \"kubernetes.io/projected/da92fd91-eb57-4de3-811b-ae2d4ce3c365-kube-api-access-tx44z\") pod \"opendatahub-operator-controller-manager-6d65d76454-wtqxb\" (UID: \"da92fd91-eb57-4de3-811b-ae2d4ce3c365\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb" Apr 20 07:12:44.162262 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:44.162214 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/da92fd91-eb57-4de3-811b-ae2d4ce3c365-apiservice-cert\") pod 
\"opendatahub-operator-controller-manager-6d65d76454-wtqxb\" (UID: \"da92fd91-eb57-4de3-811b-ae2d4ce3c365\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb"
Apr 20 07:12:44.162262 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:44.162248 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/da92fd91-eb57-4de3-811b-ae2d4ce3c365-webhook-cert\") pod \"opendatahub-operator-controller-manager-6d65d76454-wtqxb\" (UID: \"da92fd91-eb57-4de3-811b-ae2d4ce3c365\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb"
Apr 20 07:12:44.164901 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:44.164862 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/da92fd91-eb57-4de3-811b-ae2d4ce3c365-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6d65d76454-wtqxb\" (UID: \"da92fd91-eb57-4de3-811b-ae2d4ce3c365\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb"
Apr 20 07:12:44.165017 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:44.164912 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/da92fd91-eb57-4de3-811b-ae2d4ce3c365-webhook-cert\") pod \"opendatahub-operator-controller-manager-6d65d76454-wtqxb\" (UID: \"da92fd91-eb57-4de3-811b-ae2d4ce3c365\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb"
Apr 20 07:12:44.181237 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:44.181208 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx44z\" (UniqueName: \"kubernetes.io/projected/da92fd91-eb57-4de3-811b-ae2d4ce3c365-kube-api-access-tx44z\") pod \"opendatahub-operator-controller-manager-6d65d76454-wtqxb\" (UID: \"da92fd91-eb57-4de3-811b-ae2d4ce3c365\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb"
Apr 20 07:12:44.262673 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:44.262641 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb"
Apr 20 07:12:44.396050 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:44.396021 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb"]
Apr 20 07:12:44.398159 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:12:44.398126 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda92fd91_eb57_4de3_811b_ae2d4ce3c365.slice/crio-5cf0be4041ce62d74126e22fb47e58b4d684470fb7cef6d6242ab43832e14dd8 WatchSource:0}: Error finding container 5cf0be4041ce62d74126e22fb47e58b4d684470fb7cef6d6242ab43832e14dd8: Status 404 returned error can't find the container with id 5cf0be4041ce62d74126e22fb47e58b4d684470fb7cef6d6242ab43832e14dd8
Apr 20 07:12:44.885118 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:44.885073 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb" event={"ID":"da92fd91-eb57-4de3-811b-ae2d4ce3c365","Type":"ContainerStarted","Data":"5cf0be4041ce62d74126e22fb47e58b4d684470fb7cef6d6242ab43832e14dd8"}
Apr 20 07:12:44.887037 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:44.886990 2566 generic.go:358] "Generic (PLEG): container finished" podID="6369b13c-a138-4ac6-9d0d-d934eecd618d" containerID="ceaa3b4cd2463c8fcd3e6062db3ea7d8bec04c3ca042365c31218fc651fff3bc" exitCode=0
Apr 20 07:12:44.887167 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:44.887029 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f" event={"ID":"6369b13c-a138-4ac6-9d0d-d934eecd618d","Type":"ContainerDied","Data":"ceaa3b4cd2463c8fcd3e6062db3ea7d8bec04c3ca042365c31218fc651fff3bc"}
Apr 20 07:12:45.892914 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:45.892852 2566 generic.go:358] "Generic (PLEG): container finished" podID="6369b13c-a138-4ac6-9d0d-d934eecd618d" containerID="5105816f8443bcc03d0c59c6969b8771fa31d75b849f44f0ab595f6dd6903d92" exitCode=0
Apr 20 07:12:45.893370 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:45.892948 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f" event={"ID":"6369b13c-a138-4ac6-9d0d-d934eecd618d","Type":"ContainerDied","Data":"5105816f8443bcc03d0c59c6969b8771fa31d75b849f44f0ab595f6dd6903d92"}
Apr 20 07:12:47.038939 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:47.038917 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f"
Apr 20 07:12:47.087188 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:47.087157 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqp5z\" (UniqueName: \"kubernetes.io/projected/6369b13c-a138-4ac6-9d0d-d934eecd618d-kube-api-access-xqp5z\") pod \"6369b13c-a138-4ac6-9d0d-d934eecd618d\" (UID: \"6369b13c-a138-4ac6-9d0d-d934eecd618d\") "
Apr 20 07:12:47.087381 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:47.087215 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6369b13c-a138-4ac6-9d0d-d934eecd618d-bundle\") pod \"6369b13c-a138-4ac6-9d0d-d934eecd618d\" (UID: \"6369b13c-a138-4ac6-9d0d-d934eecd618d\") "
Apr 20 07:12:47.087381 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:47.087244 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6369b13c-a138-4ac6-9d0d-d934eecd618d-util\") pod \"6369b13c-a138-4ac6-9d0d-d934eecd618d\" (UID: \"6369b13c-a138-4ac6-9d0d-d934eecd618d\") "
Apr 20 07:12:47.087993 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:47.087960 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6369b13c-a138-4ac6-9d0d-d934eecd618d-bundle" (OuterVolumeSpecName: "bundle") pod "6369b13c-a138-4ac6-9d0d-d934eecd618d" (UID: "6369b13c-a138-4ac6-9d0d-d934eecd618d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:12:47.089589 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:47.089565 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6369b13c-a138-4ac6-9d0d-d934eecd618d-kube-api-access-xqp5z" (OuterVolumeSpecName: "kube-api-access-xqp5z") pod "6369b13c-a138-4ac6-9d0d-d934eecd618d" (UID: "6369b13c-a138-4ac6-9d0d-d934eecd618d"). InnerVolumeSpecName "kube-api-access-xqp5z". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:12:47.092640 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:47.092615 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6369b13c-a138-4ac6-9d0d-d934eecd618d-util" (OuterVolumeSpecName: "util") pod "6369b13c-a138-4ac6-9d0d-d934eecd618d" (UID: "6369b13c-a138-4ac6-9d0d-d934eecd618d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:12:47.188603 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:47.188581 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6369b13c-a138-4ac6-9d0d-d934eecd618d-bundle\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:12:47.188686 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:47.188605 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6369b13c-a138-4ac6-9d0d-d934eecd618d-util\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:12:47.188686 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:47.188615 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xqp5z\" (UniqueName: \"kubernetes.io/projected/6369b13c-a138-4ac6-9d0d-d934eecd618d-kube-api-access-xqp5z\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:12:47.901630 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:47.901597 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f"
Apr 20 07:12:47.901792 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:47.901598 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jc29f" event={"ID":"6369b13c-a138-4ac6-9d0d-d934eecd618d","Type":"ContainerDied","Data":"165e514952903e19528e689c354e43bc265adb8ef92f6783a37c3600a20c0b24"}
Apr 20 07:12:47.901792 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:47.901715 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="165e514952903e19528e689c354e43bc265adb8ef92f6783a37c3600a20c0b24"
Apr 20 07:12:47.905996 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:47.905968 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb" event={"ID":"da92fd91-eb57-4de3-811b-ae2d4ce3c365","Type":"ContainerStarted","Data":"d15ae1806b18a1807b20f7de721d0f884240ffd65bf5eb7c80a84a937c7050f7"}
Apr 20 07:12:47.906168 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:47.906084 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb"
Apr 20 07:12:47.928759 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:47.928710 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb" podStartSLOduration=2.200886364 podStartE2EDuration="4.928694728s" podCreationTimestamp="2026-04-20 07:12:43 +0000 UTC" firstStartedPulling="2026-04-20 07:12:44.399877754 +0000 UTC m=+607.043186882" lastFinishedPulling="2026-04-20 07:12:47.127686117 +0000 UTC m=+609.770995246" observedRunningTime="2026-04-20 07:12:47.92638291 +0000 UTC m=+610.569692075" watchObservedRunningTime="2026-04-20 07:12:47.928694728 +0000 UTC m=+610.572003877"
Apr 20 07:12:58.911732 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:12:58.911652 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-wtqxb"
Apr 20 07:13:02.262701 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.262661 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8"]
Apr 20 07:13:02.263170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.263007 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6369b13c-a138-4ac6-9d0d-d934eecd618d" containerName="pull"
Apr 20 07:13:02.263170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.263020 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6369b13c-a138-4ac6-9d0d-d934eecd618d" containerName="pull"
Apr 20 07:13:02.263170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.263033 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6369b13c-a138-4ac6-9d0d-d934eecd618d" containerName="extract"
Apr 20 07:13:02.263170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.263039 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6369b13c-a138-4ac6-9d0d-d934eecd618d" containerName="extract"
Apr 20 07:13:02.263170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.263072 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6369b13c-a138-4ac6-9d0d-d934eecd618d" containerName="util"
Apr 20 07:13:02.263170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.263081 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6369b13c-a138-4ac6-9d0d-d934eecd618d" containerName="util"
Apr 20 07:13:02.263170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.263135 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6369b13c-a138-4ac6-9d0d-d934eecd618d" containerName="extract"
Apr 20 07:13:02.266287 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.266268 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8"
Apr 20 07:13:02.269952 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.269928 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lzfnq\""
Apr 20 07:13:02.271037 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.271015 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 07:13:02.271182 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.271022 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 07:13:02.277123 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.277095 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8"]
Apr 20 07:13:02.417859 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.417816 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c929536-cf7a-477d-b280-ac0b8afe40fb-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8\" (UID: \"0c929536-cf7a-477d-b280-ac0b8afe40fb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8"
Apr 20 07:13:02.418039 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.417876 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c929536-cf7a-477d-b280-ac0b8afe40fb-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8\" (UID: \"0c929536-cf7a-477d-b280-ac0b8afe40fb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8"
Apr 20 07:13:02.418039 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.417953 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqhvj\" (UniqueName: \"kubernetes.io/projected/0c929536-cf7a-477d-b280-ac0b8afe40fb-kube-api-access-gqhvj\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8\" (UID: \"0c929536-cf7a-477d-b280-ac0b8afe40fb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8"
Apr 20 07:13:02.519212 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.519111 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqhvj\" (UniqueName: \"kubernetes.io/projected/0c929536-cf7a-477d-b280-ac0b8afe40fb-kube-api-access-gqhvj\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8\" (UID: \"0c929536-cf7a-477d-b280-ac0b8afe40fb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8"
Apr 20 07:13:02.519212 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.519187 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c929536-cf7a-477d-b280-ac0b8afe40fb-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8\" (UID: \"0c929536-cf7a-477d-b280-ac0b8afe40fb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8"
Apr 20 07:13:02.519410 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.519225 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c929536-cf7a-477d-b280-ac0b8afe40fb-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8\" (UID: \"0c929536-cf7a-477d-b280-ac0b8afe40fb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8"
Apr 20 07:13:02.519673 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.519650 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c929536-cf7a-477d-b280-ac0b8afe40fb-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8\" (UID: \"0c929536-cf7a-477d-b280-ac0b8afe40fb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8"
Apr 20 07:13:02.519714 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.519661 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c929536-cf7a-477d-b280-ac0b8afe40fb-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8\" (UID: \"0c929536-cf7a-477d-b280-ac0b8afe40fb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8"
Apr 20 07:13:02.539540 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.539505 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqhvj\" (UniqueName: \"kubernetes.io/projected/0c929536-cf7a-477d-b280-ac0b8afe40fb-kube-api-access-gqhvj\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8\" (UID: \"0c929536-cf7a-477d-b280-ac0b8afe40fb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8"
Apr 20 07:13:02.576157 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.576114 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8"
Apr 20 07:13:02.720471 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.720434 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8"]
Apr 20 07:13:02.721585 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:13:02.721559 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c929536_cf7a_477d_b280_ac0b8afe40fb.slice/crio-1a950180ab84689d7d8663e9763af62c56afe33f56324fe64c41a5582a0c5ad9 WatchSource:0}: Error finding container 1a950180ab84689d7d8663e9763af62c56afe33f56324fe64c41a5582a0c5ad9: Status 404 returned error can't find the container with id 1a950180ab84689d7d8663e9763af62c56afe33f56324fe64c41a5582a0c5ad9
Apr 20 07:13:02.958324 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.958290 2566 generic.go:358] "Generic (PLEG): container finished" podID="0c929536-cf7a-477d-b280-ac0b8afe40fb" containerID="53b3d73b43fb2a68f612d465ad53b2734d4944e376bc42b81c14b90f89d7bb20" exitCode=0
Apr 20 07:13:02.958511 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.958377 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8" event={"ID":"0c929536-cf7a-477d-b280-ac0b8afe40fb","Type":"ContainerDied","Data":"53b3d73b43fb2a68f612d465ad53b2734d4944e376bc42b81c14b90f89d7bb20"}
Apr 20 07:13:02.958511 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:02.958427 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8" event={"ID":"0c929536-cf7a-477d-b280-ac0b8afe40fb","Type":"ContainerStarted","Data":"1a950180ab84689d7d8663e9763af62c56afe33f56324fe64c41a5582a0c5ad9"}
Apr 20 07:13:03.963280 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:03.963181 2566 generic.go:358] "Generic (PLEG): container finished" podID="0c929536-cf7a-477d-b280-ac0b8afe40fb" containerID="eab175954a4c4150c1928186b1076400b7251c1330d1f4701d6abb705d12475a" exitCode=0
Apr 20 07:13:03.963280 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:03.963228 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8" event={"ID":"0c929536-cf7a-477d-b280-ac0b8afe40fb","Type":"ContainerDied","Data":"eab175954a4c4150c1928186b1076400b7251c1330d1f4701d6abb705d12475a"}
Apr 20 07:13:04.185942 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.185910 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-6bbb6d54d8-28vx7"]
Apr 20 07:13:04.189323 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.189301 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-28vx7"
Apr 20 07:13:04.195576 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.195553 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-fqh7x\""
Apr 20 07:13:04.195732 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.195686 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 20 07:13:04.195790 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.195766 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 20 07:13:04.212213 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.212176 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6bbb6d54d8-28vx7"]
Apr 20 07:13:04.335169 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.335134 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c7cb3dae-237e-4f51-9470-b5f9f01164df-tmp\") pod \"kube-auth-proxy-6bbb6d54d8-28vx7\" (UID: \"c7cb3dae-237e-4f51-9470-b5f9f01164df\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-28vx7"
Apr 20 07:13:04.335337 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.335198 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c7cb3dae-237e-4f51-9470-b5f9f01164df-tls-certs\") pod \"kube-auth-proxy-6bbb6d54d8-28vx7\" (UID: \"c7cb3dae-237e-4f51-9470-b5f9f01164df\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-28vx7"
Apr 20 07:13:04.335337 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.335231 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9zl5\" (UniqueName: \"kubernetes.io/projected/c7cb3dae-237e-4f51-9470-b5f9f01164df-kube-api-access-s9zl5\") pod \"kube-auth-proxy-6bbb6d54d8-28vx7\" (UID: \"c7cb3dae-237e-4f51-9470-b5f9f01164df\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-28vx7"
Apr 20 07:13:04.436565 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.436520 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c7cb3dae-237e-4f51-9470-b5f9f01164df-tmp\") pod \"kube-auth-proxy-6bbb6d54d8-28vx7\" (UID: \"c7cb3dae-237e-4f51-9470-b5f9f01164df\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-28vx7"
Apr 20 07:13:04.436754 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.436576 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c7cb3dae-237e-4f51-9470-b5f9f01164df-tls-certs\") pod \"kube-auth-proxy-6bbb6d54d8-28vx7\" (UID: \"c7cb3dae-237e-4f51-9470-b5f9f01164df\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-28vx7"
Apr 20 07:13:04.436754 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.436596 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9zl5\" (UniqueName: \"kubernetes.io/projected/c7cb3dae-237e-4f51-9470-b5f9f01164df-kube-api-access-s9zl5\") pod \"kube-auth-proxy-6bbb6d54d8-28vx7\" (UID: \"c7cb3dae-237e-4f51-9470-b5f9f01164df\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-28vx7"
Apr 20 07:13:04.438990 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.438966 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c7cb3dae-237e-4f51-9470-b5f9f01164df-tmp\") pod \"kube-auth-proxy-6bbb6d54d8-28vx7\" (UID: \"c7cb3dae-237e-4f51-9470-b5f9f01164df\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-28vx7"
Apr 20 07:13:04.439248 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.439229 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c7cb3dae-237e-4f51-9470-b5f9f01164df-tls-certs\") pod \"kube-auth-proxy-6bbb6d54d8-28vx7\" (UID: \"c7cb3dae-237e-4f51-9470-b5f9f01164df\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-28vx7"
Apr 20 07:13:04.448345 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.448321 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9zl5\" (UniqueName: \"kubernetes.io/projected/c7cb3dae-237e-4f51-9470-b5f9f01164df-kube-api-access-s9zl5\") pod \"kube-auth-proxy-6bbb6d54d8-28vx7\" (UID: \"c7cb3dae-237e-4f51-9470-b5f9f01164df\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-28vx7"
Apr 20 07:13:04.558849 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.558758 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-28vx7"
Apr 20 07:13:04.702887 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.702858 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6bbb6d54d8-28vx7"]
Apr 20 07:13:04.704602 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:13:04.704570 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7cb3dae_237e_4f51_9470_b5f9f01164df.slice/crio-02c3fd24fa97dced885ca999f388ae48a165d9b48a0c6215a7feacd51b4675e8 WatchSource:0}: Error finding container 02c3fd24fa97dced885ca999f388ae48a165d9b48a0c6215a7feacd51b4675e8: Status 404 returned error can't find the container with id 02c3fd24fa97dced885ca999f388ae48a165d9b48a0c6215a7feacd51b4675e8
Apr 20 07:13:04.969326 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.969246 2566 generic.go:358] "Generic (PLEG): container finished" podID="0c929536-cf7a-477d-b280-ac0b8afe40fb" containerID="41d101b561ec609a94bc9bde8dec76136e9bf9b1116ea166db2d1b0b400abe63" exitCode=0
Apr 20 07:13:04.969750 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.969360 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8" event={"ID":"0c929536-cf7a-477d-b280-ac0b8afe40fb","Type":"ContainerDied","Data":"41d101b561ec609a94bc9bde8dec76136e9bf9b1116ea166db2d1b0b400abe63"}
Apr 20 07:13:04.970497 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:04.970478 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-28vx7" event={"ID":"c7cb3dae-237e-4f51-9470-b5f9f01164df","Type":"ContainerStarted","Data":"02c3fd24fa97dced885ca999f388ae48a165d9b48a0c6215a7feacd51b4675e8"}
Apr 20 07:13:05.740752 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:05.740710 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-dj5xb"]
Apr 20 07:13:05.744257 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:05.744229 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb"
Apr 20 07:13:05.749481 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:05.749239 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 20 07:13:05.749923 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:05.749898 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-g4pn9\""
Apr 20 07:13:05.757130 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:05.756994 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-dj5xb"]
Apr 20 07:13:05.849925 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:05.849878 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45a6d0d6-b302-4436-9926-60275a4b48db-cert\") pod \"odh-model-controller-858dbf95b8-dj5xb\" (UID: \"45a6d0d6-b302-4436-9926-60275a4b48db\") " pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb"
Apr 20 07:13:05.850127 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:05.849931 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rclv9\" (UniqueName: \"kubernetes.io/projected/45a6d0d6-b302-4436-9926-60275a4b48db-kube-api-access-rclv9\") pod \"odh-model-controller-858dbf95b8-dj5xb\" (UID: \"45a6d0d6-b302-4436-9926-60275a4b48db\") " pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb"
Apr 20 07:13:05.951741 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:05.951693 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45a6d0d6-b302-4436-9926-60275a4b48db-cert\") pod \"odh-model-controller-858dbf95b8-dj5xb\" (UID: \"45a6d0d6-b302-4436-9926-60275a4b48db\") " pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb"
Apr 20 07:13:05.951896 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:05.951756 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rclv9\" (UniqueName: \"kubernetes.io/projected/45a6d0d6-b302-4436-9926-60275a4b48db-kube-api-access-rclv9\") pod \"odh-model-controller-858dbf95b8-dj5xb\" (UID: \"45a6d0d6-b302-4436-9926-60275a4b48db\") " pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb"
Apr 20 07:13:05.951896 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:13:05.951839 2566 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 20 07:13:05.951959 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:13:05.951910 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45a6d0d6-b302-4436-9926-60275a4b48db-cert podName:45a6d0d6-b302-4436-9926-60275a4b48db nodeName:}" failed. No retries permitted until 2026-04-20 07:13:06.451892526 +0000 UTC m=+629.095201659 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45a6d0d6-b302-4436-9926-60275a4b48db-cert") pod "odh-model-controller-858dbf95b8-dj5xb" (UID: "45a6d0d6-b302-4436-9926-60275a4b48db") : secret "odh-model-controller-webhook-cert" not found
Apr 20 07:13:05.968476 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:05.968441 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rclv9\" (UniqueName: \"kubernetes.io/projected/45a6d0d6-b302-4436-9926-60275a4b48db-kube-api-access-rclv9\") pod \"odh-model-controller-858dbf95b8-dj5xb\" (UID: \"45a6d0d6-b302-4436-9926-60275a4b48db\") " pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb"
Apr 20 07:13:06.457855 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.457755 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45a6d0d6-b302-4436-9926-60275a4b48db-cert\") pod \"odh-model-controller-858dbf95b8-dj5xb\" (UID: \"45a6d0d6-b302-4436-9926-60275a4b48db\") " pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb"
Apr 20 07:13:06.461018 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.460987 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45a6d0d6-b302-4436-9926-60275a4b48db-cert\") pod \"odh-model-controller-858dbf95b8-dj5xb\" (UID: \"45a6d0d6-b302-4436-9926-60275a4b48db\") " pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb"
Apr 20 07:13:06.660086 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.660034 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb"
Apr 20 07:13:06.717015 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.716917 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8"
Apr 20 07:13:06.846308 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.846273 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-dj5xb"]
Apr 20 07:13:06.850143 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:13:06.850108 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a6d0d6_b302_4436_9926_60275a4b48db.slice/crio-37b59890a3aed0b8e47744516ba31d524a36f948091c1144bc4433456710a699 WatchSource:0}: Error finding container 37b59890a3aed0b8e47744516ba31d524a36f948091c1144bc4433456710a699: Status 404 returned error can't find the container with id 37b59890a3aed0b8e47744516ba31d524a36f948091c1144bc4433456710a699
Apr 20 07:13:06.861296 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.861260 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqhvj\" (UniqueName: \"kubernetes.io/projected/0c929536-cf7a-477d-b280-ac0b8afe40fb-kube-api-access-gqhvj\") pod \"0c929536-cf7a-477d-b280-ac0b8afe40fb\" (UID: \"0c929536-cf7a-477d-b280-ac0b8afe40fb\") "
Apr 20 07:13:06.861424 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.861358 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c929536-cf7a-477d-b280-ac0b8afe40fb-bundle\") pod \"0c929536-cf7a-477d-b280-ac0b8afe40fb\" (UID: \"0c929536-cf7a-477d-b280-ac0b8afe40fb\") "
Apr 20 07:13:06.861424 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.861413 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c929536-cf7a-477d-b280-ac0b8afe40fb-util\") pod \"0c929536-cf7a-477d-b280-ac0b8afe40fb\" (UID: \"0c929536-cf7a-477d-b280-ac0b8afe40fb\") "
Apr 20 07:13:06.862186 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.862159 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c929536-cf7a-477d-b280-ac0b8afe40fb-bundle" (OuterVolumeSpecName: "bundle") pod "0c929536-cf7a-477d-b280-ac0b8afe40fb" (UID: "0c929536-cf7a-477d-b280-ac0b8afe40fb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:13:06.863463 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.863439 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c929536-cf7a-477d-b280-ac0b8afe40fb-kube-api-access-gqhvj" (OuterVolumeSpecName: "kube-api-access-gqhvj") pod "0c929536-cf7a-477d-b280-ac0b8afe40fb" (UID: "0c929536-cf7a-477d-b280-ac0b8afe40fb"). InnerVolumeSpecName "kube-api-access-gqhvj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:13:06.866839 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.866811 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c929536-cf7a-477d-b280-ac0b8afe40fb-util" (OuterVolumeSpecName: "util") pod "0c929536-cf7a-477d-b280-ac0b8afe40fb" (UID: "0c929536-cf7a-477d-b280-ac0b8afe40fb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:13:06.962517 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.962475 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c929536-cf7a-477d-b280-ac0b8afe40fb-util\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:13:06.962517 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.962511 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gqhvj\" (UniqueName: \"kubernetes.io/projected/0c929536-cf7a-477d-b280-ac0b8afe40fb-kube-api-access-gqhvj\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:13:06.962517 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.962521 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c929536-cf7a-477d-b280-ac0b8afe40fb-bundle\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:13:06.979591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.979552 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8" event={"ID":"0c929536-cf7a-477d-b280-ac0b8afe40fb","Type":"ContainerDied","Data":"1a950180ab84689d7d8663e9763af62c56afe33f56324fe64c41a5582a0c5ad9"}
Apr 20 07:13:06.979591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.979576 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hr4z8"
Apr 20 07:13:06.979591 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.979591 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a950180ab84689d7d8663e9763af62c56afe33f56324fe64c41a5582a0c5ad9"
Apr 20 07:13:06.980726 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:06.980697 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb" event={"ID":"45a6d0d6-b302-4436-9926-60275a4b48db","Type":"ContainerStarted","Data":"37b59890a3aed0b8e47744516ba31d524a36f948091c1144bc4433456710a699"}
Apr 20 07:13:09.995537 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:09.995452 2566 generic.go:358] "Generic (PLEG): container finished" podID="45a6d0d6-b302-4436-9926-60275a4b48db" containerID="79390a312267c292590e97bfea987eee02e21f7626879e86bf7461a88e2521cf" exitCode=1
Apr 20 07:13:09.995942 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:09.995520 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb" event={"ID":"45a6d0d6-b302-4436-9926-60275a4b48db","Type":"ContainerDied","Data":"79390a312267c292590e97bfea987eee02e21f7626879e86bf7461a88e2521cf"}
Apr 20 07:13:09.995942 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:09.995742 2566 scope.go:117] "RemoveContainer" containerID="79390a312267c292590e97bfea987eee02e21f7626879e86bf7461a88e2521cf"
Apr 20 07:13:09.997244 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:09.997214 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-28vx7" event={"ID":"c7cb3dae-237e-4f51-9470-b5f9f01164df","Type":"ContainerStarted","Data":"4d98efac3034b10bb49e8dcc8fa0ba7168b37eeb4630f567f2c4c2c068b70cc6"}
Apr 20 07:13:10.081204 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:10.081146 2566 pod_startup_latency_tracker.go:104] "Observed pod
startup duration" pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-28vx7" podStartSLOduration=1.130706308 podStartE2EDuration="6.081128646s" podCreationTimestamp="2026-04-20 07:13:04 +0000 UTC" firstStartedPulling="2026-04-20 07:13:04.706453987 +0000 UTC m=+627.349763115" lastFinishedPulling="2026-04-20 07:13:09.656876324 +0000 UTC m=+632.300185453" observedRunningTime="2026-04-20 07:13:10.0802656 +0000 UTC m=+632.723574753" watchObservedRunningTime="2026-04-20 07:13:10.081128646 +0000 UTC m=+632.724437796" Apr 20 07:13:11.002895 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.002861 2566 generic.go:358] "Generic (PLEG): container finished" podID="45a6d0d6-b302-4436-9926-60275a4b48db" containerID="59241057bb03b377abc27a74e5e9e2627b919658bf854deeca23569f16d53d16" exitCode=1 Apr 20 07:13:11.003330 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.002950 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb" event={"ID":"45a6d0d6-b302-4436-9926-60275a4b48db","Type":"ContainerDied","Data":"59241057bb03b377abc27a74e5e9e2627b919658bf854deeca23569f16d53d16"} Apr 20 07:13:11.003330 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.002996 2566 scope.go:117] "RemoveContainer" containerID="79390a312267c292590e97bfea987eee02e21f7626879e86bf7461a88e2521cf" Apr 20 07:13:11.003330 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.003214 2566 scope.go:117] "RemoveContainer" containerID="59241057bb03b377abc27a74e5e9e2627b919658bf854deeca23569f16d53d16" Apr 20 07:13:11.003492 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:13:11.003440 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-dj5xb_opendatahub(45a6d0d6-b302-4436-9926-60275a4b48db)\"" pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb" podUID="45a6d0d6-b302-4436-9926-60275a4b48db" Apr 20 
07:13:11.380337 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.380257 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-ntjtp"] Apr 20 07:13:11.380623 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.380611 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c929536-cf7a-477d-b280-ac0b8afe40fb" containerName="pull" Apr 20 07:13:11.380664 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.380625 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c929536-cf7a-477d-b280-ac0b8afe40fb" containerName="pull" Apr 20 07:13:11.380664 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.380641 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c929536-cf7a-477d-b280-ac0b8afe40fb" containerName="extract" Apr 20 07:13:11.380664 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.380646 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c929536-cf7a-477d-b280-ac0b8afe40fb" containerName="extract" Apr 20 07:13:11.380749 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.380663 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c929536-cf7a-477d-b280-ac0b8afe40fb" containerName="util" Apr 20 07:13:11.380749 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.380672 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c929536-cf7a-477d-b280-ac0b8afe40fb" containerName="util" Apr 20 07:13:11.380749 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.380736 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c929536-cf7a-477d-b280-ac0b8afe40fb" containerName="extract" Apr 20 07:13:11.384950 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.384927 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-ntjtp" Apr 20 07:13:11.389053 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.389033 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 20 07:13:11.389168 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.389144 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-xldwg\"" Apr 20 07:13:11.410717 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.410695 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-ntjtp"] Apr 20 07:13:11.502031 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.501994 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6xb4\" (UniqueName: \"kubernetes.io/projected/7be0292d-114f-4565-a66f-a94b5341413c-kube-api-access-s6xb4\") pod \"kserve-controller-manager-856948b99f-ntjtp\" (UID: \"7be0292d-114f-4565-a66f-a94b5341413c\") " pod="opendatahub/kserve-controller-manager-856948b99f-ntjtp" Apr 20 07:13:11.502205 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.502080 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7be0292d-114f-4565-a66f-a94b5341413c-cert\") pod \"kserve-controller-manager-856948b99f-ntjtp\" (UID: \"7be0292d-114f-4565-a66f-a94b5341413c\") " pod="opendatahub/kserve-controller-manager-856948b99f-ntjtp" Apr 20 07:13:11.602739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.602708 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7be0292d-114f-4565-a66f-a94b5341413c-cert\") pod \"kserve-controller-manager-856948b99f-ntjtp\" (UID: \"7be0292d-114f-4565-a66f-a94b5341413c\") " 
pod="opendatahub/kserve-controller-manager-856948b99f-ntjtp" Apr 20 07:13:11.602886 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.602779 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6xb4\" (UniqueName: \"kubernetes.io/projected/7be0292d-114f-4565-a66f-a94b5341413c-kube-api-access-s6xb4\") pod \"kserve-controller-manager-856948b99f-ntjtp\" (UID: \"7be0292d-114f-4565-a66f-a94b5341413c\") " pod="opendatahub/kserve-controller-manager-856948b99f-ntjtp" Apr 20 07:13:11.602886 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:13:11.602876 2566 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 20 07:13:11.602967 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:13:11.602945 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7be0292d-114f-4565-a66f-a94b5341413c-cert podName:7be0292d-114f-4565-a66f-a94b5341413c nodeName:}" failed. No retries permitted until 2026-04-20 07:13:12.102929593 +0000 UTC m=+634.746238721 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7be0292d-114f-4565-a66f-a94b5341413c-cert") pod "kserve-controller-manager-856948b99f-ntjtp" (UID: "7be0292d-114f-4565-a66f-a94b5341413c") : secret "kserve-webhook-server-cert" not found Apr 20 07:13:11.621462 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:11.621432 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6xb4\" (UniqueName: \"kubernetes.io/projected/7be0292d-114f-4565-a66f-a94b5341413c-kube-api-access-s6xb4\") pod \"kserve-controller-manager-856948b99f-ntjtp\" (UID: \"7be0292d-114f-4565-a66f-a94b5341413c\") " pod="opendatahub/kserve-controller-manager-856948b99f-ntjtp" Apr 20 07:13:12.007828 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:12.007802 2566 scope.go:117] "RemoveContainer" containerID="59241057bb03b377abc27a74e5e9e2627b919658bf854deeca23569f16d53d16" Apr 20 07:13:12.008218 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:13:12.007976 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-dj5xb_opendatahub(45a6d0d6-b302-4436-9926-60275a4b48db)\"" pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb" podUID="45a6d0d6-b302-4436-9926-60275a4b48db" Apr 20 07:13:12.107305 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:12.107269 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7be0292d-114f-4565-a66f-a94b5341413c-cert\") pod \"kserve-controller-manager-856948b99f-ntjtp\" (UID: \"7be0292d-114f-4565-a66f-a94b5341413c\") " pod="opendatahub/kserve-controller-manager-856948b99f-ntjtp" Apr 20 07:13:12.109849 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:12.109818 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/7be0292d-114f-4565-a66f-a94b5341413c-cert\") pod \"kserve-controller-manager-856948b99f-ntjtp\" (UID: \"7be0292d-114f-4565-a66f-a94b5341413c\") " pod="opendatahub/kserve-controller-manager-856948b99f-ntjtp" Apr 20 07:13:12.294802 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:12.294687 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-ntjtp" Apr 20 07:13:12.424248 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:12.424215 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-ntjtp"] Apr 20 07:13:12.425561 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:13:12.425523 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7be0292d_114f_4565_a66f_a94b5341413c.slice/crio-7d0b40a509964b3637a5e7913ecce4d682858e08bdc1de011705d5bbe656f8c6 WatchSource:0}: Error finding container 7d0b40a509964b3637a5e7913ecce4d682858e08bdc1de011705d5bbe656f8c6: Status 404 returned error can't find the container with id 7d0b40a509964b3637a5e7913ecce4d682858e08bdc1de011705d5bbe656f8c6 Apr 20 07:13:13.015585 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:13.015548 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-ntjtp" event={"ID":"7be0292d-114f-4565-a66f-a94b5341413c","Type":"ContainerStarted","Data":"7d0b40a509964b3637a5e7913ecce4d682858e08bdc1de011705d5bbe656f8c6"} Apr 20 07:13:16.028145 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:16.028103 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-ntjtp" event={"ID":"7be0292d-114f-4565-a66f-a94b5341413c","Type":"ContainerStarted","Data":"d2dd231a26baed18674b0dd86130c999583ed79f4419c6b54684dd088d945748"} Apr 20 07:13:16.028509 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:16.028167 2566 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-ntjtp" Apr 20 07:13:16.111816 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:16.111759 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-ntjtp" podStartSLOduration=2.437734005 podStartE2EDuration="5.111742002s" podCreationTimestamp="2026-04-20 07:13:11 +0000 UTC" firstStartedPulling="2026-04-20 07:13:12.426822021 +0000 UTC m=+635.070131150" lastFinishedPulling="2026-04-20 07:13:15.100830006 +0000 UTC m=+637.744139147" observedRunningTime="2026-04-20 07:13:16.110240452 +0000 UTC m=+638.753549603" watchObservedRunningTime="2026-04-20 07:13:16.111742002 +0000 UTC m=+638.755051153" Apr 20 07:13:16.660431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:16.660374 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb" Apr 20 07:13:16.660767 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:16.660753 2566 scope.go:117] "RemoveContainer" containerID="59241057bb03b377abc27a74e5e9e2627b919658bf854deeca23569f16d53d16" Apr 20 07:13:16.660953 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:13:16.660937 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-dj5xb_opendatahub(45a6d0d6-b302-4436-9926-60275a4b48db)\"" pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb" podUID="45a6d0d6-b302-4436-9926-60275a4b48db" Apr 20 07:13:17.523550 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:17.523511 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9"] Apr 20 07:13:17.527109 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:17.527090 2566 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" Apr 20 07:13:17.547735 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:17.547709 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 07:13:17.547859 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:17.547761 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 07:13:17.548703 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:17.548687 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lzfnq\"" Apr 20 07:13:17.642129 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:17.642095 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9"] Apr 20 07:13:17.653565 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:17.653535 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/770f40cc-ab4a-4e19-a31e-69583730b499-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9\" (UID: \"770f40cc-ab4a-4e19-a31e-69583730b499\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" Apr 20 07:13:17.653701 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:17.653595 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhvbg\" (UniqueName: \"kubernetes.io/projected/770f40cc-ab4a-4e19-a31e-69583730b499-kube-api-access-jhvbg\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9\" (UID: \"770f40cc-ab4a-4e19-a31e-69583730b499\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" Apr 20 07:13:17.653701 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:17.653621 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/770f40cc-ab4a-4e19-a31e-69583730b499-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9\" (UID: \"770f40cc-ab4a-4e19-a31e-69583730b499\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" Apr 20 07:13:17.754741 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:17.754697 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/770f40cc-ab4a-4e19-a31e-69583730b499-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9\" (UID: \"770f40cc-ab4a-4e19-a31e-69583730b499\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" Apr 20 07:13:17.754947 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:17.754749 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhvbg\" (UniqueName: \"kubernetes.io/projected/770f40cc-ab4a-4e19-a31e-69583730b499-kube-api-access-jhvbg\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9\" (UID: \"770f40cc-ab4a-4e19-a31e-69583730b499\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" Apr 20 07:13:17.754947 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:17.754778 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/770f40cc-ab4a-4e19-a31e-69583730b499-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9\" (UID: \"770f40cc-ab4a-4e19-a31e-69583730b499\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" Apr 20 07:13:17.755252 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:17.755231 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/770f40cc-ab4a-4e19-a31e-69583730b499-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9\" (UID: \"770f40cc-ab4a-4e19-a31e-69583730b499\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" Apr 20 07:13:17.755314 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:17.755230 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/770f40cc-ab4a-4e19-a31e-69583730b499-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9\" (UID: \"770f40cc-ab4a-4e19-a31e-69583730b499\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" Apr 20 07:13:17.774171 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:17.774098 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhvbg\" (UniqueName: \"kubernetes.io/projected/770f40cc-ab4a-4e19-a31e-69583730b499-kube-api-access-jhvbg\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9\" (UID: \"770f40cc-ab4a-4e19-a31e-69583730b499\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" Apr 20 07:13:17.835829 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:17.835792 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" Apr 20 07:13:18.034178 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:18.034152 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9"] Apr 20 07:13:18.035915 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:13:18.035885 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod770f40cc_ab4a_4e19_a31e_69583730b499.slice/crio-29af32796dcec80713e7457926a19cd7641f34017a3322601f7ee10f7e63549f WatchSource:0}: Error finding container 29af32796dcec80713e7457926a19cd7641f34017a3322601f7ee10f7e63549f: Status 404 returned error can't find the container with id 29af32796dcec80713e7457926a19cd7641f34017a3322601f7ee10f7e63549f Apr 20 07:13:19.040021 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:19.039929 2566 generic.go:358] "Generic (PLEG): container finished" podID="770f40cc-ab4a-4e19-a31e-69583730b499" containerID="5abf304f0bea494ce9f2f8b48ba85affebdfb16a40818fafed04e88cd520967c" exitCode=0 Apr 20 07:13:19.040021 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:19.039995 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" event={"ID":"770f40cc-ab4a-4e19-a31e-69583730b499","Type":"ContainerDied","Data":"5abf304f0bea494ce9f2f8b48ba85affebdfb16a40818fafed04e88cd520967c"} Apr 20 07:13:19.040499 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:19.040023 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" event={"ID":"770f40cc-ab4a-4e19-a31e-69583730b499","Type":"ContainerStarted","Data":"29af32796dcec80713e7457926a19cd7641f34017a3322601f7ee10f7e63549f"} Apr 20 07:13:19.885438 ip-10-0-142-100 kubenswrapper[2566]: 
I0420 07:13:19.885402 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-vpffh"] Apr 20 07:13:19.888714 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:19.888697 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vpffh" Apr 20 07:13:19.900689 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:19.900661 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 20 07:13:19.900817 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:19.900744 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-sb5rw\"" Apr 20 07:13:19.901511 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:19.901496 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 20 07:13:19.920804 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:19.920768 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-vpffh"] Apr 20 07:13:19.971441 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:19.971352 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t8dg\" (UniqueName: \"kubernetes.io/projected/2b8bda74-0569-45f8-8f82-dbc4a41bd4d8-kube-api-access-9t8dg\") pod \"servicemesh-operator3-55f49c5f94-vpffh\" (UID: \"2b8bda74-0569-45f8-8f82-dbc4a41bd4d8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vpffh" Apr 20 07:13:19.971441 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:19.971396 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/2b8bda74-0569-45f8-8f82-dbc4a41bd4d8-operator-config\") pod 
\"servicemesh-operator3-55f49c5f94-vpffh\" (UID: \"2b8bda74-0569-45f8-8f82-dbc4a41bd4d8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vpffh" Apr 20 07:13:20.045193 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:20.045158 2566 generic.go:358] "Generic (PLEG): container finished" podID="770f40cc-ab4a-4e19-a31e-69583730b499" containerID="f331ae93fa79806d79d183ef5e5162cafc03d0e225add4e3368d69da59eefc5a" exitCode=0 Apr 20 07:13:20.045589 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:20.045206 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" event={"ID":"770f40cc-ab4a-4e19-a31e-69583730b499","Type":"ContainerDied","Data":"f331ae93fa79806d79d183ef5e5162cafc03d0e225add4e3368d69da59eefc5a"} Apr 20 07:13:20.072048 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:20.072023 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9t8dg\" (UniqueName: \"kubernetes.io/projected/2b8bda74-0569-45f8-8f82-dbc4a41bd4d8-kube-api-access-9t8dg\") pod \"servicemesh-operator3-55f49c5f94-vpffh\" (UID: \"2b8bda74-0569-45f8-8f82-dbc4a41bd4d8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vpffh" Apr 20 07:13:20.072151 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:20.072079 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/2b8bda74-0569-45f8-8f82-dbc4a41bd4d8-operator-config\") pod \"servicemesh-operator3-55f49c5f94-vpffh\" (UID: \"2b8bda74-0569-45f8-8f82-dbc4a41bd4d8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vpffh" Apr 20 07:13:20.074762 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:20.074735 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/2b8bda74-0569-45f8-8f82-dbc4a41bd4d8-operator-config\") pod 
\"servicemesh-operator3-55f49c5f94-vpffh\" (UID: \"2b8bda74-0569-45f8-8f82-dbc4a41bd4d8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vpffh" Apr 20 07:13:20.103044 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:20.103013 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t8dg\" (UniqueName: \"kubernetes.io/projected/2b8bda74-0569-45f8-8f82-dbc4a41bd4d8-kube-api-access-9t8dg\") pod \"servicemesh-operator3-55f49c5f94-vpffh\" (UID: \"2b8bda74-0569-45f8-8f82-dbc4a41bd4d8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vpffh" Apr 20 07:13:20.197544 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:20.197507 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vpffh" Apr 20 07:13:20.429979 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:20.429954 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-vpffh"] Apr 20 07:13:20.432756 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:13:20.432728 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b8bda74_0569_45f8_8f82_dbc4a41bd4d8.slice/crio-b29d0641d4cc166a2596dfeb980a53ef4cc18e788519d5234dff257d6eae16c0 WatchSource:0}: Error finding container b29d0641d4cc166a2596dfeb980a53ef4cc18e788519d5234dff257d6eae16c0: Status 404 returned error can't find the container with id b29d0641d4cc166a2596dfeb980a53ef4cc18e788519d5234dff257d6eae16c0 Apr 20 07:13:21.053713 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:21.053675 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vpffh" event={"ID":"2b8bda74-0569-45f8-8f82-dbc4a41bd4d8","Type":"ContainerStarted","Data":"b29d0641d4cc166a2596dfeb980a53ef4cc18e788519d5234dff257d6eae16c0"} Apr 20 07:13:21.055416 ip-10-0-142-100 kubenswrapper[2566]: 
I0420 07:13:21.055392 2566 generic.go:358] "Generic (PLEG): container finished" podID="770f40cc-ab4a-4e19-a31e-69583730b499" containerID="3ba97efea9ade21d503b1157f8886426bd42654bbe3423888ff12f35b366438b" exitCode=0 Apr 20 07:13:21.055545 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:21.055438 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" event={"ID":"770f40cc-ab4a-4e19-a31e-69583730b499","Type":"ContainerDied","Data":"3ba97efea9ade21d503b1157f8886426bd42654bbe3423888ff12f35b366438b"} Apr 20 07:13:22.919274 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:22.919246 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" Apr 20 07:13:22.996285 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:22.996256 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhvbg\" (UniqueName: \"kubernetes.io/projected/770f40cc-ab4a-4e19-a31e-69583730b499-kube-api-access-jhvbg\") pod \"770f40cc-ab4a-4e19-a31e-69583730b499\" (UID: \"770f40cc-ab4a-4e19-a31e-69583730b499\") " Apr 20 07:13:22.996434 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:22.996314 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/770f40cc-ab4a-4e19-a31e-69583730b499-util\") pod \"770f40cc-ab4a-4e19-a31e-69583730b499\" (UID: \"770f40cc-ab4a-4e19-a31e-69583730b499\") " Apr 20 07:13:22.996434 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:22.996335 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/770f40cc-ab4a-4e19-a31e-69583730b499-bundle\") pod \"770f40cc-ab4a-4e19-a31e-69583730b499\" (UID: \"770f40cc-ab4a-4e19-a31e-69583730b499\") " Apr 20 07:13:22.997299 ip-10-0-142-100 
kubenswrapper[2566]: I0420 07:13:22.997269 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/770f40cc-ab4a-4e19-a31e-69583730b499-bundle" (OuterVolumeSpecName: "bundle") pod "770f40cc-ab4a-4e19-a31e-69583730b499" (UID: "770f40cc-ab4a-4e19-a31e-69583730b499"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:13:22.998508 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:22.998479 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/770f40cc-ab4a-4e19-a31e-69583730b499-kube-api-access-jhvbg" (OuterVolumeSpecName: "kube-api-access-jhvbg") pod "770f40cc-ab4a-4e19-a31e-69583730b499" (UID: "770f40cc-ab4a-4e19-a31e-69583730b499"). InnerVolumeSpecName "kube-api-access-jhvbg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:13:23.001473 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:23.001446 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/770f40cc-ab4a-4e19-a31e-69583730b499-util" (OuterVolumeSpecName: "util") pod "770f40cc-ab4a-4e19-a31e-69583730b499" (UID: "770f40cc-ab4a-4e19-a31e-69583730b499"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:13:23.066398 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:23.066359 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" event={"ID":"770f40cc-ab4a-4e19-a31e-69583730b499","Type":"ContainerDied","Data":"29af32796dcec80713e7457926a19cd7641f34017a3322601f7ee10f7e63549f"} Apr 20 07:13:23.066398 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:23.066404 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29af32796dcec80713e7457926a19cd7641f34017a3322601f7ee10f7e63549f" Apr 20 07:13:23.066638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:23.066373 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bmsn9" Apr 20 07:13:23.068082 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:23.068026 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vpffh" event={"ID":"2b8bda74-0569-45f8-8f82-dbc4a41bd4d8","Type":"ContainerStarted","Data":"3d22e4f2e396b983f1e10452c535ebbe4c89b4688bed1ef125d008e83b430070"} Apr 20 07:13:23.068226 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:23.068140 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vpffh" Apr 20 07:13:23.091912 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:23.091848 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vpffh" podStartSLOduration=1.557426231 podStartE2EDuration="4.09182865s" podCreationTimestamp="2026-04-20 07:13:19 +0000 UTC" firstStartedPulling="2026-04-20 07:13:20.435412072 +0000 UTC m=+643.078721200" lastFinishedPulling="2026-04-20 07:13:22.969814488 +0000 UTC 
m=+645.613123619" observedRunningTime="2026-04-20 07:13:23.090762048 +0000 UTC m=+645.734071199" watchObservedRunningTime="2026-04-20 07:13:23.09182865 +0000 UTC m=+645.735137803" Apr 20 07:13:23.097235 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:23.097207 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/770f40cc-ab4a-4e19-a31e-69583730b499-util\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:13:23.097235 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:23.097238 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/770f40cc-ab4a-4e19-a31e-69583730b499-bundle\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:13:23.097417 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:23.097254 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jhvbg\" (UniqueName: \"kubernetes.io/projected/770f40cc-ab4a-4e19-a31e-69583730b499-kube-api-access-jhvbg\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:13:26.660433 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:26.660387 2566 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb" Apr 20 07:13:26.660832 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:26.660788 2566 scope.go:117] "RemoveContainer" containerID="59241057bb03b377abc27a74e5e9e2627b919658bf854deeca23569f16d53d16" Apr 20 07:13:27.085424 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:27.085386 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb" event={"ID":"45a6d0d6-b302-4436-9926-60275a4b48db","Type":"ContainerStarted","Data":"2f8a487e2c8ebcd77e37dede5030bb12083ee659b75142a79e883ccdb6bf7df3"} Apr 20 07:13:27.085597 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:27.085589 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb" Apr 20 07:13:27.117143 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:27.117087 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb" podStartSLOduration=2.048520839 podStartE2EDuration="22.117071545s" podCreationTimestamp="2026-04-20 07:13:05 +0000 UTC" firstStartedPulling="2026-04-20 07:13:06.851387868 +0000 UTC m=+629.494696997" lastFinishedPulling="2026-04-20 07:13:26.919938575 +0000 UTC m=+649.563247703" observedRunningTime="2026-04-20 07:13:27.114272618 +0000 UTC m=+649.757581768" watchObservedRunningTime="2026-04-20 07:13:27.117071545 +0000 UTC m=+649.760380685" Apr 20 07:13:34.075447 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.075410 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vpffh" Apr 20 07:13:34.396612 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.396521 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5"] Apr 20 07:13:34.396878 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.396865 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="770f40cc-ab4a-4e19-a31e-69583730b499" containerName="util" Apr 20 07:13:34.396935 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.396880 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="770f40cc-ab4a-4e19-a31e-69583730b499" containerName="util" Apr 20 07:13:34.396935 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.396888 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="770f40cc-ab4a-4e19-a31e-69583730b499" containerName="extract" Apr 20 07:13:34.396935 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.396904 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="770f40cc-ab4a-4e19-a31e-69583730b499" 
containerName="extract" Apr 20 07:13:34.396935 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.396926 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="770f40cc-ab4a-4e19-a31e-69583730b499" containerName="pull" Apr 20 07:13:34.396935 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.396932 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="770f40cc-ab4a-4e19-a31e-69583730b499" containerName="pull" Apr 20 07:13:34.397128 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.396987 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="770f40cc-ab4a-4e19-a31e-69583730b499" containerName="extract" Apr 20 07:13:34.401593 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.401561 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.405662 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.405636 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 07:13:34.405834 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.405779 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 20 07:13:34.405939 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.405902 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 20 07:13:34.406053 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.406034 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-97hdg\"" Apr 20 07:13:34.406053 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.406045 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 20 07:13:34.413188 ip-10-0-142-100 kubenswrapper[2566]: I0420 
07:13:34.413164 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5"] Apr 20 07:13:34.494290 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.494252 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.494290 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.494297 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.494542 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.494329 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.494542 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.494411 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4tkx\" (UniqueName: \"kubernetes.io/projected/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-kube-api-access-t4tkx\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.494542 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.494454 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.494542 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.494497 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.494542 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.494518 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.596203 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.595792 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.596365 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.596273 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4tkx\" (UniqueName: \"kubernetes.io/projected/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-kube-api-access-t4tkx\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.596365 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.596328 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.596441 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.596369 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.596441 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.596408 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.596532 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.596478 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" 
(UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.596586 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.596514 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.596586 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.596524 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.599214 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.599184 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.599357 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.599184 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.599495 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.599476 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.599630 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.599613 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.614770 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.614720 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4tkx\" (UniqueName: \"kubernetes.io/projected/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-kube-api-access-t4tkx\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.616324 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.616291 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/83244e9c-7094-4f0b-93a3-67dd81ac0ad6-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-mfzf5\" (UID: \"83244e9c-7094-4f0b-93a3-67dd81ac0ad6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.712156 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.712024 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:34.881323 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:34.881294 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5"] Apr 20 07:13:34.882238 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:13:34.882215 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83244e9c_7094_4f0b_93a3_67dd81ac0ad6.slice/crio-2766a8b2898143aa1334d8c67aa475c9fbd6c32451c5ab5a28aa310c71e8a47a WatchSource:0}: Error finding container 2766a8b2898143aa1334d8c67aa475c9fbd6c32451c5ab5a28aa310c71e8a47a: Status 404 returned error can't find the container with id 2766a8b2898143aa1334d8c67aa475c9fbd6c32451c5ab5a28aa310c71e8a47a Apr 20 07:13:35.121251 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:35.121199 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" event={"ID":"83244e9c-7094-4f0b-93a3-67dd81ac0ad6","Type":"ContainerStarted","Data":"2766a8b2898143aa1334d8c67aa475c9fbd6c32451c5ab5a28aa310c71e8a47a"} Apr 20 07:13:37.486939 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:37.486901 2566 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 20 07:13:37.487179 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:37.486973 2566 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 20 07:13:38.094746 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:38.094715 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-dj5xb" Apr 20 07:13:38.137033 ip-10-0-142-100 
kubenswrapper[2566]: I0420 07:13:38.136992 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" event={"ID":"83244e9c-7094-4f0b-93a3-67dd81ac0ad6","Type":"ContainerStarted","Data":"a58658d020d774e6cc773b2c7c5cc4854e3592b7dcd18c36a6073ff697ecef71"} Apr 20 07:13:38.137359 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:38.137338 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:38.138771 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:38.138737 2566 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-mfzf5 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 20 07:13:38.138897 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:38.138793 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" podUID="83244e9c-7094-4f0b-93a3-67dd81ac0ad6" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 07:13:38.164392 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:38.164320 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" podStartSLOduration=1.56236482 podStartE2EDuration="4.164297978s" podCreationTimestamp="2026-04-20 07:13:34 +0000 UTC" firstStartedPulling="2026-04-20 07:13:34.884727581 +0000 UTC m=+657.528036709" lastFinishedPulling="2026-04-20 07:13:37.486660739 +0000 UTC m=+660.129969867" observedRunningTime="2026-04-20 07:13:38.161042112 +0000 UTC m=+660.804351261" watchObservedRunningTime="2026-04-20 07:13:38.164297978 +0000 UTC m=+660.807607132" Apr 20 07:13:39.141482 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:39.141440 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mfzf5" Apr 20 07:13:47.036632 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:13:47.036600 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-ntjtp" Apr 20 07:14:17.965161 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:17.965122 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb"] Apr 20 07:14:17.973967 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:17.973937 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb" Apr 20 07:14:17.976427 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:17.976398 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb"] Apr 20 07:14:17.976588 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:17.976570 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 07:14:17.977696 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:17.977674 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 07:14:17.977808 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:17.977752 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-4sdjw\"" Apr 20 07:14:18.073418 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.073382 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/30a3245f-2a4e-4c26-8b51-a76e2c24331f-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb\" (UID: 
\"30a3245f-2a4e-4c26-8b51-a76e2c24331f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb" Apr 20 07:14:18.073610 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.073452 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/30a3245f-2a4e-4c26-8b51-a76e2c24331f-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb\" (UID: \"30a3245f-2a4e-4c26-8b51-a76e2c24331f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb" Apr 20 07:14:18.073610 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.073490 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jssxh\" (UniqueName: \"kubernetes.io/projected/30a3245f-2a4e-4c26-8b51-a76e2c24331f-kube-api-access-jssxh\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb\" (UID: \"30a3245f-2a4e-4c26-8b51-a76e2c24331f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb" Apr 20 07:14:18.174831 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.174796 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/30a3245f-2a4e-4c26-8b51-a76e2c24331f-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb\" (UID: \"30a3245f-2a4e-4c26-8b51-a76e2c24331f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb" Apr 20 07:14:18.174831 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.174843 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jssxh\" (UniqueName: \"kubernetes.io/projected/30a3245f-2a4e-4c26-8b51-a76e2c24331f-kube-api-access-jssxh\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb\" (UID: 
\"30a3245f-2a4e-4c26-8b51-a76e2c24331f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb" Apr 20 07:14:18.175087 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.174913 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/30a3245f-2a4e-4c26-8b51-a76e2c24331f-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb\" (UID: \"30a3245f-2a4e-4c26-8b51-a76e2c24331f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb" Apr 20 07:14:18.175344 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.175321 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/30a3245f-2a4e-4c26-8b51-a76e2c24331f-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb\" (UID: \"30a3245f-2a4e-4c26-8b51-a76e2c24331f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb" Apr 20 07:14:18.175389 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.175335 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/30a3245f-2a4e-4c26-8b51-a76e2c24331f-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb\" (UID: \"30a3245f-2a4e-4c26-8b51-a76e2c24331f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb" Apr 20 07:14:18.184459 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.184432 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jssxh\" (UniqueName: \"kubernetes.io/projected/30a3245f-2a4e-4c26-8b51-a76e2c24331f-kube-api-access-jssxh\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb\" (UID: \"30a3245f-2a4e-4c26-8b51-a76e2c24331f\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb" 
Apr 20 07:14:18.285257 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.285221 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb" Apr 20 07:14:18.417953 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.417797 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb"] Apr 20 07:14:18.420465 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:14:18.420429 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30a3245f_2a4e_4c26_8b51_a76e2c24331f.slice/crio-da1d2ea8d9e452431c72c025c80affe6b321025d2f342ef8fe3b11899f199ac7 WatchSource:0}: Error finding container da1d2ea8d9e452431c72c025c80affe6b321025d2f342ef8fe3b11899f199ac7: Status 404 returned error can't find the container with id da1d2ea8d9e452431c72c025c80affe6b321025d2f342ef8fe3b11899f199ac7 Apr 20 07:14:18.422349 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.422329 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 07:14:18.558388 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.558299 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt"] Apr 20 07:14:18.561903 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.561879 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt"
Apr 20 07:14:18.571143 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.571107 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt"]
Apr 20 07:14:18.577981 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.577953 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72c4f55e-7867-4831-b766-d0feea542b63-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt\" (UID: \"72c4f55e-7867-4831-b766-d0feea542b63\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt"
Apr 20 07:14:18.578135 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.577992 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgtmq\" (UniqueName: \"kubernetes.io/projected/72c4f55e-7867-4831-b766-d0feea542b63-kube-api-access-rgtmq\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt\" (UID: \"72c4f55e-7867-4831-b766-d0feea542b63\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt"
Apr 20 07:14:18.578184 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.578126 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72c4f55e-7867-4831-b766-d0feea542b63-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt\" (UID: \"72c4f55e-7867-4831-b766-d0feea542b63\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt"
Apr 20 07:14:18.678966 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.678931 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72c4f55e-7867-4831-b766-d0feea542b63-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt\" (UID: \"72c4f55e-7867-4831-b766-d0feea542b63\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt"
Apr 20 07:14:18.679163 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.678983 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72c4f55e-7867-4831-b766-d0feea542b63-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt\" (UID: \"72c4f55e-7867-4831-b766-d0feea542b63\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt"
Apr 20 07:14:18.679163 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.679009 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgtmq\" (UniqueName: \"kubernetes.io/projected/72c4f55e-7867-4831-b766-d0feea542b63-kube-api-access-rgtmq\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt\" (UID: \"72c4f55e-7867-4831-b766-d0feea542b63\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt"
Apr 20 07:14:18.679378 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.679354 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72c4f55e-7867-4831-b766-d0feea542b63-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt\" (UID: \"72c4f55e-7867-4831-b766-d0feea542b63\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt"
Apr 20 07:14:18.679480 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.679424 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72c4f55e-7867-4831-b766-d0feea542b63-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt\" (UID: \"72c4f55e-7867-4831-b766-d0feea542b63\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt"
Apr 20 07:14:18.688443 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.688420 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgtmq\" (UniqueName: \"kubernetes.io/projected/72c4f55e-7867-4831-b766-d0feea542b63-kube-api-access-rgtmq\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt\" (UID: \"72c4f55e-7867-4831-b766-d0feea542b63\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt"
Apr 20 07:14:18.901952 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.901866 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt"
Apr 20 07:14:18.952615 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.952590 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp"]
Apr 20 07:14:18.957661 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.957642 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp"
Apr 20 07:14:18.963654 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.963616 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp"]
Apr 20 07:14:18.981389 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.981361 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ec28d2fc-0988-4fc1-8296-e35886f33ba9-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp\" (UID: \"ec28d2fc-0988-4fc1-8296-e35886f33ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp"
Apr 20 07:14:18.981884 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.981408 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ec28d2fc-0988-4fc1-8296-e35886f33ba9-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp\" (UID: \"ec28d2fc-0988-4fc1-8296-e35886f33ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp"
Apr 20 07:14:18.981884 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:18.981447 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fl4n\" (UniqueName: \"kubernetes.io/projected/ec28d2fc-0988-4fc1-8296-e35886f33ba9-kube-api-access-4fl4n\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp\" (UID: \"ec28d2fc-0988-4fc1-8296-e35886f33ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp"
Apr 20 07:14:19.038396 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.038370 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt"]
Apr 20 07:14:19.040047 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:14:19.040023 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72c4f55e_7867_4831_b766_d0feea542b63.slice/crio-fab96fb03f823037a23d0c47a7d43a0a7665bb4f737e4142ebdfe3e79d574579 WatchSource:0}: Error finding container fab96fb03f823037a23d0c47a7d43a0a7665bb4f737e4142ebdfe3e79d574579: Status 404 returned error can't find the container with id fab96fb03f823037a23d0c47a7d43a0a7665bb4f737e4142ebdfe3e79d574579
Apr 20 07:14:19.082089 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.082033 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ec28d2fc-0988-4fc1-8296-e35886f33ba9-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp\" (UID: \"ec28d2fc-0988-4fc1-8296-e35886f33ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp"
Apr 20 07:14:19.082209 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.082101 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fl4n\" (UniqueName: \"kubernetes.io/projected/ec28d2fc-0988-4fc1-8296-e35886f33ba9-kube-api-access-4fl4n\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp\" (UID: \"ec28d2fc-0988-4fc1-8296-e35886f33ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp"
Apr 20 07:14:19.082209 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.082142 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ec28d2fc-0988-4fc1-8296-e35886f33ba9-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp\" (UID: \"ec28d2fc-0988-4fc1-8296-e35886f33ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp"
Apr 20 07:14:19.082499 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.082482 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ec28d2fc-0988-4fc1-8296-e35886f33ba9-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp\" (UID: \"ec28d2fc-0988-4fc1-8296-e35886f33ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp"
Apr 20 07:14:19.082550 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.082480 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ec28d2fc-0988-4fc1-8296-e35886f33ba9-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp\" (UID: \"ec28d2fc-0988-4fc1-8296-e35886f33ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp"
Apr 20 07:14:19.090311 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.090290 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fl4n\" (UniqueName: \"kubernetes.io/projected/ec28d2fc-0988-4fc1-8296-e35886f33ba9-kube-api-access-4fl4n\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp\" (UID: \"ec28d2fc-0988-4fc1-8296-e35886f33ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp"
Apr 20 07:14:19.271542 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.271505 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp"
Apr 20 07:14:19.289656 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.289621 2566 generic.go:358] "Generic (PLEG): container finished" podID="30a3245f-2a4e-4c26-8b51-a76e2c24331f" containerID="681c36b65ef9c482ee84eaa6c3ae47c1bcb13b77e2ddd7b65d8056793cd51a47" exitCode=0
Apr 20 07:14:19.289807 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.289705 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb" event={"ID":"30a3245f-2a4e-4c26-8b51-a76e2c24331f","Type":"ContainerDied","Data":"681c36b65ef9c482ee84eaa6c3ae47c1bcb13b77e2ddd7b65d8056793cd51a47"}
Apr 20 07:14:19.289807 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.289752 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb" event={"ID":"30a3245f-2a4e-4c26-8b51-a76e2c24331f","Type":"ContainerStarted","Data":"da1d2ea8d9e452431c72c025c80affe6b321025d2f342ef8fe3b11899f199ac7"}
Apr 20 07:14:19.291547 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.291522 2566 generic.go:358] "Generic (PLEG): container finished" podID="72c4f55e-7867-4831-b766-d0feea542b63" containerID="4ac636c43650a7efea5ee1e0d795426c28de8ac78b84223bf1273af4af87a301" exitCode=0
Apr 20 07:14:19.291655 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.291583 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt" event={"ID":"72c4f55e-7867-4831-b766-d0feea542b63","Type":"ContainerDied","Data":"4ac636c43650a7efea5ee1e0d795426c28de8ac78b84223bf1273af4af87a301"}
Apr 20 07:14:19.291655 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.291605 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt" event={"ID":"72c4f55e-7867-4831-b766-d0feea542b63","Type":"ContainerStarted","Data":"fab96fb03f823037a23d0c47a7d43a0a7665bb4f737e4142ebdfe3e79d574579"}
Apr 20 07:14:19.365634 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.365589 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96"]
Apr 20 07:14:19.370571 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.370547 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96"
Apr 20 07:14:19.379416 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.379386 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96"]
Apr 20 07:14:19.384600 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.384568 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f84adcb3-4653-4722-8577-04d06c6c5a59-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96\" (UID: \"f84adcb3-4653-4722-8577-04d06c6c5a59\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96"
Apr 20 07:14:19.384754 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.384737 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f84adcb3-4653-4722-8577-04d06c6c5a59-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96\" (UID: \"f84adcb3-4653-4722-8577-04d06c6c5a59\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96"
Apr 20 07:14:19.384808 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.384788 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bnm6\" (UniqueName: \"kubernetes.io/projected/f84adcb3-4653-4722-8577-04d06c6c5a59-kube-api-access-5bnm6\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96\" (UID: \"f84adcb3-4653-4722-8577-04d06c6c5a59\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96"
Apr 20 07:14:19.408671 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.408649 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp"]
Apr 20 07:14:19.410743 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:14:19.410717 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec28d2fc_0988_4fc1_8296_e35886f33ba9.slice/crio-a7aa9c98f7a1a6e081e6561f06b09184cba8fa6508da8b55f74a5099df33cb41 WatchSource:0}: Error finding container a7aa9c98f7a1a6e081e6561f06b09184cba8fa6508da8b55f74a5099df33cb41: Status 404 returned error can't find the container with id a7aa9c98f7a1a6e081e6561f06b09184cba8fa6508da8b55f74a5099df33cb41
Apr 20 07:14:19.485939 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.485455 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f84adcb3-4653-4722-8577-04d06c6c5a59-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96\" (UID: \"f84adcb3-4653-4722-8577-04d06c6c5a59\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96"
Apr 20 07:14:19.486302 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.485880 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f84adcb3-4653-4722-8577-04d06c6c5a59-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96\" (UID: \"f84adcb3-4653-4722-8577-04d06c6c5a59\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96"
Apr 20 07:14:19.486800 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.486455 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bnm6\" (UniqueName: \"kubernetes.io/projected/f84adcb3-4653-4722-8577-04d06c6c5a59-kube-api-access-5bnm6\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96\" (UID: \"f84adcb3-4653-4722-8577-04d06c6c5a59\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96"
Apr 20 07:14:19.487427 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.487027 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f84adcb3-4653-4722-8577-04d06c6c5a59-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96\" (UID: \"f84adcb3-4653-4722-8577-04d06c6c5a59\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96"
Apr 20 07:14:19.487427 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.487378 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f84adcb3-4653-4722-8577-04d06c6c5a59-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96\" (UID: \"f84adcb3-4653-4722-8577-04d06c6c5a59\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96"
Apr 20 07:14:19.494461 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.494432 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5776bd786f-mftdz"]
Apr 20 07:14:19.498622 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.498522 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.503094 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.503052 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bnm6\" (UniqueName: \"kubernetes.io/projected/f84adcb3-4653-4722-8577-04d06c6c5a59-kube-api-access-5bnm6\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96\" (UID: \"f84adcb3-4653-4722-8577-04d06c6c5a59\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96"
Apr 20 07:14:19.509731 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.509418 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5776bd786f-mftdz"]
Apr 20 07:14:19.588953 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.588913 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18482489-ee72-4ec2-be2a-db333374296c-console-serving-cert\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.588953 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.588949 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18482489-ee72-4ec2-be2a-db333374296c-console-oauth-config\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.589173 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.588973 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18482489-ee72-4ec2-be2a-db333374296c-console-config\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.589173 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.588991 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18482489-ee72-4ec2-be2a-db333374296c-oauth-serving-cert\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.589173 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.589097 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18482489-ee72-4ec2-be2a-db333374296c-service-ca\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.589173 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.589133 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18482489-ee72-4ec2-be2a-db333374296c-trusted-ca-bundle\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.589173 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.589152 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4lv4\" (UniqueName: \"kubernetes.io/projected/18482489-ee72-4ec2-be2a-db333374296c-kube-api-access-k4lv4\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.683555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.683519 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96"
Apr 20 07:14:19.689617 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.689588 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18482489-ee72-4ec2-be2a-db333374296c-console-config\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.689743 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.689624 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18482489-ee72-4ec2-be2a-db333374296c-oauth-serving-cert\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.689743 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.689653 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18482489-ee72-4ec2-be2a-db333374296c-service-ca\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.689743 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.689676 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18482489-ee72-4ec2-be2a-db333374296c-trusted-ca-bundle\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.689743 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.689702 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4lv4\" (UniqueName: \"kubernetes.io/projected/18482489-ee72-4ec2-be2a-db333374296c-kube-api-access-k4lv4\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.689958 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.689820 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18482489-ee72-4ec2-be2a-db333374296c-console-serving-cert\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.689958 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.689846 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18482489-ee72-4ec2-be2a-db333374296c-console-oauth-config\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.690855 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.690405 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18482489-ee72-4ec2-be2a-db333374296c-oauth-serving-cert\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.690855 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.690491 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18482489-ee72-4ec2-be2a-db333374296c-console-config\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.690855 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.690645 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18482489-ee72-4ec2-be2a-db333374296c-service-ca\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.691104 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.691021 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18482489-ee72-4ec2-be2a-db333374296c-trusted-ca-bundle\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.692688 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.692669 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18482489-ee72-4ec2-be2a-db333374296c-console-oauth-config\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.692789 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.692766 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18482489-ee72-4ec2-be2a-db333374296c-console-serving-cert\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.699763 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.699738 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4lv4\" (UniqueName: \"kubernetes.io/projected/18482489-ee72-4ec2-be2a-db333374296c-kube-api-access-k4lv4\") pod \"console-5776bd786f-mftdz\" (UID: \"18482489-ee72-4ec2-be2a-db333374296c\") " pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.814824 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.814786 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5776bd786f-mftdz"
Apr 20 07:14:19.831026 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.830998 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96"]
Apr 20 07:14:19.833853 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:14:19.833818 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf84adcb3_4653_4722_8577_04d06c6c5a59.slice/crio-8ced3fa065ad19fee009d43f09f53d296f459f870dc5f1a4631b615f396a46ad WatchSource:0}: Error finding container 8ced3fa065ad19fee009d43f09f53d296f459f870dc5f1a4631b615f396a46ad: Status 404 returned error can't find the container with id 8ced3fa065ad19fee009d43f09f53d296f459f870dc5f1a4631b615f396a46ad
Apr 20 07:14:19.962808 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:19.962778 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5776bd786f-mftdz"]
Apr 20 07:14:20.058853 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:14:20.058814 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18482489_ee72_4ec2_be2a_db333374296c.slice/crio-059ceb9ee168691f623a554e8b13e2eac45445af227d106049a97eb83f33139d WatchSource:0}: Error finding container 059ceb9ee168691f623a554e8b13e2eac45445af227d106049a97eb83f33139d: Status 404 returned error can't find the container with id 059ceb9ee168691f623a554e8b13e2eac45445af227d106049a97eb83f33139d
Apr 20 07:14:20.296925 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:20.296883 2566 generic.go:358] "Generic (PLEG): container finished" podID="f84adcb3-4653-4722-8577-04d06c6c5a59" containerID="4179a29bb9872b03d8c103f6510970585253f92b17eb19fec980577ef913d1d7" exitCode=0
Apr 20 07:14:20.297106 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:20.296972 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96" event={"ID":"f84adcb3-4653-4722-8577-04d06c6c5a59","Type":"ContainerDied","Data":"4179a29bb9872b03d8c103f6510970585253f92b17eb19fec980577ef913d1d7"}
Apr 20 07:14:20.297106 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:20.297020 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96" event={"ID":"f84adcb3-4653-4722-8577-04d06c6c5a59","Type":"ContainerStarted","Data":"8ced3fa065ad19fee009d43f09f53d296f459f870dc5f1a4631b615f396a46ad"}
Apr 20 07:14:20.298829 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:20.298803 2566 generic.go:358] "Generic (PLEG): container finished" podID="30a3245f-2a4e-4c26-8b51-a76e2c24331f" containerID="d9dc7df30ad51e301619f7ddb3292055ed87559d29fb5c4e459688191b5b6cd6" exitCode=0
Apr 20 07:14:20.298961 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:20.298896 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb" event={"ID":"30a3245f-2a4e-4c26-8b51-a76e2c24331f","Type":"ContainerDied","Data":"d9dc7df30ad51e301619f7ddb3292055ed87559d29fb5c4e459688191b5b6cd6"}
Apr 20 07:14:20.300608 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:20.300569 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5776bd786f-mftdz" event={"ID":"18482489-ee72-4ec2-be2a-db333374296c","Type":"ContainerStarted","Data":"993d576384b5ea3f8934766fd989967ce06561c3aee4f6c35707ce6f37cc189c"}
Apr 20 07:14:20.300608 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:20.300592 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5776bd786f-mftdz" event={"ID":"18482489-ee72-4ec2-be2a-db333374296c","Type":"ContainerStarted","Data":"059ceb9ee168691f623a554e8b13e2eac45445af227d106049a97eb83f33139d"}
Apr 20 07:14:20.301917 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:20.301893 2566 generic.go:358] "Generic (PLEG): container finished" podID="ec28d2fc-0988-4fc1-8296-e35886f33ba9" containerID="8681e6121249cb97b5e63102ae6517724ad9871a28620035caff4850e034a511" exitCode=0
Apr 20 07:14:20.302005 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:20.301924 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp" event={"ID":"ec28d2fc-0988-4fc1-8296-e35886f33ba9","Type":"ContainerDied","Data":"8681e6121249cb97b5e63102ae6517724ad9871a28620035caff4850e034a511"}
Apr 20 07:14:20.302005 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:20.301940 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp" event={"ID":"ec28d2fc-0988-4fc1-8296-e35886f33ba9","Type":"ContainerStarted","Data":"a7aa9c98f7a1a6e081e6561f06b09184cba8fa6508da8b55f74a5099df33cb41"}
Apr 20 07:14:20.360224 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:20.360164 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5776bd786f-mftdz" podStartSLOduration=1.360148353 podStartE2EDuration="1.360148353s" podCreationTimestamp="2026-04-20 07:14:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:14:20.358719846 +0000 UTC m=+703.002028998" watchObservedRunningTime="2026-04-20 07:14:20.360148353 +0000 UTC m=+703.003457503"
Apr 20 07:14:21.308460 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:21.308424 2566 generic.go:358] "Generic (PLEG): container finished" podID="72c4f55e-7867-4831-b766-d0feea542b63" containerID="f4d371543ed92785793b2c7acd115ea804813ccf7342257bc45afccc43fcf27c" exitCode=0
Apr 20 07:14:21.308873 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:21.308503 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt" event={"ID":"72c4f55e-7867-4831-b766-d0feea542b63","Type":"ContainerDied","Data":"f4d371543ed92785793b2c7acd115ea804813ccf7342257bc45afccc43fcf27c"}
Apr 20 07:14:21.310426 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:21.310407 2566 generic.go:358] "Generic (PLEG): container finished" podID="ec28d2fc-0988-4fc1-8296-e35886f33ba9" containerID="902b615585f4434531e6950609e3cb72786dca2d14f9d1f575a3d4e373260207" exitCode=0
Apr 20 07:14:21.310527 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:21.310502 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp" event={"ID":"ec28d2fc-0988-4fc1-8296-e35886f33ba9","Type":"ContainerDied","Data":"902b615585f4434531e6950609e3cb72786dca2d14f9d1f575a3d4e373260207"}
Apr 20 07:14:21.312173 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:21.312146 2566 generic.go:358] "Generic (PLEG): container finished" podID="f84adcb3-4653-4722-8577-04d06c6c5a59" containerID="0df02919b438427fad211d261d28304a97c3b0da6f6d4796802079862dc12ed6" exitCode=0
Apr 20 07:14:21.312276 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:21.312229 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96" event={"ID":"f84adcb3-4653-4722-8577-04d06c6c5a59","Type":"ContainerDied","Data":"0df02919b438427fad211d261d28304a97c3b0da6f6d4796802079862dc12ed6"}
Apr 20 07:14:21.314421 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:21.314393 2566 generic.go:358] "Generic (PLEG): container finished" podID="30a3245f-2a4e-4c26-8b51-a76e2c24331f" containerID="1bf799917ef2f5264f39f566b64809d4ccb243b8c5a239a505048fa53afefb08" exitCode=0
Apr 20 07:14:21.314527 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:21.314459 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb" event={"ID":"30a3245f-2a4e-4c26-8b51-a76e2c24331f","Type":"ContainerDied","Data":"1bf799917ef2f5264f39f566b64809d4ccb243b8c5a239a505048fa53afefb08"}
Apr 20 07:14:22.320930 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:22.320883 2566 generic.go:358] "Generic (PLEG): container finished" podID="ec28d2fc-0988-4fc1-8296-e35886f33ba9" containerID="875f0d6f505debe6b305284b23bb8b23bc3c8ec05afa5d3cb47d29935d422014" exitCode=0
Apr 20 07:14:22.321337 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:22.320961 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp" event={"ID":"ec28d2fc-0988-4fc1-8296-e35886f33ba9","Type":"ContainerDied","Data":"875f0d6f505debe6b305284b23bb8b23bc3c8ec05afa5d3cb47d29935d422014"}
Apr 20 07:14:22.322785 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:22.322764 2566 generic.go:358] "Generic (PLEG): container finished" podID="f84adcb3-4653-4722-8577-04d06c6c5a59" containerID="61ed5c340eb838ce0bd273bc71f6327532efbda48e4b931268d9cd9e0ff0b088" exitCode=0
Apr 20 07:14:22.322887 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:22.322846 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96" event={"ID":"f84adcb3-4653-4722-8577-04d06c6c5a59","Type":"ContainerDied","Data":"61ed5c340eb838ce0bd273bc71f6327532efbda48e4b931268d9cd9e0ff0b088"}
Apr 20 07:14:22.324687 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:22.324650 2566 generic.go:358] "Generic (PLEG): container finished" podID="72c4f55e-7867-4831-b766-d0feea542b63" containerID="987ab5df64a6099c29d8f6b393b770f0fa274b6a98a4a49ac1a033513fef4c1c" exitCode=0
Apr 20 07:14:22.324787 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:22.324686 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt" event={"ID":"72c4f55e-7867-4831-b766-d0feea542b63","Type":"ContainerDied","Data":"987ab5df64a6099c29d8f6b393b770f0fa274b6a98a4a49ac1a033513fef4c1c"}
Apr 20 07:14:22.459793 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:22.459773 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb"
Apr 20 07:14:22.516761 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:22.516729 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/30a3245f-2a4e-4c26-8b51-a76e2c24331f-util\") pod \"30a3245f-2a4e-4c26-8b51-a76e2c24331f\" (UID: \"30a3245f-2a4e-4c26-8b51-a76e2c24331f\") "
Apr 20 07:14:22.516929 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:22.516784 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jssxh\" (UniqueName: \"kubernetes.io/projected/30a3245f-2a4e-4c26-8b51-a76e2c24331f-kube-api-access-jssxh\") pod \"30a3245f-2a4e-4c26-8b51-a76e2c24331f\" (UID: \"30a3245f-2a4e-4c26-8b51-a76e2c24331f\") "
Apr 20 07:14:22.516995 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:22.516942 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/30a3245f-2a4e-4c26-8b51-a76e2c24331f-bundle\") pod \"30a3245f-2a4e-4c26-8b51-a76e2c24331f\" (UID: \"30a3245f-2a4e-4c26-8b51-a76e2c24331f\") "
Apr 20 07:14:22.517507 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:22.517470 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30a3245f-2a4e-4c26-8b51-a76e2c24331f-bundle" (OuterVolumeSpecName: "bundle") pod "30a3245f-2a4e-4c26-8b51-a76e2c24331f" (UID: "30a3245f-2a4e-4c26-8b51-a76e2c24331f"). InnerVolumeSpecName "bundle".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:14:22.519204 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:22.519180 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30a3245f-2a4e-4c26-8b51-a76e2c24331f-kube-api-access-jssxh" (OuterVolumeSpecName: "kube-api-access-jssxh") pod "30a3245f-2a4e-4c26-8b51-a76e2c24331f" (UID: "30a3245f-2a4e-4c26-8b51-a76e2c24331f"). InnerVolumeSpecName "kube-api-access-jssxh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:14:22.523584 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:22.523555 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30a3245f-2a4e-4c26-8b51-a76e2c24331f-util" (OuterVolumeSpecName: "util") pod "30a3245f-2a4e-4c26-8b51-a76e2c24331f" (UID: "30a3245f-2a4e-4c26-8b51-a76e2c24331f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:14:22.617843 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:22.617768 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/30a3245f-2a4e-4c26-8b51-a76e2c24331f-bundle\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:22.617843 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:22.617792 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/30a3245f-2a4e-4c26-8b51-a76e2c24331f-util\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:22.617843 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:22.617803 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jssxh\" (UniqueName: \"kubernetes.io/projected/30a3245f-2a4e-4c26-8b51-a76e2c24331f-kube-api-access-jssxh\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:23.331341 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.331305 2566 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb" Apr 20 07:14:23.331716 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.331308 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb" event={"ID":"30a3245f-2a4e-4c26-8b51-a76e2c24331f","Type":"ContainerDied","Data":"da1d2ea8d9e452431c72c025c80affe6b321025d2f342ef8fe3b11899f199ac7"} Apr 20 07:14:23.331716 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.331440 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da1d2ea8d9e452431c72c025c80affe6b321025d2f342ef8fe3b11899f199ac7" Apr 20 07:14:23.485471 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.485446 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt" Apr 20 07:14:23.520923 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.520904 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96" Apr 20 07:14:23.524133 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.524108 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72c4f55e-7867-4831-b766-d0feea542b63-util\") pod \"72c4f55e-7867-4831-b766-d0feea542b63\" (UID: \"72c4f55e-7867-4831-b766-d0feea542b63\") " Apr 20 07:14:23.524261 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.524151 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgtmq\" (UniqueName: \"kubernetes.io/projected/72c4f55e-7867-4831-b766-d0feea542b63-kube-api-access-rgtmq\") pod \"72c4f55e-7867-4831-b766-d0feea542b63\" (UID: \"72c4f55e-7867-4831-b766-d0feea542b63\") " Apr 20 07:14:23.524261 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.524233 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72c4f55e-7867-4831-b766-d0feea542b63-bundle\") pod \"72c4f55e-7867-4831-b766-d0feea542b63\" (UID: \"72c4f55e-7867-4831-b766-d0feea542b63\") " Apr 20 07:14:23.524717 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.524696 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp" Apr 20 07:14:23.524822 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.524792 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72c4f55e-7867-4831-b766-d0feea542b63-bundle" (OuterVolumeSpecName: "bundle") pod "72c4f55e-7867-4831-b766-d0feea542b63" (UID: "72c4f55e-7867-4831-b766-d0feea542b63"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:14:23.526722 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.526698 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72c4f55e-7867-4831-b766-d0feea542b63-kube-api-access-rgtmq" (OuterVolumeSpecName: "kube-api-access-rgtmq") pod "72c4f55e-7867-4831-b766-d0feea542b63" (UID: "72c4f55e-7867-4831-b766-d0feea542b63"). InnerVolumeSpecName "kube-api-access-rgtmq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:14:23.529530 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.529506 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72c4f55e-7867-4831-b766-d0feea542b63-util" (OuterVolumeSpecName: "util") pod "72c4f55e-7867-4831-b766-d0feea542b63" (UID: "72c4f55e-7867-4831-b766-d0feea542b63"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:14:23.625348 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.625261 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fl4n\" (UniqueName: \"kubernetes.io/projected/ec28d2fc-0988-4fc1-8296-e35886f33ba9-kube-api-access-4fl4n\") pod \"ec28d2fc-0988-4fc1-8296-e35886f33ba9\" (UID: \"ec28d2fc-0988-4fc1-8296-e35886f33ba9\") " Apr 20 07:14:23.625348 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.625306 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ec28d2fc-0988-4fc1-8296-e35886f33ba9-util\") pod \"ec28d2fc-0988-4fc1-8296-e35886f33ba9\" (UID: \"ec28d2fc-0988-4fc1-8296-e35886f33ba9\") " Apr 20 07:14:23.625563 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.625359 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f84adcb3-4653-4722-8577-04d06c6c5a59-bundle\") pod 
\"f84adcb3-4653-4722-8577-04d06c6c5a59\" (UID: \"f84adcb3-4653-4722-8577-04d06c6c5a59\") " Apr 20 07:14:23.625563 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.625410 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ec28d2fc-0988-4fc1-8296-e35886f33ba9-bundle\") pod \"ec28d2fc-0988-4fc1-8296-e35886f33ba9\" (UID: \"ec28d2fc-0988-4fc1-8296-e35886f33ba9\") " Apr 20 07:14:23.625563 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.625460 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f84adcb3-4653-4722-8577-04d06c6c5a59-util\") pod \"f84adcb3-4653-4722-8577-04d06c6c5a59\" (UID: \"f84adcb3-4653-4722-8577-04d06c6c5a59\") " Apr 20 07:14:23.625563 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.625498 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bnm6\" (UniqueName: \"kubernetes.io/projected/f84adcb3-4653-4722-8577-04d06c6c5a59-kube-api-access-5bnm6\") pod \"f84adcb3-4653-4722-8577-04d06c6c5a59\" (UID: \"f84adcb3-4653-4722-8577-04d06c6c5a59\") " Apr 20 07:14:23.625851 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.625723 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72c4f55e-7867-4831-b766-d0feea542b63-util\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:23.625851 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.625740 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rgtmq\" (UniqueName: \"kubernetes.io/projected/72c4f55e-7867-4831-b766-d0feea542b63-kube-api-access-rgtmq\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:23.625851 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.625756 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/72c4f55e-7867-4831-b766-d0feea542b63-bundle\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:23.626166 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.625849 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f84adcb3-4653-4722-8577-04d06c6c5a59-bundle" (OuterVolumeSpecName: "bundle") pod "f84adcb3-4653-4722-8577-04d06c6c5a59" (UID: "f84adcb3-4653-4722-8577-04d06c6c5a59"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:14:23.626166 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.625945 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec28d2fc-0988-4fc1-8296-e35886f33ba9-bundle" (OuterVolumeSpecName: "bundle") pod "ec28d2fc-0988-4fc1-8296-e35886f33ba9" (UID: "ec28d2fc-0988-4fc1-8296-e35886f33ba9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:14:23.627668 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.627642 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec28d2fc-0988-4fc1-8296-e35886f33ba9-kube-api-access-4fl4n" (OuterVolumeSpecName: "kube-api-access-4fl4n") pod "ec28d2fc-0988-4fc1-8296-e35886f33ba9" (UID: "ec28d2fc-0988-4fc1-8296-e35886f33ba9"). InnerVolumeSpecName "kube-api-access-4fl4n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:14:23.628031 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.628013 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f84adcb3-4653-4722-8577-04d06c6c5a59-kube-api-access-5bnm6" (OuterVolumeSpecName: "kube-api-access-5bnm6") pod "f84adcb3-4653-4722-8577-04d06c6c5a59" (UID: "f84adcb3-4653-4722-8577-04d06c6c5a59"). InnerVolumeSpecName "kube-api-access-5bnm6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:14:23.631007 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.630986 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec28d2fc-0988-4fc1-8296-e35886f33ba9-util" (OuterVolumeSpecName: "util") pod "ec28d2fc-0988-4fc1-8296-e35886f33ba9" (UID: "ec28d2fc-0988-4fc1-8296-e35886f33ba9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:14:23.631350 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.631331 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f84adcb3-4653-4722-8577-04d06c6c5a59-util" (OuterVolumeSpecName: "util") pod "f84adcb3-4653-4722-8577-04d06c6c5a59" (UID: "f84adcb3-4653-4722-8577-04d06c6c5a59"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:14:23.727102 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.727051 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ec28d2fc-0988-4fc1-8296-e35886f33ba9-bundle\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:23.727102 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.727098 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f84adcb3-4653-4722-8577-04d06c6c5a59-util\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:23.727267 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.727109 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5bnm6\" (UniqueName: \"kubernetes.io/projected/f84adcb3-4653-4722-8577-04d06c6c5a59-kube-api-access-5bnm6\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:23.727267 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.727118 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4fl4n\" 
(UniqueName: \"kubernetes.io/projected/ec28d2fc-0988-4fc1-8296-e35886f33ba9-kube-api-access-4fl4n\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:23.727267 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.727127 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ec28d2fc-0988-4fc1-8296-e35886f33ba9-util\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:23.727267 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:23.727137 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f84adcb3-4653-4722-8577-04d06c6c5a59-bundle\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:24.337028 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:24.336995 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96" Apr 20 07:14:24.337500 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:24.336997 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96" event={"ID":"f84adcb3-4653-4722-8577-04d06c6c5a59","Type":"ContainerDied","Data":"8ced3fa065ad19fee009d43f09f53d296f459f870dc5f1a4631b615f396a46ad"} Apr 20 07:14:24.337500 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:24.337123 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ced3fa065ad19fee009d43f09f53d296f459f870dc5f1a4631b615f396a46ad" Apr 20 07:14:24.338874 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:24.338848 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt" Apr 20 07:14:24.338992 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:24.338845 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt" event={"ID":"72c4f55e-7867-4831-b766-d0feea542b63","Type":"ContainerDied","Data":"fab96fb03f823037a23d0c47a7d43a0a7665bb4f737e4142ebdfe3e79d574579"} Apr 20 07:14:24.338992 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:24.338984 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fab96fb03f823037a23d0c47a7d43a0a7665bb4f737e4142ebdfe3e79d574579" Apr 20 07:14:24.340783 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:24.340748 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp" event={"ID":"ec28d2fc-0988-4fc1-8296-e35886f33ba9","Type":"ContainerDied","Data":"a7aa9c98f7a1a6e081e6561f06b09184cba8fa6508da8b55f74a5099df33cb41"} Apr 20 07:14:24.340783 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:24.340770 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp" Apr 20 07:14:24.340783 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:24.340778 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7aa9c98f7a1a6e081e6561f06b09184cba8fa6508da8b55f74a5099df33cb41" Apr 20 07:14:29.815426 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:29.815385 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5776bd786f-mftdz" Apr 20 07:14:29.815900 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:29.815441 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5776bd786f-mftdz" Apr 20 07:14:29.820234 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:29.820209 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5776bd786f-mftdz" Apr 20 07:14:30.371463 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.371436 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5776bd786f-mftdz" Apr 20 07:14:30.449769 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.449728 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d8f479686-jmvqh"] Apr 20 07:14:30.719339 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719251 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2"] Apr 20 07:14:30.719689 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719672 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72c4f55e-7867-4831-b766-d0feea542b63" containerName="pull" Apr 20 07:14:30.719770 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719691 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c4f55e-7867-4831-b766-d0feea542b63" containerName="pull" 
Apr 20 07:14:30.719770 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719705 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30a3245f-2a4e-4c26-8b51-a76e2c24331f" containerName="util"
Apr 20 07:14:30.719770 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719713 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a3245f-2a4e-4c26-8b51-a76e2c24331f" containerName="util"
Apr 20 07:14:30.719770 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719722 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec28d2fc-0988-4fc1-8296-e35886f33ba9" containerName="util"
Apr 20 07:14:30.719770 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719732 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec28d2fc-0988-4fc1-8296-e35886f33ba9" containerName="util"
Apr 20 07:14:30.719770 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719744 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30a3245f-2a4e-4c26-8b51-a76e2c24331f" containerName="pull"
Apr 20 07:14:30.719770 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719752 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a3245f-2a4e-4c26-8b51-a76e2c24331f" containerName="pull"
Apr 20 07:14:30.719770 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719766 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30a3245f-2a4e-4c26-8b51-a76e2c24331f" containerName="extract"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719775 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a3245f-2a4e-4c26-8b51-a76e2c24331f" containerName="extract"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719791 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72c4f55e-7867-4831-b766-d0feea542b63" containerName="util"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719800 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c4f55e-7867-4831-b766-d0feea542b63" containerName="util"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719814 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f84adcb3-4653-4722-8577-04d06c6c5a59" containerName="util"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719821 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84adcb3-4653-4722-8577-04d06c6c5a59" containerName="util"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719836 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f84adcb3-4653-4722-8577-04d06c6c5a59" containerName="extract"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719845 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84adcb3-4653-4722-8577-04d06c6c5a59" containerName="extract"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719858 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f84adcb3-4653-4722-8577-04d06c6c5a59" containerName="pull"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719865 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84adcb3-4653-4722-8577-04d06c6c5a59" containerName="pull"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719874 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec28d2fc-0988-4fc1-8296-e35886f33ba9" containerName="extract"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719882 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec28d2fc-0988-4fc1-8296-e35886f33ba9" containerName="extract"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719894 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec28d2fc-0988-4fc1-8296-e35886f33ba9" containerName="pull"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719903 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec28d2fc-0988-4fc1-8296-e35886f33ba9" containerName="pull"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719911 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72c4f55e-7867-4831-b766-d0feea542b63" containerName="extract"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.719919 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c4f55e-7867-4831-b766-d0feea542b63" containerName="extract"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.720038 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="72c4f55e-7867-4831-b766-d0feea542b63" containerName="extract"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.720076 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="30a3245f-2a4e-4c26-8b51-a76e2c24331f" containerName="extract"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.720089 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec28d2fc-0988-4fc1-8296-e35886f33ba9" containerName="extract"
Apr 20 07:14:30.720170 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.720098 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="f84adcb3-4653-4722-8577-04d06c6c5a59" containerName="extract"
Apr 20 07:14:30.723483 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.723463 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2"
Apr 20 07:14:30.726560 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.726535 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 20 07:14:30.726677 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.726586 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 20 07:14:30.726677 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.726625 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-6mqtq\""
Apr 20 07:14:30.736005 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.735979 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2"]
Apr 20 07:14:30.787820 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.787787 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c4e2f671-886c-4201-84b2-583bdec6e0b7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" (UID: \"c4e2f671-886c-4201-84b2-583bdec6e0b7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2"
Apr 20 07:14:30.788141 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.787829 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qgt7\" (UniqueName: \"kubernetes.io/projected/c4e2f671-886c-4201-84b2-583bdec6e0b7-kube-api-access-9qgt7\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" (UID: \"c4e2f671-886c-4201-84b2-583bdec6e0b7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2"
Apr 20 07:14:30.888792 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.888756 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c4e2f671-886c-4201-84b2-583bdec6e0b7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" (UID: \"c4e2f671-886c-4201-84b2-583bdec6e0b7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2"
Apr 20 07:14:30.889243 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.888806 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qgt7\" (UniqueName: \"kubernetes.io/projected/c4e2f671-886c-4201-84b2-583bdec6e0b7-kube-api-access-9qgt7\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" (UID: \"c4e2f671-886c-4201-84b2-583bdec6e0b7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2"
Apr 20 07:14:30.889243 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.889191 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c4e2f671-886c-4201-84b2-583bdec6e0b7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" (UID: \"c4e2f671-886c-4201-84b2-583bdec6e0b7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2"
Apr 20 07:14:30.900334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:30.900307 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qgt7\" (UniqueName: \"kubernetes.io/projected/c4e2f671-886c-4201-84b2-583bdec6e0b7-kube-api-access-9qgt7\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" (UID: \"c4e2f671-886c-4201-84b2-583bdec6e0b7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2"
Apr 20 07:14:31.034854 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:31.034823 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2"
Apr 20 07:14:31.172622 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:31.172597 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2"]
Apr 20 07:14:31.175726 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:14:31.175697 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4e2f671_886c_4201_84b2_583bdec6e0b7.slice/crio-f01837316430d50d188a3468bf6c622937a54cdc109d5f84ca4bae966f4f920a WatchSource:0}: Error finding container f01837316430d50d188a3468bf6c622937a54cdc109d5f84ca4bae966f4f920a: Status 404 returned error can't find the container with id f01837316430d50d188a3468bf6c622937a54cdc109d5f84ca4bae966f4f920a
Apr 20 07:14:31.372045 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:31.371960 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" event={"ID":"c4e2f671-886c-4201-84b2-583bdec6e0b7","Type":"ContainerStarted","Data":"f01837316430d50d188a3468bf6c622937a54cdc109d5f84ca4bae966f4f920a"}
Apr 20 07:14:35.260131 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:35.260094 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9bqnr"]
Apr 20 07:14:35.263569 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:35.263551 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9bqnr"
Apr 20 07:14:35.266203 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:35.266180 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 20 07:14:35.266203 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:35.266189 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-h629g\""
Apr 20 07:14:35.275612 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:35.275330 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9bqnr"]
Apr 20 07:14:35.330377 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:35.330332 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf2lw\" (UniqueName: \"kubernetes.io/projected/90691418-d1c6-4312-8fb7-c22dd32b73a8-kube-api-access-jf2lw\") pod \"dns-operator-controller-manager-648d5c98bc-9bqnr\" (UID: \"90691418-d1c6-4312-8fb7-c22dd32b73a8\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9bqnr"
Apr 20 07:14:35.431603 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:35.431560 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jf2lw\" (UniqueName: \"kubernetes.io/projected/90691418-d1c6-4312-8fb7-c22dd32b73a8-kube-api-access-jf2lw\") pod \"dns-operator-controller-manager-648d5c98bc-9bqnr\" (UID: \"90691418-d1c6-4312-8fb7-c22dd32b73a8\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9bqnr"
Apr 20 07:14:35.444055 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:35.444018 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf2lw\" (UniqueName: \"kubernetes.io/projected/90691418-d1c6-4312-8fb7-c22dd32b73a8-kube-api-access-jf2lw\") pod \"dns-operator-controller-manager-648d5c98bc-9bqnr\" (UID: \"90691418-d1c6-4312-8fb7-c22dd32b73a8\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9bqnr"
Apr 20 07:14:35.578385 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:35.578296 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9bqnr"
Apr 20 07:14:36.655120 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:36.655093 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9bqnr"]
Apr 20 07:14:36.658441 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:14:36.658411 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90691418_d1c6_4312_8fb7_c22dd32b73a8.slice/crio-d5e34605d69374bd0898560158b50e01d8ac4d19dafb0496fe6c02c742351f87 WatchSource:0}: Error finding container d5e34605d69374bd0898560158b50e01d8ac4d19dafb0496fe6c02c742351f87: Status 404 returned error can't find the container with id d5e34605d69374bd0898560158b50e01d8ac4d19dafb0496fe6c02c742351f87
Apr 20 07:14:37.403334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:37.403292 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" event={"ID":"c4e2f671-886c-4201-84b2-583bdec6e0b7","Type":"ContainerStarted","Data":"1fa5de976465d98eaa931c79efcd25114f112aef1b1535ed6534c2b88f5321b2"}
Apr 20 07:14:37.403511 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:37.403444 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2"
Apr 20 07:14:37.404838 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:37.404805 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9bqnr"
event={"ID":"90691418-d1c6-4312-8fb7-c22dd32b73a8","Type":"ContainerStarted","Data":"d5e34605d69374bd0898560158b50e01d8ac4d19dafb0496fe6c02c742351f87"} Apr 20 07:14:37.428150 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:37.428089 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" podStartSLOduration=2.022278468 podStartE2EDuration="7.42804775s" podCreationTimestamp="2026-04-20 07:14:30 +0000 UTC" firstStartedPulling="2026-04-20 07:14:31.177971688 +0000 UTC m=+713.821280816" lastFinishedPulling="2026-04-20 07:14:36.583740948 +0000 UTC m=+719.227050098" observedRunningTime="2026-04-20 07:14:37.423891809 +0000 UTC m=+720.067200959" watchObservedRunningTime="2026-04-20 07:14:37.42804775 +0000 UTC m=+720.071356900" Apr 20 07:14:37.879906 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:37.879868 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd"] Apr 20 07:14:37.883918 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:37.883891 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" Apr 20 07:14:37.886872 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:37.886654 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-wjww9\"" Apr 20 07:14:37.896995 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:37.896961 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd"] Apr 20 07:14:37.954598 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:37.954566 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s79tb\" (UniqueName: \"kubernetes.io/projected/53714b8f-3281-41f9-9bac-291f3de9acbe-kube-api-access-s79tb\") pod \"limitador-operator-controller-manager-85c4996f8c-87pwd\" (UID: \"53714b8f-3281-41f9-9bac-291f3de9acbe\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" Apr 20 07:14:38.055620 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:38.055582 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s79tb\" (UniqueName: \"kubernetes.io/projected/53714b8f-3281-41f9-9bac-291f3de9acbe-kube-api-access-s79tb\") pod \"limitador-operator-controller-manager-85c4996f8c-87pwd\" (UID: \"53714b8f-3281-41f9-9bac-291f3de9acbe\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" Apr 20 07:14:38.071342 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:38.071309 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s79tb\" (UniqueName: \"kubernetes.io/projected/53714b8f-3281-41f9-9bac-291f3de9acbe-kube-api-access-s79tb\") pod \"limitador-operator-controller-manager-85c4996f8c-87pwd\" (UID: \"53714b8f-3281-41f9-9bac-291f3de9acbe\") " 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" Apr 20 07:14:38.200078 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:38.199979 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" Apr 20 07:14:38.594932 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:38.594907 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 20 07:14:38.676456 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:38.676363 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd"] Apr 20 07:14:38.677931 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:14:38.677903 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53714b8f_3281_41f9_9bac_291f3de9acbe.slice/crio-3a04930a86cf13b9318531819394f124bdb62d4fa0c3c2d59d67b32a850830d2 WatchSource:0}: Error finding container 3a04930a86cf13b9318531819394f124bdb62d4fa0c3c2d59d67b32a850830d2: Status 404 returned error can't find the container with id 3a04930a86cf13b9318531819394f124bdb62d4fa0c3c2d59d67b32a850830d2 Apr 20 07:14:39.415890 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:39.415852 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" event={"ID":"53714b8f-3281-41f9-9bac-291f3de9acbe","Type":"ContainerStarted","Data":"3a04930a86cf13b9318531819394f124bdb62d4fa0c3c2d59d67b32a850830d2"} Apr 20 07:14:39.417484 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:39.417454 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9bqnr" 
event={"ID":"90691418-d1c6-4312-8fb7-c22dd32b73a8","Type":"ContainerStarted","Data":"bcefe0500319d6c2d6519f72b9ea03e94287244162a1a1466b58a1ee3e030db1"} Apr 20 07:14:39.417614 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:39.417576 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9bqnr" Apr 20 07:14:39.438604 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:39.438552 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9bqnr" podStartSLOduration=2.507232611 podStartE2EDuration="4.43853309s" podCreationTimestamp="2026-04-20 07:14:35 +0000 UTC" firstStartedPulling="2026-04-20 07:14:36.660598284 +0000 UTC m=+719.303907432" lastFinishedPulling="2026-04-20 07:14:38.591898781 +0000 UTC m=+721.235207911" observedRunningTime="2026-04-20 07:14:39.437265461 +0000 UTC m=+722.080574613" watchObservedRunningTime="2026-04-20 07:14:39.43853309 +0000 UTC m=+722.081842239" Apr 20 07:14:40.422574 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:40.422525 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" event={"ID":"53714b8f-3281-41f9-9bac-291f3de9acbe","Type":"ContainerStarted","Data":"85b1d5b3bae8b66cd78151d33588999c0fb6491b7b4671ba7706c9e0673f59df"} Apr 20 07:14:41.426456 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:41.426427 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" Apr 20 07:14:41.446132 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:41.446083 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" podStartSLOduration=2.7717952 podStartE2EDuration="4.446042468s" podCreationTimestamp="2026-04-20 07:14:37 +0000 
UTC" firstStartedPulling="2026-04-20 07:14:38.679875565 +0000 UTC m=+721.323184694" lastFinishedPulling="2026-04-20 07:14:40.35412283 +0000 UTC m=+722.997431962" observedRunningTime="2026-04-20 07:14:41.443528563 +0000 UTC m=+724.086837716" watchObservedRunningTime="2026-04-20 07:14:41.446042468 +0000 UTC m=+724.089351619" Apr 20 07:14:48.412352 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:48.412322 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" Apr 20 07:14:50.323681 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.323641 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2"] Apr 20 07:14:50.324269 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.323952 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" podUID="c4e2f671-886c-4201-84b2-583bdec6e0b7" containerName="manager" containerID="cri-o://1fa5de976465d98eaa931c79efcd25114f112aef1b1535ed6534c2b88f5321b2" gracePeriod=2 Apr 20 07:14:50.326438 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.326397 2566 status_manager.go:895] "Failed to get status for pod" podUID="c4e2f671-886c-4201-84b2-583bdec6e0b7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:50.330119 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.330087 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2"] Apr 20 07:14:50.341127 ip-10-0-142-100 
kubenswrapper[2566]: I0420 07:14:50.341093 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd"] Apr 20 07:14:50.341465 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.341409 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" podUID="53714b8f-3281-41f9-9bac-291f3de9acbe" containerName="manager" containerID="cri-o://85b1d5b3bae8b66cd78151d33588999c0fb6491b7b4671ba7706c9e0673f59df" gracePeriod=2 Apr 20 07:14:50.344572 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.343751 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" Apr 20 07:14:50.344709 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.344650 2566 status_manager.go:895] "Failed to get status for pod" podUID="c4e2f671-886c-4201-84b2-583bdec6e0b7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:50.347008 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.346971 2566 status_manager.go:895] "Failed to get status for pod" podUID="53714b8f-3281-41f9-9bac-291f3de9acbe" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" err="pods \"limitador-operator-controller-manager-85c4996f8c-87pwd\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:50.349448 ip-10-0-142-100 
kubenswrapper[2566]: I0420 07:14:50.349179 2566 status_manager.go:895] "Failed to get status for pod" podUID="c4e2f671-886c-4201-84b2-583bdec6e0b7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:50.350105 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.350044 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd"] Apr 20 07:14:50.351196 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.351168 2566 status_manager.go:895] "Failed to get status for pod" podUID="53714b8f-3281-41f9-9bac-291f3de9acbe" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" err="pods \"limitador-operator-controller-manager-85c4996f8c-87pwd\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:50.352732 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.352713 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8"] Apr 20 07:14:50.353136 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.353122 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4e2f671-886c-4201-84b2-583bdec6e0b7" containerName="manager" Apr 20 07:14:50.353185 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.353138 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e2f671-886c-4201-84b2-583bdec6e0b7" containerName="manager" Apr 20 07:14:50.353185 ip-10-0-142-100 
kubenswrapper[2566]: I0420 07:14:50.353151 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53714b8f-3281-41f9-9bac-291f3de9acbe" containerName="manager" Apr 20 07:14:50.353185 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.353158 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="53714b8f-3281-41f9-9bac-291f3de9acbe" containerName="manager" Apr 20 07:14:50.353276 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.353225 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="53714b8f-3281-41f9-9bac-291f3de9acbe" containerName="manager" Apr 20 07:14:50.353276 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.353237 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4e2f671-886c-4201-84b2-583bdec6e0b7" containerName="manager" Apr 20 07:14:50.356384 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.356366 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8" Apr 20 07:14:50.359385 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.359322 2566 status_manager.go:895] "Failed to get status for pod" podUID="53714b8f-3281-41f9-9bac-291f3de9acbe" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" err="pods \"limitador-operator-controller-manager-85c4996f8c-87pwd\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:50.378491 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.378453 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8"] Apr 20 07:14:50.381093 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.381045 2566 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9zjjc"] Apr 20 07:14:50.384739 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.384714 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9zjjc" Apr 20 07:14:50.399001 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.398949 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9zjjc"] Apr 20 07:14:50.400517 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.400488 2566 status_manager.go:895] "Failed to get status for pod" podUID="c4e2f671-886c-4201-84b2-583bdec6e0b7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:50.405742 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.405712 2566 status_manager.go:895] "Failed to get status for pod" podUID="53714b8f-3281-41f9-9bac-291f3de9acbe" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" err="pods \"limitador-operator-controller-manager-85c4996f8c-87pwd\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:50.424991 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.424963 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9bqnr" Apr 20 07:14:50.451938 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.451904 2566 status_manager.go:895] 
"Failed to get status for pod" podUID="c4e2f671-886c-4201-84b2-583bdec6e0b7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:50.466968 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.466935 2566 generic.go:358] "Generic (PLEG): container finished" podID="53714b8f-3281-41f9-9bac-291f3de9acbe" containerID="85b1d5b3bae8b66cd78151d33588999c0fb6491b7b4671ba7706c9e0673f59df" exitCode=0 Apr 20 07:14:50.469423 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.469388 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzdzf\" (UniqueName: \"kubernetes.io/projected/32d725ec-7e49-471f-a6a2-f982cc9b33a8-kube-api-access-gzdzf\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-b9mw8\" (UID: \"32d725ec-7e49-471f-a6a2-f982cc9b33a8\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8" Apr 20 07:14:50.469653 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.469455 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpm67\" (UniqueName: \"kubernetes.io/projected/0482519c-872b-48e1-bb33-9e50309c461e-kube-api-access-cpm67\") pod \"limitador-operator-controller-manager-85c4996f8c-9zjjc\" (UID: \"0482519c-872b-48e1-bb33-9e50309c461e\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9zjjc" Apr 20 07:14:50.469653 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.469495 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/32d725ec-7e49-471f-a6a2-f982cc9b33a8-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-b9mw8\" (UID: \"32d725ec-7e49-471f-a6a2-f982cc9b33a8\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8" Apr 20 07:14:50.469903 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.469878 2566 generic.go:358] "Generic (PLEG): container finished" podID="c4e2f671-886c-4201-84b2-583bdec6e0b7" containerID="1fa5de976465d98eaa931c79efcd25114f112aef1b1535ed6534c2b88f5321b2" exitCode=0 Apr 20 07:14:50.491689 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.491586 2566 status_manager.go:895] "Failed to get status for pod" podUID="c4e2f671-886c-4201-84b2-583bdec6e0b7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:50.496425 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.496386 2566 status_manager.go:895] "Failed to get status for pod" podUID="53714b8f-3281-41f9-9bac-291f3de9acbe" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" err="pods \"limitador-operator-controller-manager-85c4996f8c-87pwd\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:50.570968 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.570927 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzdzf\" (UniqueName: \"kubernetes.io/projected/32d725ec-7e49-471f-a6a2-f982cc9b33a8-kube-api-access-gzdzf\") pod 
\"kuadrant-operator-controller-manager-5f895dd7d5-b9mw8\" (UID: \"32d725ec-7e49-471f-a6a2-f982cc9b33a8\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8" Apr 20 07:14:50.571206 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.570998 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpm67\" (UniqueName: \"kubernetes.io/projected/0482519c-872b-48e1-bb33-9e50309c461e-kube-api-access-cpm67\") pod \"limitador-operator-controller-manager-85c4996f8c-9zjjc\" (UID: \"0482519c-872b-48e1-bb33-9e50309c461e\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9zjjc" Apr 20 07:14:50.571206 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.571041 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/32d725ec-7e49-471f-a6a2-f982cc9b33a8-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-b9mw8\" (UID: \"32d725ec-7e49-471f-a6a2-f982cc9b33a8\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8" Apr 20 07:14:50.571503 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.571480 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/32d725ec-7e49-471f-a6a2-f982cc9b33a8-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-b9mw8\" (UID: \"32d725ec-7e49-471f-a6a2-f982cc9b33a8\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8" Apr 20 07:14:50.595339 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.595313 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpm67\" (UniqueName: \"kubernetes.io/projected/0482519c-872b-48e1-bb33-9e50309c461e-kube-api-access-cpm67\") pod \"limitador-operator-controller-manager-85c4996f8c-9zjjc\" (UID: 
\"0482519c-872b-48e1-bb33-9e50309c461e\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9zjjc" Apr 20 07:14:50.597189 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.597171 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" Apr 20 07:14:50.600354 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.600325 2566 status_manager.go:895] "Failed to get status for pod" podUID="c4e2f671-886c-4201-84b2-583bdec6e0b7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:50.603012 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.602971 2566 status_manager.go:895] "Failed to get status for pod" podUID="53714b8f-3281-41f9-9bac-291f3de9acbe" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" err="pods \"limitador-operator-controller-manager-85c4996f8c-87pwd\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:50.607737 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.607717 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" Apr 20 07:14:50.612019 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.611986 2566 status_manager.go:895] "Failed to get status for pod" podUID="53714b8f-3281-41f9-9bac-291f3de9acbe" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" err="pods \"limitador-operator-controller-manager-85c4996f8c-87pwd\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:50.612232 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.612215 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzdzf\" (UniqueName: \"kubernetes.io/projected/32d725ec-7e49-471f-a6a2-f982cc9b33a8-kube-api-access-gzdzf\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-b9mw8\" (UID: \"32d725ec-7e49-471f-a6a2-f982cc9b33a8\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8" Apr 20 07:14:50.614721 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.614697 2566 status_manager.go:895] "Failed to get status for pod" podUID="c4e2f671-886c-4201-84b2-583bdec6e0b7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:50.672350 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.672291 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c4e2f671-886c-4201-84b2-583bdec6e0b7-extensions-socket-volume\") 
pod \"c4e2f671-886c-4201-84b2-583bdec6e0b7\" (UID: \"c4e2f671-886c-4201-84b2-583bdec6e0b7\") " Apr 20 07:14:50.672510 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.672393 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qgt7\" (UniqueName: \"kubernetes.io/projected/c4e2f671-886c-4201-84b2-583bdec6e0b7-kube-api-access-9qgt7\") pod \"c4e2f671-886c-4201-84b2-583bdec6e0b7\" (UID: \"c4e2f671-886c-4201-84b2-583bdec6e0b7\") " Apr 20 07:14:50.672657 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.672635 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4e2f671-886c-4201-84b2-583bdec6e0b7-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "c4e2f671-886c-4201-84b2-583bdec6e0b7" (UID: "c4e2f671-886c-4201-84b2-583bdec6e0b7"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:14:50.674879 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.674848 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e2f671-886c-4201-84b2-583bdec6e0b7-kube-api-access-9qgt7" (OuterVolumeSpecName: "kube-api-access-9qgt7") pod "c4e2f671-886c-4201-84b2-583bdec6e0b7" (UID: "c4e2f671-886c-4201-84b2-583bdec6e0b7"). InnerVolumeSpecName "kube-api-access-9qgt7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:14:50.751016 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.750967 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8" Apr 20 07:14:50.758967 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.758934 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9zjjc" Apr 20 07:14:50.774381 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.774343 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s79tb\" (UniqueName: \"kubernetes.io/projected/53714b8f-3281-41f9-9bac-291f3de9acbe-kube-api-access-s79tb\") pod \"53714b8f-3281-41f9-9bac-291f3de9acbe\" (UID: \"53714b8f-3281-41f9-9bac-291f3de9acbe\") " Apr 20 07:14:50.774609 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.774588 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9qgt7\" (UniqueName: \"kubernetes.io/projected/c4e2f671-886c-4201-84b2-583bdec6e0b7-kube-api-access-9qgt7\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:50.774670 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.774613 2566 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c4e2f671-886c-4201-84b2-583bdec6e0b7-extensions-socket-volume\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:50.776560 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.776532 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53714b8f-3281-41f9-9bac-291f3de9acbe-kube-api-access-s79tb" (OuterVolumeSpecName: "kube-api-access-s79tb") pod "53714b8f-3281-41f9-9bac-291f3de9acbe" (UID: "53714b8f-3281-41f9-9bac-291f3de9acbe"). InnerVolumeSpecName "kube-api-access-s79tb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:14:50.875791 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.875757 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s79tb\" (UniqueName: \"kubernetes.io/projected/53714b8f-3281-41f9-9bac-291f3de9acbe-kube-api-access-s79tb\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:50.899921 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.899890 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8"] Apr 20 07:14:50.903201 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:14:50.903171 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32d725ec_7e49_471f_a6a2_f982cc9b33a8.slice/crio-302081bfee4b0fef012ba70ba4387953b594285633232a5c2f0d52fef69ffd49 WatchSource:0}: Error finding container 302081bfee4b0fef012ba70ba4387953b594285633232a5c2f0d52fef69ffd49: Status 404 returned error can't find the container with id 302081bfee4b0fef012ba70ba4387953b594285633232a5c2f0d52fef69ffd49 Apr 20 07:14:50.935240 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:50.935214 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9zjjc"] Apr 20 07:14:50.942293 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:14:50.942252 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0482519c_872b_48e1_bb33_9e50309c461e.slice/crio-75c19ef628cc070e7e9ec5ea8287c28f4565fe525b8bb97ea4cbdbe1a340eba4 WatchSource:0}: Error finding container 75c19ef628cc070e7e9ec5ea8287c28f4565fe525b8bb97ea4cbdbe1a340eba4: Status 404 returned error can't find the container with id 75c19ef628cc070e7e9ec5ea8287c28f4565fe525b8bb97ea4cbdbe1a340eba4 Apr 20 07:14:51.474644 ip-10-0-142-100 kubenswrapper[2566]: I0420 
07:14:51.474605 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" Apr 20 07:14:51.475137 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.474654 2566 scope.go:117] "RemoveContainer" containerID="1fa5de976465d98eaa931c79efcd25114f112aef1b1535ed6534c2b88f5321b2" Apr 20 07:14:51.476226 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.476191 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9zjjc" event={"ID":"0482519c-872b-48e1-bb33-9e50309c461e","Type":"ContainerStarted","Data":"c6bfc0443051be697eee96f99476436de5e0c9b5b7b9b7005cf1db6365db7e96"} Apr 20 07:14:51.476348 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.476235 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9zjjc" event={"ID":"0482519c-872b-48e1-bb33-9e50309c461e","Type":"ContainerStarted","Data":"75c19ef628cc070e7e9ec5ea8287c28f4565fe525b8bb97ea4cbdbe1a340eba4"} Apr 20 07:14:51.476348 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.476324 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9zjjc" Apr 20 07:14:51.477455 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.477428 2566 status_manager.go:895] "Failed to get status for pod" podUID="c4e2f671-886c-4201-84b2-583bdec6e0b7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:51.477983 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.477955 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8" event={"ID":"32d725ec-7e49-471f-a6a2-f982cc9b33a8","Type":"ContainerStarted","Data":"b2d3a2e2ecac50757ea090e160963e551bf3b99a8be5dacdfca61b6389133409"} Apr 20 07:14:51.478118 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.477993 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8" event={"ID":"32d725ec-7e49-471f-a6a2-f982cc9b33a8","Type":"ContainerStarted","Data":"302081bfee4b0fef012ba70ba4387953b594285633232a5c2f0d52fef69ffd49"} Apr 20 07:14:51.478118 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.478047 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8" Apr 20 07:14:51.479366 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.479345 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" Apr 20 07:14:51.479901 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.479875 2566 status_manager.go:895] "Failed to get status for pod" podUID="53714b8f-3281-41f9-9bac-291f3de9acbe" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" err="pods \"limitador-operator-controller-manager-85c4996f8c-87pwd\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:51.482171 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.482139 2566 status_manager.go:895] "Failed to get status for pod" podUID="c4e2f671-886c-4201-84b2-583bdec6e0b7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:51.485154 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.485135 2566 scope.go:117] "RemoveContainer" containerID="85b1d5b3bae8b66cd78151d33588999c0fb6491b7b4671ba7706c9e0673f59df" Apr 20 07:14:51.485681 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.485645 2566 status_manager.go:895] "Failed to get status for pod" podUID="53714b8f-3281-41f9-9bac-291f3de9acbe" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" err="pods \"limitador-operator-controller-manager-85c4996f8c-87pwd\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 
07:14:51.507962 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.507898 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9zjjc" podStartSLOduration=1.5078823300000002 podStartE2EDuration="1.50788233s" podCreationTimestamp="2026-04-20 07:14:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:14:51.50440084 +0000 UTC m=+734.147709990" watchObservedRunningTime="2026-04-20 07:14:51.50788233 +0000 UTC m=+734.151191479" Apr 20 07:14:51.527271 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.527220 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8" podStartSLOduration=1.5272069190000002 podStartE2EDuration="1.527206919s" podCreationTimestamp="2026-04-20 07:14:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:14:51.525207352 +0000 UTC m=+734.168516502" watchObservedRunningTime="2026-04-20 07:14:51.527206919 +0000 UTC m=+734.170516069" Apr 20 07:14:51.528559 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.528525 2566 status_manager.go:895] "Failed to get status for pod" podUID="53714b8f-3281-41f9-9bac-291f3de9acbe" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" err="pods \"limitador-operator-controller-manager-85c4996f8c-87pwd\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:51.530940 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.530910 2566 status_manager.go:895] "Failed to get status for pod" podUID="c4e2f671-886c-4201-84b2-583bdec6e0b7" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:51.846799 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.846757 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv"] Apr 20 07:14:51.851885 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.851856 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv" Apr 20 07:14:51.855687 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.855649 2566 status_manager.go:895] "Failed to get status for pod" podUID="53714b8f-3281-41f9-9bac-291f3de9acbe" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-87pwd" err="pods \"limitador-operator-controller-manager-85c4996f8c-87pwd\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:51.857772 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.857737 2566 status_manager.go:895] "Failed to get status for pod" podUID="c4e2f671-886c-4201-84b2-583bdec6e0b7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kdlh2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-kdlh2\" is forbidden: User \"system:node:ip-10-0-142-100.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-100.ec2.internal' and this object" Apr 20 07:14:51.864100 ip-10-0-142-100 
kubenswrapper[2566]: I0420 07:14:51.864052 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv"] Apr 20 07:14:51.946449 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.946410 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53714b8f-3281-41f9-9bac-291f3de9acbe" path="/var/lib/kubelet/pods/53714b8f-3281-41f9-9bac-291f3de9acbe/volumes" Apr 20 07:14:51.946773 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.946760 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4e2f671-886c-4201-84b2-583bdec6e0b7" path="/var/lib/kubelet/pods/c4e2f671-886c-4201-84b2-583bdec6e0b7/volumes" Apr 20 07:14:51.983866 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.983829 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fff9d4a5-03a9-4201-89c5-48ea308a80f2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6xlcv\" (UID: \"fff9d4a5-03a9-4201-89c5-48ea308a80f2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv" Apr 20 07:14:51.984035 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:51.983967 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp7rn\" (UniqueName: \"kubernetes.io/projected/fff9d4a5-03a9-4201-89c5-48ea308a80f2-kube-api-access-rp7rn\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6xlcv\" (UID: \"fff9d4a5-03a9-4201-89c5-48ea308a80f2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv" Apr 20 07:14:52.085473 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:52.085433 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rp7rn\" (UniqueName: 
\"kubernetes.io/projected/fff9d4a5-03a9-4201-89c5-48ea308a80f2-kube-api-access-rp7rn\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6xlcv\" (UID: \"fff9d4a5-03a9-4201-89c5-48ea308a80f2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv" Apr 20 07:14:52.085657 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:52.085493 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fff9d4a5-03a9-4201-89c5-48ea308a80f2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6xlcv\" (UID: \"fff9d4a5-03a9-4201-89c5-48ea308a80f2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv" Apr 20 07:14:52.085856 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:52.085839 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fff9d4a5-03a9-4201-89c5-48ea308a80f2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6xlcv\" (UID: \"fff9d4a5-03a9-4201-89c5-48ea308a80f2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv" Apr 20 07:14:52.094667 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:52.094625 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp7rn\" (UniqueName: \"kubernetes.io/projected/fff9d4a5-03a9-4201-89c5-48ea308a80f2-kube-api-access-rp7rn\") pod \"kuadrant-operator-controller-manager-55c7f4c975-6xlcv\" (UID: \"fff9d4a5-03a9-4201-89c5-48ea308a80f2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv" Apr 20 07:14:52.162594 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:52.162490 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv" Apr 20 07:14:52.302725 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:52.302691 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv"] Apr 20 07:14:52.304990 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:14:52.304962 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfff9d4a5_03a9_4201_89c5_48ea308a80f2.slice/crio-59e48d82e58544c883ebd65803fd14a832f1abf36d35ca9bc32e5ce6f14d7fd9 WatchSource:0}: Error finding container 59e48d82e58544c883ebd65803fd14a832f1abf36d35ca9bc32e5ce6f14d7fd9: Status 404 returned error can't find the container with id 59e48d82e58544c883ebd65803fd14a832f1abf36d35ca9bc32e5ce6f14d7fd9 Apr 20 07:14:52.485988 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:52.485875 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv" event={"ID":"fff9d4a5-03a9-4201-89c5-48ea308a80f2","Type":"ContainerStarted","Data":"597a19572a908613a5e975bc2dd366dd5c0288c5541d99fad39c326942608eb0"} Apr 20 07:14:52.485988 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:52.485918 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv" event={"ID":"fff9d4a5-03a9-4201-89c5-48ea308a80f2","Type":"ContainerStarted","Data":"59e48d82e58544c883ebd65803fd14a832f1abf36d35ca9bc32e5ce6f14d7fd9"} Apr 20 07:14:52.486526 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:52.485990 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv" Apr 20 07:14:52.506249 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:52.506190 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv" podStartSLOduration=1.506174194 podStartE2EDuration="1.506174194s" podCreationTimestamp="2026-04-20 07:14:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:14:52.50539508 +0000 UTC m=+735.148704230" watchObservedRunningTime="2026-04-20 07:14:52.506174194 +0000 UTC m=+735.149483344" Apr 20 07:14:55.471866 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.471817 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-d8f479686-jmvqh" podUID="43716734-2ee4-4d8d-a59f-7acea7d37be7" containerName="console" containerID="cri-o://b9a423ba6890bb808db37da3c71623780e2503c6a52718b54ef791f538281d11" gracePeriod=15 Apr 20 07:14:55.732620 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.732554 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d8f479686-jmvqh_43716734-2ee4-4d8d-a59f-7acea7d37be7/console/0.log" Apr 20 07:14:55.732747 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.732623 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d8f479686-jmvqh" Apr 20 07:14:55.816964 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.816921 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-oauth-config\") pod \"43716734-2ee4-4d8d-a59f-7acea7d37be7\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " Apr 20 07:14:55.817163 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.816993 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-oauth-serving-cert\") pod \"43716734-2ee4-4d8d-a59f-7acea7d37be7\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " Apr 20 07:14:55.817163 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.817080 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-trusted-ca-bundle\") pod \"43716734-2ee4-4d8d-a59f-7acea7d37be7\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " Apr 20 07:14:55.817163 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.817121 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-service-ca\") pod \"43716734-2ee4-4d8d-a59f-7acea7d37be7\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " Apr 20 07:14:55.817330 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.817174 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-serving-cert\") pod \"43716734-2ee4-4d8d-a59f-7acea7d37be7\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " Apr 20 07:14:55.817330 
ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.817216 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tx6g\" (UniqueName: \"kubernetes.io/projected/43716734-2ee4-4d8d-a59f-7acea7d37be7-kube-api-access-8tx6g\") pod \"43716734-2ee4-4d8d-a59f-7acea7d37be7\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " Apr 20 07:14:55.817330 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.817259 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-config\") pod \"43716734-2ee4-4d8d-a59f-7acea7d37be7\" (UID: \"43716734-2ee4-4d8d-a59f-7acea7d37be7\") " Apr 20 07:14:55.817518 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.817488 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43716734-2ee4-4d8d-a59f-7acea7d37be7" (UID: "43716734-2ee4-4d8d-a59f-7acea7d37be7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:14:55.817580 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.817497 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-service-ca" (OuterVolumeSpecName: "service-ca") pod "43716734-2ee4-4d8d-a59f-7acea7d37be7" (UID: "43716734-2ee4-4d8d-a59f-7acea7d37be7"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:14:55.817580 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.817556 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43716734-2ee4-4d8d-a59f-7acea7d37be7" (UID: "43716734-2ee4-4d8d-a59f-7acea7d37be7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:14:55.818074 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.817978 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-config" (OuterVolumeSpecName: "console-config") pod "43716734-2ee4-4d8d-a59f-7acea7d37be7" (UID: "43716734-2ee4-4d8d-a59f-7acea7d37be7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:14:55.819517 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.819487 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43716734-2ee4-4d8d-a59f-7acea7d37be7" (UID: "43716734-2ee4-4d8d-a59f-7acea7d37be7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:14:55.819607 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.819510 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43716734-2ee4-4d8d-a59f-7acea7d37be7" (UID: "43716734-2ee4-4d8d-a59f-7acea7d37be7"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:14:55.819607 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.819529 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43716734-2ee4-4d8d-a59f-7acea7d37be7-kube-api-access-8tx6g" (OuterVolumeSpecName: "kube-api-access-8tx6g") pod "43716734-2ee4-4d8d-a59f-7acea7d37be7" (UID: "43716734-2ee4-4d8d-a59f-7acea7d37be7"). InnerVolumeSpecName "kube-api-access-8tx6g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:14:55.918391 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.918351 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-trusted-ca-bundle\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:55.918391 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.918385 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-service-ca\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:55.918391 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.918395 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-serving-cert\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:55.918630 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.918404 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8tx6g\" (UniqueName: \"kubernetes.io/projected/43716734-2ee4-4d8d-a59f-7acea7d37be7-kube-api-access-8tx6g\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:55.918630 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.918413 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-config\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:55.918630 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.918423 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43716734-2ee4-4d8d-a59f-7acea7d37be7-console-oauth-config\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:55.918630 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:55.918431 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43716734-2ee4-4d8d-a59f-7acea7d37be7-oauth-serving-cert\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:14:56.505506 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:56.505472 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d8f479686-jmvqh_43716734-2ee4-4d8d-a59f-7acea7d37be7/console/0.log" Apr 20 07:14:56.505904 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:56.505517 2566 generic.go:358] "Generic (PLEG): container finished" podID="43716734-2ee4-4d8d-a59f-7acea7d37be7" containerID="b9a423ba6890bb808db37da3c71623780e2503c6a52718b54ef791f538281d11" exitCode=2 Apr 20 07:14:56.505904 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:56.505555 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d8f479686-jmvqh" event={"ID":"43716734-2ee4-4d8d-a59f-7acea7d37be7","Type":"ContainerDied","Data":"b9a423ba6890bb808db37da3c71623780e2503c6a52718b54ef791f538281d11"} Apr 20 07:14:56.505904 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:56.505586 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d8f479686-jmvqh" Apr 20 07:14:56.505904 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:56.505595 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d8f479686-jmvqh" event={"ID":"43716734-2ee4-4d8d-a59f-7acea7d37be7","Type":"ContainerDied","Data":"fb20eb047760424977c53976d5a20293f61b9cdb6dd91e993e99bcb4ad925ddc"} Apr 20 07:14:56.505904 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:56.505615 2566 scope.go:117] "RemoveContainer" containerID="b9a423ba6890bb808db37da3c71623780e2503c6a52718b54ef791f538281d11" Apr 20 07:14:56.514782 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:56.514759 2566 scope.go:117] "RemoveContainer" containerID="b9a423ba6890bb808db37da3c71623780e2503c6a52718b54ef791f538281d11" Apr 20 07:14:56.515112 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:14:56.515089 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9a423ba6890bb808db37da3c71623780e2503c6a52718b54ef791f538281d11\": container with ID starting with b9a423ba6890bb808db37da3c71623780e2503c6a52718b54ef791f538281d11 not found: ID does not exist" containerID="b9a423ba6890bb808db37da3c71623780e2503c6a52718b54ef791f538281d11" Apr 20 07:14:56.515165 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:56.515124 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9a423ba6890bb808db37da3c71623780e2503c6a52718b54ef791f538281d11"} err="failed to get container status \"b9a423ba6890bb808db37da3c71623780e2503c6a52718b54ef791f538281d11\": rpc error: code = NotFound desc = could not find container \"b9a423ba6890bb808db37da3c71623780e2503c6a52718b54ef791f538281d11\": container with ID starting with b9a423ba6890bb808db37da3c71623780e2503c6a52718b54ef791f538281d11 not found: ID does not exist" Apr 20 07:14:56.524555 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:56.524525 2566 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d8f479686-jmvqh"]
Apr 20 07:14:56.529728 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:56.529704 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-d8f479686-jmvqh"]
Apr 20 07:14:57.940922 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:14:57.940877 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43716734-2ee4-4d8d-a59f-7acea7d37be7" path="/var/lib/kubelet/pods/43716734-2ee4-4d8d-a59f-7acea7d37be7/volumes"
Apr 20 07:15:02.488938 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:02.488893 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9zjjc"
Apr 20 07:15:02.489381 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:02.489120 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8"
Apr 20 07:15:03.493776 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:03.493743 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv"
Apr 20 07:15:03.552095 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:03.550532 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8"]
Apr 20 07:15:03.552095 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:03.550895 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8" podUID="32d725ec-7e49-471f-a6a2-f982cc9b33a8" containerName="manager" containerID="cri-o://b2d3a2e2ecac50757ea090e160963e551bf3b99a8be5dacdfca61b6389133409" gracePeriod=10
Apr 20 07:15:03.798630 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:03.798603 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8"
Apr 20 07:15:03.990696 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:03.990658 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/32d725ec-7e49-471f-a6a2-f982cc9b33a8-extensions-socket-volume\") pod \"32d725ec-7e49-471f-a6a2-f982cc9b33a8\" (UID: \"32d725ec-7e49-471f-a6a2-f982cc9b33a8\") "
Apr 20 07:15:03.990696 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:03.990708 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzdzf\" (UniqueName: \"kubernetes.io/projected/32d725ec-7e49-471f-a6a2-f982cc9b33a8-kube-api-access-gzdzf\") pod \"32d725ec-7e49-471f-a6a2-f982cc9b33a8\" (UID: \"32d725ec-7e49-471f-a6a2-f982cc9b33a8\") "
Apr 20 07:15:03.991140 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:03.991114 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32d725ec-7e49-471f-a6a2-f982cc9b33a8-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "32d725ec-7e49-471f-a6a2-f982cc9b33a8" (UID: "32d725ec-7e49-471f-a6a2-f982cc9b33a8"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:15:03.993077 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:03.993038 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32d725ec-7e49-471f-a6a2-f982cc9b33a8-kube-api-access-gzdzf" (OuterVolumeSpecName: "kube-api-access-gzdzf") pod "32d725ec-7e49-471f-a6a2-f982cc9b33a8" (UID: "32d725ec-7e49-471f-a6a2-f982cc9b33a8"). InnerVolumeSpecName "kube-api-access-gzdzf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:15:04.091753 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:04.091642 2566 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/32d725ec-7e49-471f-a6a2-f982cc9b33a8-extensions-socket-volume\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:15:04.091753 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:04.091681 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gzdzf\" (UniqueName: \"kubernetes.io/projected/32d725ec-7e49-471f-a6a2-f982cc9b33a8-kube-api-access-gzdzf\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\""
Apr 20 07:15:04.548041 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:04.548006 2566 generic.go:358] "Generic (PLEG): container finished" podID="32d725ec-7e49-471f-a6a2-f982cc9b33a8" containerID="b2d3a2e2ecac50757ea090e160963e551bf3b99a8be5dacdfca61b6389133409" exitCode=0
Apr 20 07:15:04.548475 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:04.548084 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8"
Apr 20 07:15:04.548475 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:04.548099 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8" event={"ID":"32d725ec-7e49-471f-a6a2-f982cc9b33a8","Type":"ContainerDied","Data":"b2d3a2e2ecac50757ea090e160963e551bf3b99a8be5dacdfca61b6389133409"}
Apr 20 07:15:04.548475 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:04.548142 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8" event={"ID":"32d725ec-7e49-471f-a6a2-f982cc9b33a8","Type":"ContainerDied","Data":"302081bfee4b0fef012ba70ba4387953b594285633232a5c2f0d52fef69ffd49"}
Apr 20 07:15:04.548475 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:04.548158 2566 scope.go:117] "RemoveContainer" containerID="b2d3a2e2ecac50757ea090e160963e551bf3b99a8be5dacdfca61b6389133409"
Apr 20 07:15:04.557663 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:04.557491 2566 scope.go:117] "RemoveContainer" containerID="b2d3a2e2ecac50757ea090e160963e551bf3b99a8be5dacdfca61b6389133409"
Apr 20 07:15:04.557823 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:15:04.557803 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2d3a2e2ecac50757ea090e160963e551bf3b99a8be5dacdfca61b6389133409\": container with ID starting with b2d3a2e2ecac50757ea090e160963e551bf3b99a8be5dacdfca61b6389133409 not found: ID does not exist" containerID="b2d3a2e2ecac50757ea090e160963e551bf3b99a8be5dacdfca61b6389133409"
Apr 20 07:15:04.557893 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:04.557836 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2d3a2e2ecac50757ea090e160963e551bf3b99a8be5dacdfca61b6389133409"} err="failed to get container status \"b2d3a2e2ecac50757ea090e160963e551bf3b99a8be5dacdfca61b6389133409\": rpc error: code = NotFound desc = could not find container \"b2d3a2e2ecac50757ea090e160963e551bf3b99a8be5dacdfca61b6389133409\": container with ID starting with b2d3a2e2ecac50757ea090e160963e551bf3b99a8be5dacdfca61b6389133409 not found: ID does not exist"
Apr 20 07:15:04.573657 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:04.572523 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8"]
Apr 20 07:15:04.578006 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:04.577970 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b9mw8"]
Apr 20 07:15:05.941219 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:05.941184 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32d725ec-7e49-471f-a6a2-f982cc9b33a8" path="/var/lib/kubelet/pods/32d725ec-7e49-471f-a6a2-f982cc9b33a8/volumes"
Apr 20 07:15:45.330375 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.330337 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-lhn74"]
Apr 20 07:15:45.330869 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.330736 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43716734-2ee4-4d8d-a59f-7acea7d37be7" containerName="console"
Apr 20 07:15:45.330869 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.330749 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="43716734-2ee4-4d8d-a59f-7acea7d37be7" containerName="console"
Apr 20 07:15:45.330869 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.330756 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32d725ec-7e49-471f-a6a2-f982cc9b33a8" containerName="manager"
Apr 20 07:15:45.330869 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.330762 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d725ec-7e49-471f-a6a2-f982cc9b33a8" containerName="manager"
Apr 20 07:15:45.330869 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.330843 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="43716734-2ee4-4d8d-a59f-7acea7d37be7" containerName="console"
Apr 20 07:15:45.330869 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.330853 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="32d725ec-7e49-471f-a6a2-f982cc9b33a8" containerName="manager"
Apr 20 07:15:45.335630 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.335602 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-lhn74"
Apr 20 07:15:45.342569 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.342544 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-665h2\""
Apr 20 07:15:45.342696 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.342675 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 20 07:15:45.354520 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.354485 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-lhn74"]
Apr 20 07:15:45.450725 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.450685 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/34e9cd18-02fc-4d57-aca9-ce9a644d737c-data\") pod \"postgres-868db5846d-lhn74\" (UID: \"34e9cd18-02fc-4d57-aca9-ce9a644d737c\") " pod="opendatahub/postgres-868db5846d-lhn74"
Apr 20 07:15:45.450916 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.450751 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz4fk\" (UniqueName: \"kubernetes.io/projected/34e9cd18-02fc-4d57-aca9-ce9a644d737c-kube-api-access-qz4fk\") pod \"postgres-868db5846d-lhn74\" (UID: \"34e9cd18-02fc-4d57-aca9-ce9a644d737c\") " pod="opendatahub/postgres-868db5846d-lhn74"
Apr 20 07:15:45.551455 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.551418 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/34e9cd18-02fc-4d57-aca9-ce9a644d737c-data\") pod \"postgres-868db5846d-lhn74\" (UID: \"34e9cd18-02fc-4d57-aca9-ce9a644d737c\") " pod="opendatahub/postgres-868db5846d-lhn74"
Apr 20 07:15:45.551709 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.551470 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qz4fk\" (UniqueName: \"kubernetes.io/projected/34e9cd18-02fc-4d57-aca9-ce9a644d737c-kube-api-access-qz4fk\") pod \"postgres-868db5846d-lhn74\" (UID: \"34e9cd18-02fc-4d57-aca9-ce9a644d737c\") " pod="opendatahub/postgres-868db5846d-lhn74"
Apr 20 07:15:45.551828 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.551805 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/34e9cd18-02fc-4d57-aca9-ce9a644d737c-data\") pod \"postgres-868db5846d-lhn74\" (UID: \"34e9cd18-02fc-4d57-aca9-ce9a644d737c\") " pod="opendatahub/postgres-868db5846d-lhn74"
Apr 20 07:15:45.560445 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.560409 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz4fk\" (UniqueName: \"kubernetes.io/projected/34e9cd18-02fc-4d57-aca9-ce9a644d737c-kube-api-access-qz4fk\") pod \"postgres-868db5846d-lhn74\" (UID: \"34e9cd18-02fc-4d57-aca9-ce9a644d737c\") " pod="opendatahub/postgres-868db5846d-lhn74"
Apr 20 07:15:45.646423 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.646334 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-lhn74"
Apr 20 07:15:45.781378 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:45.781352 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-lhn74"]
Apr 20 07:15:45.783954 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:15:45.783917 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34e9cd18_02fc_4d57_aca9_ce9a644d737c.slice/crio-3eee5580037c8d7e2fbad94dc0981e4d6cffe56717cdcb1dae6cdf379fcb8db5 WatchSource:0}: Error finding container 3eee5580037c8d7e2fbad94dc0981e4d6cffe56717cdcb1dae6cdf379fcb8db5: Status 404 returned error can't find the container with id 3eee5580037c8d7e2fbad94dc0981e4d6cffe56717cdcb1dae6cdf379fcb8db5
Apr 20 07:15:46.736764 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:46.736722 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-lhn74" event={"ID":"34e9cd18-02fc-4d57-aca9-ce9a644d737c","Type":"ContainerStarted","Data":"3eee5580037c8d7e2fbad94dc0981e4d6cffe56717cdcb1dae6cdf379fcb8db5"}
Apr 20 07:15:51.764223 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:51.764180 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-lhn74" event={"ID":"34e9cd18-02fc-4d57-aca9-ce9a644d737c","Type":"ContainerStarted","Data":"f1b3155b599e27792604d04b22597e8c1d72a98e3450740b633113ec3a6b7e3d"}
Apr 20 07:15:51.764624 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:51.764325 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-lhn74"
Apr 20 07:15:51.781552 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:51.781506 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-lhn74" podStartSLOduration=1.419773379 podStartE2EDuration="6.781493392s" podCreationTimestamp="2026-04-20 07:15:45 +0000 UTC" firstStartedPulling="2026-04-20 07:15:45.785120577 +0000 UTC m=+788.428429705" lastFinishedPulling="2026-04-20 07:15:51.146840589 +0000 UTC m=+793.790149718" observedRunningTime="2026-04-20 07:15:51.779646037 +0000 UTC m=+794.422955216" watchObservedRunningTime="2026-04-20 07:15:51.781493392 +0000 UTC m=+794.424802542"
Apr 20 07:15:57.798697 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:15:57.798614 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-lhn74"
Apr 20 07:16:16.296391 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:16.296352 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7b46859976-28n96"]
Apr 20 07:16:16.299877 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:16.299856 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7b46859976-28n96"
Apr 20 07:16:16.302541 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:16.302516 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-hrfk2\""
Apr 20 07:16:16.307459 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:16.307222 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7b46859976-28n96"]
Apr 20 07:16:16.321835 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:16.321813 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz458\" (UniqueName: \"kubernetes.io/projected/1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6-kube-api-access-tz458\") pod \"maas-controller-7b46859976-28n96\" (UID: \"1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6\") " pod="opendatahub/maas-controller-7b46859976-28n96"
Apr 20 07:16:16.423035 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:16.423003 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tz458\" (UniqueName: \"kubernetes.io/projected/1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6-kube-api-access-tz458\") pod \"maas-controller-7b46859976-28n96\" (UID: \"1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6\") " pod="opendatahub/maas-controller-7b46859976-28n96"
Apr 20 07:16:16.432101 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:16.432055 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz458\" (UniqueName: \"kubernetes.io/projected/1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6-kube-api-access-tz458\") pod \"maas-controller-7b46859976-28n96\" (UID: \"1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6\") " pod="opendatahub/maas-controller-7b46859976-28n96"
Apr 20 07:16:16.611509 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:16.611442 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7b46859976-28n96"
Apr 20 07:16:16.736194 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:16.736170 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7b46859976-28n96"]
Apr 20 07:16:16.739923 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:16:16.739885 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dbdd6d1_de03_4b88_bdb6_7e9f47492ed6.slice/crio-a09d84c648b0877ff4431dae11e2b635142145afb9ab25eca83cb242ebf04935 WatchSource:0}: Error finding container a09d84c648b0877ff4431dae11e2b635142145afb9ab25eca83cb242ebf04935: Status 404 returned error can't find the container with id a09d84c648b0877ff4431dae11e2b635142145afb9ab25eca83cb242ebf04935
Apr 20 07:16:16.865040 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:16.864965 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b46859976-28n96" event={"ID":"1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6","Type":"ContainerStarted","Data":"a09d84c648b0877ff4431dae11e2b635142145afb9ab25eca83cb242ebf04935"}
Apr 20 07:16:19.878842 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:19.878798 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b46859976-28n96" event={"ID":"1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6","Type":"ContainerStarted","Data":"0458285b5553fdb05e7aa0edea898e694969a92bf4ced414a6705ba3c0912edb"}
Apr 20 07:16:19.879227 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:19.878872 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7b46859976-28n96"
Apr 20 07:16:19.897337 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:19.897289 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7b46859976-28n96" podStartSLOduration=1.6678478380000001 podStartE2EDuration="3.897276411s" podCreationTimestamp="2026-04-20 07:16:16 +0000 UTC" firstStartedPulling="2026-04-20 07:16:16.742903491 +0000 UTC m=+819.386212618" lastFinishedPulling="2026-04-20 07:16:18.97233206 +0000 UTC m=+821.615641191" observedRunningTime="2026-04-20 07:16:19.895229219 +0000 UTC m=+822.538538370" watchObservedRunningTime="2026-04-20 07:16:19.897276411 +0000 UTC m=+822.540585586"
Apr 20 07:16:30.889570 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:30.889538 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7b46859976-28n96"
Apr 20 07:16:39.767429 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.767378 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"]
Apr 20 07:16:39.771074 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.771045 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.773930 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.773901 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 20 07:16:39.773930 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.773930 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 20 07:16:39.775018 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.775000 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\""
Apr 20 07:16:39.775018 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.775018 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-w5vjj\""
Apr 20 07:16:39.780496 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.780472 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"]
Apr 20 07:16:39.837491 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.837459 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/18a24103-ab01-4efa-b963-2e01030bf167-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.837744 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.837716 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/18a24103-ab01-4efa-b963-2e01030bf167-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.837864 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.837757 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18a24103-ab01-4efa-b963-2e01030bf167-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.837864 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.837779 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/18a24103-ab01-4efa-b963-2e01030bf167-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.837985 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.837902 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/18a24103-ab01-4efa-b963-2e01030bf167-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.838043 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.837996 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2r8q\" (UniqueName: \"kubernetes.io/projected/18a24103-ab01-4efa-b963-2e01030bf167-kube-api-access-v2r8q\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.938875 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.938842 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/18a24103-ab01-4efa-b963-2e01030bf167-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.938875 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.938879 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/18a24103-ab01-4efa-b963-2e01030bf167-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.939146 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.938896 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18a24103-ab01-4efa-b963-2e01030bf167-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.939146 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.938916 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/18a24103-ab01-4efa-b963-2e01030bf167-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.939146 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.938944 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/18a24103-ab01-4efa-b963-2e01030bf167-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.939146 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.938988 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2r8q\" (UniqueName: \"kubernetes.io/projected/18a24103-ab01-4efa-b963-2e01030bf167-kube-api-access-v2r8q\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.939380 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.939317 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/18a24103-ab01-4efa-b963-2e01030bf167-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.939461 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.939436 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18a24103-ab01-4efa-b963-2e01030bf167-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.939684 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.939658 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/18a24103-ab01-4efa-b963-2e01030bf167-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.941711 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.941682 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/18a24103-ab01-4efa-b963-2e01030bf167-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.942039 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.942016 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/18a24103-ab01-4efa-b963-2e01030bf167-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:39.953579 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:39.953550 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2r8q\" (UniqueName: \"kubernetes.io/projected/18a24103-ab01-4efa-b963-2e01030bf167-kube-api-access-v2r8q\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-fm7p5\" (UID: \"18a24103-ab01-4efa-b963-2e01030bf167\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:40.083841 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:40.083756 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:40.226388 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:40.226342 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"]
Apr 20 07:16:40.226804 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:16:40.226774 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18a24103_ab01_4efa_b963_2e01030bf167.slice/crio-469a31a16b88fe8be8f106edd7aaa6b862dcfba5591103e81897179338ba3d73 WatchSource:0}: Error finding container 469a31a16b88fe8be8f106edd7aaa6b862dcfba5591103e81897179338ba3d73: Status 404 returned error can't find the container with id 469a31a16b88fe8be8f106edd7aaa6b862dcfba5591103e81897179338ba3d73
Apr 20 07:16:40.966692 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:40.966646 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5" event={"ID":"18a24103-ab01-4efa-b963-2e01030bf167","Type":"ContainerStarted","Data":"469a31a16b88fe8be8f106edd7aaa6b862dcfba5591103e81897179338ba3d73"}
Apr 20 07:16:47.999030 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:47.998988 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5" event={"ID":"18a24103-ab01-4efa-b963-2e01030bf167","Type":"ContainerStarted","Data":"69e94c833f4d622daad9bbf96ebed5f4fb515700665381718fab4c71539c71c8"}
Apr 20 07:16:53.020812 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:53.020777 2566 generic.go:358] "Generic (PLEG): container finished" podID="18a24103-ab01-4efa-b963-2e01030bf167" containerID="69e94c833f4d622daad9bbf96ebed5f4fb515700665381718fab4c71539c71c8" exitCode=0
Apr 20 07:16:53.021325 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:53.021192 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5" event={"ID":"18a24103-ab01-4efa-b963-2e01030bf167","Type":"ContainerDied","Data":"69e94c833f4d622daad9bbf96ebed5f4fb515700665381718fab4c71539c71c8"}
Apr 20 07:16:55.033638 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:55.033598 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5" event={"ID":"18a24103-ab01-4efa-b963-2e01030bf167","Type":"ContainerStarted","Data":"960d0b9c64babba7516c18cb8e90f2fe91c5d1cb5bb72f40cf2206e0b5f368f9"}
Apr 20 07:16:55.034008 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:55.033821 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:16:55.056143 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:16:55.056096 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5" podStartSLOduration=2.053769397 podStartE2EDuration="16.056083241s" podCreationTimestamp="2026-04-20 07:16:39 +0000 UTC" firstStartedPulling="2026-04-20 07:16:40.228553179 +0000 UTC m=+842.871862310" lastFinishedPulling="2026-04-20 07:16:54.230867014 +0000 UTC m=+856.874176154" observedRunningTime="2026-04-20 07:16:55.053869304 +0000 UTC m=+857.697178456" watchObservedRunningTime="2026-04-20 07:16:55.056083241 +0000 UTC m=+857.699392392"
Apr 20 07:17:06.055080 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:06.055028 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-fm7p5"
Apr 20 07:17:32.465119 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.465001 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn"]
Apr 20 07:17:32.468783 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.468760 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn"
Apr 20 07:17:32.471687 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.471667 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\""
Apr 20 07:17:32.480261 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.480238 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn"]
Apr 20 07:17:32.583665 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.583630 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ce74456-ea78-40a3-be3e-7d31950c4d5d-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn"
Apr 20 07:17:32.583665 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.583666 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ce74456-ea78-40a3-be3e-7d31950c4d5d-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn"
Apr 20 07:17:32.583874 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.583692 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9bsr\" (UniqueName: \"kubernetes.io/projected/1ce74456-ea78-40a3-be3e-7d31950c4d5d-kube-api-access-s9bsr\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn"
Apr 20 07:17:32.583874 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.583799 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce74456-ea78-40a3-be3e-7d31950c4d5d-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn"
Apr 20 07:17:32.583874 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.583844 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ce74456-ea78-40a3-be3e-7d31950c4d5d-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn"
Apr 20 07:17:32.583983 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.583882 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ce74456-ea78-40a3-be3e-7d31950c4d5d-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn"
Apr 20 07:17:32.684293 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.684257 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ce74456-ea78-40a3-be3e-7d31950c4d5d-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn"
Apr 20 07:17:32.684293 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.684299 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ce74456-ea78-40a3-be3e-7d31950c4d5d-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn"
Apr 20 07:17:32.684515 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.684342 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ce74456-ea78-40a3-be3e-7d31950c4d5d-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn"
Apr 20 07:17:32.684515 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.684365 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ce74456-ea78-40a3-be3e-7d31950c4d5d-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn"
Apr 20 07:17:32.684515 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.684396 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9bsr\" (UniqueName: \"kubernetes.io/projected/1ce74456-ea78-40a3-be3e-7d31950c4d5d-kube-api-access-s9bsr\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn"
Apr 20 07:17:32.684659 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.684551 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce74456-ea78-40a3-be3e-7d31950c4d5d-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn"
Apr 20 07:17:32.684784 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.684765 2566 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ce74456-ea78-40a3-be3e-7d31950c4d5d-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn" Apr 20 07:17:32.684863 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.684804 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ce74456-ea78-40a3-be3e-7d31950c4d5d-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn" Apr 20 07:17:32.684863 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.684848 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ce74456-ea78-40a3-be3e-7d31950c4d5d-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn" Apr 20 07:17:32.686696 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.686668 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ce74456-ea78-40a3-be3e-7d31950c4d5d-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn" Apr 20 07:17:32.687088 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.687050 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce74456-ea78-40a3-be3e-7d31950c4d5d-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn" 
Apr 20 07:17:32.694334 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.694306 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9bsr\" (UniqueName: \"kubernetes.io/projected/1ce74456-ea78-40a3-be3e-7d31950c4d5d-kube-api-access-s9bsr\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn\" (UID: \"1ce74456-ea78-40a3-be3e-7d31950c4d5d\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn" Apr 20 07:17:32.781212 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.781183 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn" Apr 20 07:17:32.915677 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:32.915467 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn"] Apr 20 07:17:32.918764 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:17:32.918729 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ce74456_ea78_40a3_be3e_7d31950c4d5d.slice/crio-8f45613d72b1d4456a61ed53bdad3c7e07197d1bc6427bb6d0052c3cdb74a0e5 WatchSource:0}: Error finding container 8f45613d72b1d4456a61ed53bdad3c7e07197d1bc6427bb6d0052c3cdb74a0e5: Status 404 returned error can't find the container with id 8f45613d72b1d4456a61ed53bdad3c7e07197d1bc6427bb6d0052c3cdb74a0e5 Apr 20 07:17:33.196036 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:33.195951 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn" event={"ID":"1ce74456-ea78-40a3-be3e-7d31950c4d5d","Type":"ContainerStarted","Data":"2824df5f7847678aded42de56b204c745674ad1e06976a6c5260a7d085562324"} Apr 20 07:17:33.196036 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:33.195991 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn" 
event={"ID":"1ce74456-ea78-40a3-be3e-7d31950c4d5d","Type":"ContainerStarted","Data":"8f45613d72b1d4456a61ed53bdad3c7e07197d1bc6427bb6d0052c3cdb74a0e5"} Apr 20 07:17:37.945218 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:37.945186 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log" Apr 20 07:17:37.946833 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:37.946803 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log" Apr 20 07:17:37.949354 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:37.949325 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log" Apr 20 07:17:37.950835 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:37.950815 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log" Apr 20 07:17:39.064516 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.064484 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd"] Apr 20 07:17:39.068387 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.068360 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.070994 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.070973 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 20 07:17:39.077640 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.077616 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd"] Apr 20 07:17:39.142121 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.142079 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.142297 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.142152 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.142297 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.142202 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.142297 
ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.142249 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.142422 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.142312 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwnw4\" (UniqueName: \"kubernetes.io/projected/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-kube-api-access-wwnw4\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.142422 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.142360 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.236837 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.236799 2566 generic.go:358] "Generic (PLEG): container finished" podID="1ce74456-ea78-40a3-be3e-7d31950c4d5d" containerID="2824df5f7847678aded42de56b204c745674ad1e06976a6c5260a7d085562324" exitCode=0 Apr 20 07:17:39.237012 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.236859 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn" 
event={"ID":"1ce74456-ea78-40a3-be3e-7d31950c4d5d","Type":"ContainerDied","Data":"2824df5f7847678aded42de56b204c745674ad1e06976a6c5260a7d085562324"} Apr 20 07:17:39.243602 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.243581 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.243700 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.243617 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.243700 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.243646 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.243839 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.243825 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 
07:17:39.243886 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.243850 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwnw4\" (UniqueName: \"kubernetes.io/projected/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-kube-api-access-wwnw4\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.243886 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.243878 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.244046 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.243978 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.244046 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.244029 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.244250 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.244222 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.246055 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.246037 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.246517 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.246501 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.251315 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.251294 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwnw4\" (UniqueName: \"kubernetes.io/projected/8dfe2d7b-ee08-43a6-83fe-d9b4f674b185-kube-api-access-wwnw4\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd\" (UID: \"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.380033 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.379962 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:39.526933 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:39.526897 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd"] Apr 20 07:17:39.531889 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:17:39.531863 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dfe2d7b_ee08_43a6_83fe_d9b4f674b185.slice/crio-f03d3bfb96dd34630fbc7c68478318320c09f638bcbbf09aab5df5e2a19a0dae WatchSource:0}: Error finding container f03d3bfb96dd34630fbc7c68478318320c09f638bcbbf09aab5df5e2a19a0dae: Status 404 returned error can't find the container with id f03d3bfb96dd34630fbc7c68478318320c09f638bcbbf09aab5df5e2a19a0dae Apr 20 07:17:40.242951 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:40.242916 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" event={"ID":"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185","Type":"ContainerStarted","Data":"cba698ca293ae50aeeb1dc97185e9cd4df74000b0f20764c28c56722e40483fa"} Apr 20 07:17:40.242951 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:40.242956 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" event={"ID":"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185","Type":"ContainerStarted","Data":"f03d3bfb96dd34630fbc7c68478318320c09f638bcbbf09aab5df5e2a19a0dae"} Apr 20 07:17:40.245180 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:40.245155 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn" event={"ID":"1ce74456-ea78-40a3-be3e-7d31950c4d5d","Type":"ContainerStarted","Data":"a253402161935a1cbe6bff2978323eaca9b69355745d4687777f978ddfce0e9b"} Apr 20 07:17:40.245355 ip-10-0-142-100 kubenswrapper[2566]: I0420 
07:17:40.245338 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn" Apr 20 07:17:40.288485 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:40.288431 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn" podStartSLOduration=8.053452263 podStartE2EDuration="8.288414006s" podCreationTimestamp="2026-04-20 07:17:32 +0000 UTC" firstStartedPulling="2026-04-20 07:17:39.237694745 +0000 UTC m=+901.881003872" lastFinishedPulling="2026-04-20 07:17:39.47265647 +0000 UTC m=+902.115965615" observedRunningTime="2026-04-20 07:17:40.287201191 +0000 UTC m=+902.930510367" watchObservedRunningTime="2026-04-20 07:17:40.288414006 +0000 UTC m=+902.931723156" Apr 20 07:17:49.287565 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:49.287526 2566 generic.go:358] "Generic (PLEG): container finished" podID="8dfe2d7b-ee08-43a6-83fe-d9b4f674b185" containerID="cba698ca293ae50aeeb1dc97185e9cd4df74000b0f20764c28c56722e40483fa" exitCode=0 Apr 20 07:17:49.288002 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:49.287601 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" event={"ID":"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185","Type":"ContainerDied","Data":"cba698ca293ae50aeeb1dc97185e9cd4df74000b0f20764c28c56722e40483fa"} Apr 20 07:17:50.293958 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:50.293918 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" event={"ID":"8dfe2d7b-ee08-43a6-83fe-d9b4f674b185","Type":"ContainerStarted","Data":"560c11dd632f77a8e62a71c3294e42b5a5ef264d17dd4a5d3e2d48584fdaa697"} Apr 20 07:17:50.294352 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:50.294114 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:17:50.318271 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:50.318209 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" podStartSLOduration=11.085891578 podStartE2EDuration="11.318192177s" podCreationTimestamp="2026-04-20 07:17:39 +0000 UTC" firstStartedPulling="2026-04-20 07:17:49.288270344 +0000 UTC m=+911.931579472" lastFinishedPulling="2026-04-20 07:17:49.520570937 +0000 UTC m=+912.163880071" observedRunningTime="2026-04-20 07:17:50.314463993 +0000 UTC m=+912.957773166" watchObservedRunningTime="2026-04-20 07:17:50.318192177 +0000 UTC m=+912.961501328" Apr 20 07:17:51.263928 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:17:51.263891 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn" Apr 20 07:18:01.312815 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:18:01.312776 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd" Apr 20 07:19:39.698699 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:39.698659 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7b46859976-28n96"] Apr 20 07:19:39.701196 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:39.698919 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-7b46859976-28n96" podUID="1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6" containerName="manager" containerID="cri-o://0458285b5553fdb05e7aa0edea898e694969a92bf4ced414a6705ba3c0912edb" gracePeriod=10 Apr 20 07:19:39.960549 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:39.960485 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7b46859976-28n96" Apr 20 07:19:40.020812 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:40.020774 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz458\" (UniqueName: \"kubernetes.io/projected/1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6-kube-api-access-tz458\") pod \"1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6\" (UID: \"1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6\") " Apr 20 07:19:40.023178 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:40.023135 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6-kube-api-access-tz458" (OuterVolumeSpecName: "kube-api-access-tz458") pod "1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6" (UID: "1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6"). InnerVolumeSpecName "kube-api-access-tz458". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:19:40.122038 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:40.122003 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tz458\" (UniqueName: \"kubernetes.io/projected/1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6-kube-api-access-tz458\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:19:40.767686 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:40.767640 2566 generic.go:358] "Generic (PLEG): container finished" podID="1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6" containerID="0458285b5553fdb05e7aa0edea898e694969a92bf4ced414a6705ba3c0912edb" exitCode=0 Apr 20 07:19:40.768169 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:40.767727 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7b46859976-28n96" Apr 20 07:19:40.768169 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:40.767731 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b46859976-28n96" event={"ID":"1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6","Type":"ContainerDied","Data":"0458285b5553fdb05e7aa0edea898e694969a92bf4ced414a6705ba3c0912edb"} Apr 20 07:19:40.768169 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:40.767771 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b46859976-28n96" event={"ID":"1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6","Type":"ContainerDied","Data":"a09d84c648b0877ff4431dae11e2b635142145afb9ab25eca83cb242ebf04935"} Apr 20 07:19:40.768169 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:40.767787 2566 scope.go:117] "RemoveContainer" containerID="0458285b5553fdb05e7aa0edea898e694969a92bf4ced414a6705ba3c0912edb" Apr 20 07:19:40.778124 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:40.778103 2566 scope.go:117] "RemoveContainer" containerID="0458285b5553fdb05e7aa0edea898e694969a92bf4ced414a6705ba3c0912edb" Apr 20 07:19:40.778386 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:19:40.778364 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0458285b5553fdb05e7aa0edea898e694969a92bf4ced414a6705ba3c0912edb\": container with ID starting with 0458285b5553fdb05e7aa0edea898e694969a92bf4ced414a6705ba3c0912edb not found: ID does not exist" containerID="0458285b5553fdb05e7aa0edea898e694969a92bf4ced414a6705ba3c0912edb" Apr 20 07:19:40.778436 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:40.778395 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0458285b5553fdb05e7aa0edea898e694969a92bf4ced414a6705ba3c0912edb"} err="failed to get container status \"0458285b5553fdb05e7aa0edea898e694969a92bf4ced414a6705ba3c0912edb\": rpc error: 
code = NotFound desc = could not find container \"0458285b5553fdb05e7aa0edea898e694969a92bf4ced414a6705ba3c0912edb\": container with ID starting with 0458285b5553fdb05e7aa0edea898e694969a92bf4ced414a6705ba3c0912edb not found: ID does not exist" Apr 20 07:19:40.792697 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:40.792667 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7b46859976-28n96"] Apr 20 07:19:40.798570 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:40.798547 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-7b46859976-28n96"] Apr 20 07:19:41.518484 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:41.518437 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7b46859976-xmcdb"] Apr 20 07:19:41.519012 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:41.518993 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6" containerName="manager" Apr 20 07:19:41.519129 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:41.519015 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6" containerName="manager" Apr 20 07:19:41.519193 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:41.519178 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6" containerName="manager" Apr 20 07:19:41.523969 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:41.523946 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7b46859976-xmcdb" Apr 20 07:19:41.526835 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:41.526812 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-hrfk2\"" Apr 20 07:19:41.528229 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:41.528207 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7b46859976-xmcdb"] Apr 20 07:19:41.634596 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:41.634562 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbnl7\" (UniqueName: \"kubernetes.io/projected/a03cae53-2156-4dc1-92e5-7fe1452491d9-kube-api-access-hbnl7\") pod \"maas-controller-7b46859976-xmcdb\" (UID: \"a03cae53-2156-4dc1-92e5-7fe1452491d9\") " pod="opendatahub/maas-controller-7b46859976-xmcdb" Apr 20 07:19:41.735476 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:41.735437 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbnl7\" (UniqueName: \"kubernetes.io/projected/a03cae53-2156-4dc1-92e5-7fe1452491d9-kube-api-access-hbnl7\") pod \"maas-controller-7b46859976-xmcdb\" (UID: \"a03cae53-2156-4dc1-92e5-7fe1452491d9\") " pod="opendatahub/maas-controller-7b46859976-xmcdb" Apr 20 07:19:41.744822 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:41.744789 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbnl7\" (UniqueName: \"kubernetes.io/projected/a03cae53-2156-4dc1-92e5-7fe1452491d9-kube-api-access-hbnl7\") pod \"maas-controller-7b46859976-xmcdb\" (UID: \"a03cae53-2156-4dc1-92e5-7fe1452491d9\") " pod="opendatahub/maas-controller-7b46859976-xmcdb" Apr 20 07:19:41.836230 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:41.836135 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7b46859976-xmcdb" Apr 20 07:19:41.940940 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:41.940907 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6" path="/var/lib/kubelet/pods/1dbdd6d1-de03-4b88-bdb6-7e9f47492ed6/volumes" Apr 20 07:19:41.976161 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:41.976136 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7b46859976-xmcdb"] Apr 20 07:19:41.978272 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:19:41.978243 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda03cae53_2156_4dc1_92e5_7fe1452491d9.slice/crio-545bc80260da2b00066f26ad710af0857c9fdeed0b7a68da509387c2313ddad9 WatchSource:0}: Error finding container 545bc80260da2b00066f26ad710af0857c9fdeed0b7a68da509387c2313ddad9: Status 404 returned error can't find the container with id 545bc80260da2b00066f26ad710af0857c9fdeed0b7a68da509387c2313ddad9 Apr 20 07:19:41.979521 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:41.979506 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 07:19:42.778309 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:42.778270 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b46859976-xmcdb" event={"ID":"a03cae53-2156-4dc1-92e5-7fe1452491d9","Type":"ContainerStarted","Data":"489ffc29632a53e15eb0ac9de6a41295df72bc4c3ca0522e0876a39646f3a72b"} Apr 20 07:19:42.778309 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:42.778311 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b46859976-xmcdb" event={"ID":"a03cae53-2156-4dc1-92e5-7fe1452491d9","Type":"ContainerStarted","Data":"545bc80260da2b00066f26ad710af0857c9fdeed0b7a68da509387c2313ddad9"} Apr 20 07:19:42.778528 ip-10-0-142-100 
kubenswrapper[2566]: I0420 07:19:42.778411 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7b46859976-xmcdb" Apr 20 07:19:42.798779 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:42.798727 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7b46859976-xmcdb" podStartSLOduration=1.390236755 podStartE2EDuration="1.798711905s" podCreationTimestamp="2026-04-20 07:19:41 +0000 UTC" firstStartedPulling="2026-04-20 07:19:41.979628531 +0000 UTC m=+1024.622937658" lastFinishedPulling="2026-04-20 07:19:42.388103677 +0000 UTC m=+1025.031412808" observedRunningTime="2026-04-20 07:19:42.795417273 +0000 UTC m=+1025.438726424" watchObservedRunningTime="2026-04-20 07:19:42.798711905 +0000 UTC m=+1025.442021096" Apr 20 07:19:53.788242 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:19:53.788207 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7b46859976-xmcdb" Apr 20 07:22:37.979846 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:22:37.979814 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log" Apr 20 07:22:37.982946 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:22:37.982923 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log" Apr 20 07:22:37.983544 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:22:37.983527 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log" Apr 20 07:22:37.986618 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:22:37.986600 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log" Apr 20 07:27:38.011646 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:27:38.011613 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log" Apr 20 07:27:38.015392 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:27:38.015368 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log" Apr 20 07:27:38.016989 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:27:38.016970 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log" Apr 20 07:27:38.021192 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:27:38.021174 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log" Apr 20 07:29:51.133265 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:51.133178 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv"] Apr 20 07:29:51.133791 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:51.133419 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv" podUID="fff9d4a5-03a9-4201-89c5-48ea308a80f2" containerName="manager" containerID="cri-o://597a19572a908613a5e975bc2dd366dd5c0288c5541d99fad39c326942608eb0" gracePeriod=10 Apr 20 07:29:51.583445 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:51.583421 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv" Apr 20 07:29:51.671443 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:51.671412 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fff9d4a5-03a9-4201-89c5-48ea308a80f2-extensions-socket-volume\") pod \"fff9d4a5-03a9-4201-89c5-48ea308a80f2\" (UID: \"fff9d4a5-03a9-4201-89c5-48ea308a80f2\") " Apr 20 07:29:51.671615 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:51.671471 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp7rn\" (UniqueName: \"kubernetes.io/projected/fff9d4a5-03a9-4201-89c5-48ea308a80f2-kube-api-access-rp7rn\") pod \"fff9d4a5-03a9-4201-89c5-48ea308a80f2\" (UID: \"fff9d4a5-03a9-4201-89c5-48ea308a80f2\") " Apr 20 07:29:51.671784 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:51.671761 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fff9d4a5-03a9-4201-89c5-48ea308a80f2-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "fff9d4a5-03a9-4201-89c5-48ea308a80f2" (UID: "fff9d4a5-03a9-4201-89c5-48ea308a80f2"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:29:51.673677 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:51.673656 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff9d4a5-03a9-4201-89c5-48ea308a80f2-kube-api-access-rp7rn" (OuterVolumeSpecName: "kube-api-access-rp7rn") pod "fff9d4a5-03a9-4201-89c5-48ea308a80f2" (UID: "fff9d4a5-03a9-4201-89c5-48ea308a80f2"). InnerVolumeSpecName "kube-api-access-rp7rn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:29:51.772358 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:51.772295 2566 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fff9d4a5-03a9-4201-89c5-48ea308a80f2-extensions-socket-volume\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:29:51.772358 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:51.772329 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rp7rn\" (UniqueName: \"kubernetes.io/projected/fff9d4a5-03a9-4201-89c5-48ea308a80f2-kube-api-access-rp7rn\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:29:52.349705 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:52.349674 2566 generic.go:358] "Generic (PLEG): container finished" podID="fff9d4a5-03a9-4201-89c5-48ea308a80f2" containerID="597a19572a908613a5e975bc2dd366dd5c0288c5541d99fad39c326942608eb0" exitCode=0 Apr 20 07:29:52.350164 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:52.349741 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv" Apr 20 07:29:52.350164 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:52.349745 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv" event={"ID":"fff9d4a5-03a9-4201-89c5-48ea308a80f2","Type":"ContainerDied","Data":"597a19572a908613a5e975bc2dd366dd5c0288c5541d99fad39c326942608eb0"} Apr 20 07:29:52.350164 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:52.349843 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv" event={"ID":"fff9d4a5-03a9-4201-89c5-48ea308a80f2","Type":"ContainerDied","Data":"59e48d82e58544c883ebd65803fd14a832f1abf36d35ca9bc32e5ce6f14d7fd9"} Apr 20 07:29:52.350164 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:52.349858 2566 scope.go:117] "RemoveContainer" containerID="597a19572a908613a5e975bc2dd366dd5c0288c5541d99fad39c326942608eb0" Apr 20 07:29:52.362320 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:52.362298 2566 scope.go:117] "RemoveContainer" containerID="597a19572a908613a5e975bc2dd366dd5c0288c5541d99fad39c326942608eb0" Apr 20 07:29:52.362728 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:29:52.362706 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"597a19572a908613a5e975bc2dd366dd5c0288c5541d99fad39c326942608eb0\": container with ID starting with 597a19572a908613a5e975bc2dd366dd5c0288c5541d99fad39c326942608eb0 not found: ID does not exist" containerID="597a19572a908613a5e975bc2dd366dd5c0288c5541d99fad39c326942608eb0" Apr 20 07:29:52.362782 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:52.362744 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"597a19572a908613a5e975bc2dd366dd5c0288c5541d99fad39c326942608eb0"} err="failed to get container status 
\"597a19572a908613a5e975bc2dd366dd5c0288c5541d99fad39c326942608eb0\": rpc error: code = NotFound desc = could not find container \"597a19572a908613a5e975bc2dd366dd5c0288c5541d99fad39c326942608eb0\": container with ID starting with 597a19572a908613a5e975bc2dd366dd5c0288c5541d99fad39c326942608eb0 not found: ID does not exist" Apr 20 07:29:52.373505 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:52.373477 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv"] Apr 20 07:29:52.378128 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:52.378106 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-6xlcv"] Apr 20 07:29:53.940694 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:29:53.940659 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fff9d4a5-03a9-4201-89c5-48ea308a80f2" path="/var/lib/kubelet/pods/fff9d4a5-03a9-4201-89c5-48ea308a80f2/volumes" Apr 20 07:30:00.145913 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:00.145883 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29611170-x8cjn"] Apr 20 07:30:00.146297 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:00.146283 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fff9d4a5-03a9-4201-89c5-48ea308a80f2" containerName="manager" Apr 20 07:30:00.146341 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:00.146298 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff9d4a5-03a9-4201-89c5-48ea308a80f2" containerName="manager" Apr 20 07:30:00.146379 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:00.146373 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="fff9d4a5-03a9-4201-89c5-48ea308a80f2" containerName="manager" Apr 20 07:30:00.150603 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:00.150587 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" Apr 20 07:30:00.153634 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:00.153609 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-rdqfb\"" Apr 20 07:30:00.156872 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:00.156850 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611170-x8cjn"] Apr 20 07:30:00.243047 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:00.243011 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv24m\" (UniqueName: \"kubernetes.io/projected/55e5fc91-09ea-432d-ae11-4d4d3720bd82-kube-api-access-sv24m\") pod \"maas-api-key-cleanup-29611170-x8cjn\" (UID: \"55e5fc91-09ea-432d-ae11-4d4d3720bd82\") " pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" Apr 20 07:30:00.343800 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:00.343762 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sv24m\" (UniqueName: \"kubernetes.io/projected/55e5fc91-09ea-432d-ae11-4d4d3720bd82-kube-api-access-sv24m\") pod \"maas-api-key-cleanup-29611170-x8cjn\" (UID: \"55e5fc91-09ea-432d-ae11-4d4d3720bd82\") " pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" Apr 20 07:30:00.352987 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:00.352951 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv24m\" (UniqueName: \"kubernetes.io/projected/55e5fc91-09ea-432d-ae11-4d4d3720bd82-kube-api-access-sv24m\") pod \"maas-api-key-cleanup-29611170-x8cjn\" (UID: \"55e5fc91-09ea-432d-ae11-4d4d3720bd82\") " pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" Apr 20 07:30:00.462147 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:00.462012 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" Apr 20 07:30:00.793396 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:00.793367 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611170-x8cjn"] Apr 20 07:30:00.795084 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:30:00.795028 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55e5fc91_09ea_432d_ae11_4d4d3720bd82.slice/crio-1a62db93af5aa74c41fc40eefcfc7477dc74181dfb84ea7881b93b78b5ea0499 WatchSource:0}: Error finding container 1a62db93af5aa74c41fc40eefcfc7477dc74181dfb84ea7881b93b78b5ea0499: Status 404 returned error can't find the container with id 1a62db93af5aa74c41fc40eefcfc7477dc74181dfb84ea7881b93b78b5ea0499 Apr 20 07:30:00.796871 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:00.796851 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 07:30:01.392803 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:01.392773 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" event={"ID":"55e5fc91-09ea-432d-ae11-4d4d3720bd82","Type":"ContainerStarted","Data":"1a62db93af5aa74c41fc40eefcfc7477dc74181dfb84ea7881b93b78b5ea0499"} Apr 20 07:30:02.398447 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:02.398405 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" event={"ID":"55e5fc91-09ea-432d-ae11-4d4d3720bd82","Type":"ContainerStarted","Data":"8801d7708a59d948ad0bacab5cc020145f93abdf9d9a6ec13c3abf4c2957259d"} Apr 20 07:30:02.418820 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:02.418762 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" podStartSLOduration=1.9130278139999999 podStartE2EDuration="2.418745802s" 
podCreationTimestamp="2026-04-20 07:30:00 +0000 UTC" firstStartedPulling="2026-04-20 07:30:00.797020421 +0000 UTC m=+1643.440329552" lastFinishedPulling="2026-04-20 07:30:01.302738399 +0000 UTC m=+1643.946047540" observedRunningTime="2026-04-20 07:30:02.413385631 +0000 UTC m=+1645.056694782" watchObservedRunningTime="2026-04-20 07:30:02.418745802 +0000 UTC m=+1645.062054953" Apr 20 07:30:22.485820 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:22.485713 2566 generic.go:358] "Generic (PLEG): container finished" podID="55e5fc91-09ea-432d-ae11-4d4d3720bd82" containerID="8801d7708a59d948ad0bacab5cc020145f93abdf9d9a6ec13c3abf4c2957259d" exitCode=6 Apr 20 07:30:22.485820 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:22.485742 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" event={"ID":"55e5fc91-09ea-432d-ae11-4d4d3720bd82","Type":"ContainerDied","Data":"8801d7708a59d948ad0bacab5cc020145f93abdf9d9a6ec13c3abf4c2957259d"} Apr 20 07:30:22.486277 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:22.486175 2566 scope.go:117] "RemoveContainer" containerID="8801d7708a59d948ad0bacab5cc020145f93abdf9d9a6ec13c3abf4c2957259d" Apr 20 07:30:23.491155 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:23.491115 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" event={"ID":"55e5fc91-09ea-432d-ae11-4d4d3720bd82","Type":"ContainerStarted","Data":"c21ec97ba535d5dfc8dbf8d8b63448501ea1717537110095dffc488b30a6813e"} Apr 20 07:30:43.573755 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:43.573722 2566 generic.go:358] "Generic (PLEG): container finished" podID="55e5fc91-09ea-432d-ae11-4d4d3720bd82" containerID="c21ec97ba535d5dfc8dbf8d8b63448501ea1717537110095dffc488b30a6813e" exitCode=6 Apr 20 07:30:43.574146 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:43.573799 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" 
event={"ID":"55e5fc91-09ea-432d-ae11-4d4d3720bd82","Type":"ContainerDied","Data":"c21ec97ba535d5dfc8dbf8d8b63448501ea1717537110095dffc488b30a6813e"} Apr 20 07:30:43.574146 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:43.573845 2566 scope.go:117] "RemoveContainer" containerID="8801d7708a59d948ad0bacab5cc020145f93abdf9d9a6ec13c3abf4c2957259d" Apr 20 07:30:43.574245 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:43.574230 2566 scope.go:117] "RemoveContainer" containerID="c21ec97ba535d5dfc8dbf8d8b63448501ea1717537110095dffc488b30a6813e" Apr 20 07:30:43.574484 ip-10-0-142-100 kubenswrapper[2566]: E0420 07:30:43.574453 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29611170-x8cjn_opendatahub(55e5fc91-09ea-432d-ae11-4d4d3720bd82)\"" pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" podUID="55e5fc91-09ea-432d-ae11-4d4d3720bd82" Apr 20 07:30:57.158949 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:57.158850 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xs49n"] Apr 20 07:30:57.162569 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:57.162545 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xs49n" Apr 20 07:30:57.165427 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:57.165406 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-6mqtq\"" Apr 20 07:30:57.173606 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:57.173578 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xs49n"] Apr 20 07:30:57.219798 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:57.219770 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jvx5\" (UniqueName: \"kubernetes.io/projected/cad24b6d-df1f-40e5-9f09-d321d0ae46e6-kube-api-access-2jvx5\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xs49n\" (UID: \"cad24b6d-df1f-40e5-9f09-d321d0ae46e6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xs49n" Apr 20 07:30:57.219926 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:57.219815 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cad24b6d-df1f-40e5-9f09-d321d0ae46e6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xs49n\" (UID: \"cad24b6d-df1f-40e5-9f09-d321d0ae46e6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xs49n" Apr 20 07:30:57.321146 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:57.321113 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cad24b6d-df1f-40e5-9f09-d321d0ae46e6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xs49n\" (UID: \"cad24b6d-df1f-40e5-9f09-d321d0ae46e6\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xs49n" Apr 20 07:30:57.321292 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:57.321201 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jvx5\" (UniqueName: \"kubernetes.io/projected/cad24b6d-df1f-40e5-9f09-d321d0ae46e6-kube-api-access-2jvx5\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xs49n\" (UID: \"cad24b6d-df1f-40e5-9f09-d321d0ae46e6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xs49n" Apr 20 07:30:57.321504 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:57.321484 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cad24b6d-df1f-40e5-9f09-d321d0ae46e6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xs49n\" (UID: \"cad24b6d-df1f-40e5-9f09-d321d0ae46e6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xs49n" Apr 20 07:30:57.329403 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:57.329375 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jvx5\" (UniqueName: \"kubernetes.io/projected/cad24b6d-df1f-40e5-9f09-d321d0ae46e6-kube-api-access-2jvx5\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xs49n\" (UID: \"cad24b6d-df1f-40e5-9f09-d321d0ae46e6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xs49n" Apr 20 07:30:57.473795 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:57.473708 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xs49n" Apr 20 07:30:57.608270 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:57.608239 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xs49n"] Apr 20 07:30:57.609738 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:30:57.609716 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcad24b6d_df1f_40e5_9f09_d321d0ae46e6.slice/crio-cce8deafd3253c5903767d40822b36583f322d0e894a8fbe73ec6ebbf09a555b WatchSource:0}: Error finding container cce8deafd3253c5903767d40822b36583f322d0e894a8fbe73ec6ebbf09a555b: Status 404 returned error can't find the container with id cce8deafd3253c5903767d40822b36583f322d0e894a8fbe73ec6ebbf09a555b Apr 20 07:30:57.632103 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:57.632046 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xs49n" event={"ID":"cad24b6d-df1f-40e5-9f09-d321d0ae46e6","Type":"ContainerStarted","Data":"cce8deafd3253c5903767d40822b36583f322d0e894a8fbe73ec6ebbf09a555b"} Apr 20 07:30:57.939364 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:57.939333 2566 scope.go:117] "RemoveContainer" containerID="c21ec97ba535d5dfc8dbf8d8b63448501ea1717537110095dffc488b30a6813e" Apr 20 07:30:58.638597 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:58.638560 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xs49n" event={"ID":"cad24b6d-df1f-40e5-9f09-d321d0ae46e6","Type":"ContainerStarted","Data":"86d7bc6114e7a54e9b8fb94cd52e296ebd84f84a4a68a2fcc532fb3691d6ae90"} Apr 20 07:30:58.639029 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:58.638615 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xs49n" Apr 20 07:30:58.640457 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:58.640434 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" event={"ID":"55e5fc91-09ea-432d-ae11-4d4d3720bd82","Type":"ContainerStarted","Data":"4ecdb617409309c4241cf1d4309a555c456a14a9259b1a6f6c02d4b9002820e3"} Apr 20 07:30:58.659435 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:58.659393 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xs49n" podStartSLOduration=1.659380122 podStartE2EDuration="1.659380122s" podCreationTimestamp="2026-04-20 07:30:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:30:58.657480846 +0000 UTC m=+1701.300790008" watchObservedRunningTime="2026-04-20 07:30:58.659380122 +0000 UTC m=+1701.302689273" Apr 20 07:30:58.999632 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:58.999555 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611170-x8cjn"] Apr 20 07:30:59.644304 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:30:59.644258 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" podUID="55e5fc91-09ea-432d-ae11-4d4d3720bd82" containerName="cleanup" containerID="cri-o://4ecdb617409309c4241cf1d4309a555c456a14a9259b1a6f6c02d4b9002820e3" gracePeriod=30 Apr 20 07:31:09.645793 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:31:09.645762 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xs49n" Apr 20 07:31:18.728857 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:31:18.728821 2566 generic.go:358] "Generic (PLEG): container finished" 
podID="55e5fc91-09ea-432d-ae11-4d4d3720bd82" containerID="4ecdb617409309c4241cf1d4309a555c456a14a9259b1a6f6c02d4b9002820e3" exitCode=6 Apr 20 07:31:18.729363 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:31:18.728907 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" event={"ID":"55e5fc91-09ea-432d-ae11-4d4d3720bd82","Type":"ContainerDied","Data":"4ecdb617409309c4241cf1d4309a555c456a14a9259b1a6f6c02d4b9002820e3"} Apr 20 07:31:18.729363 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:31:18.728962 2566 scope.go:117] "RemoveContainer" containerID="c21ec97ba535d5dfc8dbf8d8b63448501ea1717537110095dffc488b30a6813e" Apr 20 07:31:18.789713 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:31:18.789688 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" Apr 20 07:31:18.918073 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:31:18.917967 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv24m\" (UniqueName: \"kubernetes.io/projected/55e5fc91-09ea-432d-ae11-4d4d3720bd82-kube-api-access-sv24m\") pod \"55e5fc91-09ea-432d-ae11-4d4d3720bd82\" (UID: \"55e5fc91-09ea-432d-ae11-4d4d3720bd82\") " Apr 20 07:31:18.920362 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:31:18.920324 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55e5fc91-09ea-432d-ae11-4d4d3720bd82-kube-api-access-sv24m" (OuterVolumeSpecName: "kube-api-access-sv24m") pod "55e5fc91-09ea-432d-ae11-4d4d3720bd82" (UID: "55e5fc91-09ea-432d-ae11-4d4d3720bd82"). InnerVolumeSpecName "kube-api-access-sv24m". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:31:19.019482 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:31:19.019442 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sv24m\" (UniqueName: \"kubernetes.io/projected/55e5fc91-09ea-432d-ae11-4d4d3720bd82-kube-api-access-sv24m\") on node \"ip-10-0-142-100.ec2.internal\" DevicePath \"\"" Apr 20 07:31:19.738389 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:31:19.738353 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" event={"ID":"55e5fc91-09ea-432d-ae11-4d4d3720bd82","Type":"ContainerDied","Data":"1a62db93af5aa74c41fc40eefcfc7477dc74181dfb84ea7881b93b78b5ea0499"} Apr 20 07:31:19.738389 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:31:19.738399 2566 scope.go:117] "RemoveContainer" containerID="4ecdb617409309c4241cf1d4309a555c456a14a9259b1a6f6c02d4b9002820e3" Apr 20 07:31:19.738905 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:31:19.738408 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611170-x8cjn" Apr 20 07:31:19.763947 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:31:19.763909 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611170-x8cjn"] Apr 20 07:31:19.769071 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:31:19.769014 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611170-x8cjn"] Apr 20 07:31:19.940852 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:31:19.940807 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55e5fc91-09ea-432d-ae11-4d4d3720bd82" path="/var/lib/kubelet/pods/55e5fc91-09ea-432d-ae11-4d4d3720bd82/volumes" Apr 20 07:32:38.055931 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:32:38.055895 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log" Apr 20 07:32:38.059704 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:32:38.059681 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log" Apr 20 07:32:38.062319 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:32:38.062297 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log" Apr 20 07:32:38.065909 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:32:38.065890 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log" Apr 20 07:37:38.089515 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:37:38.089482 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log" Apr 20 07:37:38.093862 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:37:38.093840 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log" Apr 20 07:37:38.098705 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:37:38.098685 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log" Apr 20 07:37:38.102208 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:37:38.102189 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log" Apr 20 07:40:40.871920 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:40.871884 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-ntjtp_7be0292d-114f-4565-a66f-a94b5341413c/manager/0.log" Apr 20 07:40:41.108502 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:41.108470 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7b46859976-xmcdb_a03cae53-2156-4dc1-92e5-7fe1452491d9/manager/0.log" Apr 20 07:40:41.224901 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:41.224801 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-dj5xb_45a6d0d6-b302-4436-9926-60275a4b48db/manager/2.log" Apr 20 07:40:41.561503 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:41.561472 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6d65d76454-wtqxb_da92fd91-eb57-4de3-811b-ae2d4ce3c365/manager/0.log" Apr 20 07:40:41.664701 ip-10-0-142-100 kubenswrapper[2566]: I0420 
07:40:41.664677 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-lhn74_34e9cd18-02fc-4d57-aca9-ce9a644d737c/postgres/0.log" Apr 20 07:40:42.383139 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:42.383104 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96_f84adcb3-4653-4722-8577-04d06c6c5a59/util/0.log" Apr 20 07:40:42.389509 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:42.389481 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96_f84adcb3-4653-4722-8577-04d06c6c5a59/pull/0.log" Apr 20 07:40:42.395313 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:42.395293 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96_f84adcb3-4653-4722-8577-04d06c6c5a59/extract/0.log" Apr 20 07:40:42.499772 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:42.499748 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp_ec28d2fc-0988-4fc1-8296-e35886f33ba9/pull/0.log" Apr 20 07:40:42.506325 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:42.506303 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp_ec28d2fc-0988-4fc1-8296-e35886f33ba9/extract/0.log" Apr 20 07:40:42.511953 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:42.511932 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp_ec28d2fc-0988-4fc1-8296-e35886f33ba9/util/0.log" Apr 20 07:40:42.615270 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:42.615233 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb_30a3245f-2a4e-4c26-8b51-a76e2c24331f/util/0.log" Apr 20 07:40:42.621630 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:42.621598 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb_30a3245f-2a4e-4c26-8b51-a76e2c24331f/pull/0.log" Apr 20 07:40:42.627769 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:42.627741 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb_30a3245f-2a4e-4c26-8b51-a76e2c24331f/extract/0.log" Apr 20 07:40:42.731024 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:42.730938 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt_72c4f55e-7867-4831-b766-d0feea542b63/util/0.log" Apr 20 07:40:42.737095 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:42.737051 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt_72c4f55e-7867-4831-b766-d0feea542b63/pull/0.log" Apr 20 07:40:42.743291 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:42.743269 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt_72c4f55e-7867-4831-b766-d0feea542b63/extract/0.log" Apr 20 07:40:43.077631 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:43.077598 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-9bqnr_90691418-d1c6-4312-8fb7-c22dd32b73a8/manager/0.log" Apr 20 07:40:43.393052 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:43.392971 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-xs49n_cad24b6d-df1f-40e5-9f09-d321d0ae46e6/manager/0.log" Apr 20 07:40:43.617982 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:43.617944 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-9zjjc_0482519c-872b-48e1-bb33-9e50309c461e/manager/0.log" Apr 20 07:40:44.043155 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:44.043118 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-mfzf5_83244e9c-7094-4f0b-93a3-67dd81ac0ad6/discovery/0.log" Apr 20 07:40:44.145042 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:44.145012 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6bbb6d54d8-28vx7_c7cb3dae-237e-4f51-9470-b5f9f01164df/kube-auth-proxy/0.log" Apr 20 07:40:44.458226 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:44.458144 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-69d77dd9d6-68vvp_1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff/router/0.log" Apr 20 07:40:44.887202 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:44.887173 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn_1ce74456-ea78-40a3-be3e-7d31950c4d5d/main/0.log" Apr 20 07:40:44.894548 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:44.894522 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-jrfzn_1ce74456-ea78-40a3-be3e-7d31950c4d5d/storage-initializer/0.log" Apr 20 07:40:45.004114 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:45.004054 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-fm7p5_18a24103-ab01-4efa-b963-2e01030bf167/storage-initializer/0.log" Apr 20 07:40:45.011370 ip-10-0-142-100 
kubenswrapper[2566]: I0420 07:40:45.011347 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-fm7p5_18a24103-ab01-4efa-b963-2e01030bf167/main/0.log" Apr 20 07:40:45.336853 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:45.336812 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd_8dfe2d7b-ee08-43a6-83fe-d9b4f674b185/main/0.log" Apr 20 07:40:45.343438 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:45.343414 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-l6lhd_8dfe2d7b-ee08-43a6-83fe-d9b4f674b185/storage-initializer/0.log" Apr 20 07:40:51.951074 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:51.951035 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gxm4n_972f83a1-e686-4970-8f9a-46f92da2158f/global-pull-secret-syncer/0.log" Apr 20 07:40:52.010673 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:52.010644 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-677wv_a3d30794-2dde-421b-92b3-bb5c7855130c/konnectivity-agent/0.log" Apr 20 07:40:52.150407 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:52.150373 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-100.ec2.internal_06a23782138e9e64504f2d8b6d92effd/haproxy/0.log" Apr 20 07:40:55.561339 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:55.561303 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96_f84adcb3-4653-4722-8577-04d06c6c5a59/extract/0.log" Apr 20 07:40:55.582158 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:55.582132 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96_f84adcb3-4653-4722-8577-04d06c6c5a59/util/0.log" Apr 20 07:40:55.602621 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:55.602583 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759w9j96_f84adcb3-4653-4722-8577-04d06c6c5a59/pull/0.log" Apr 20 07:40:55.630659 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:55.630626 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp_ec28d2fc-0988-4fc1-8296-e35886f33ba9/extract/0.log" Apr 20 07:40:55.653535 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:55.653452 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp_ec28d2fc-0988-4fc1-8296-e35886f33ba9/util/0.log" Apr 20 07:40:55.675052 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:55.675020 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e02klmp_ec28d2fc-0988-4fc1-8296-e35886f33ba9/pull/0.log" Apr 20 07:40:55.711328 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:55.711300 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb_30a3245f-2a4e-4c26-8b51-a76e2c24331f/extract/0.log" Apr 20 07:40:55.730467 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:55.730435 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb_30a3245f-2a4e-4c26-8b51-a76e2c24331f/util/0.log" Apr 20 07:40:55.751143 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:55.751107 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73tdzqb_30a3245f-2a4e-4c26-8b51-a76e2c24331f/pull/0.log" Apr 20 07:40:55.776493 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:55.776457 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt_72c4f55e-7867-4831-b766-d0feea542b63/extract/0.log" Apr 20 07:40:55.795991 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:55.795959 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt_72c4f55e-7867-4831-b766-d0feea542b63/util/0.log" Apr 20 07:40:55.817846 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:55.817817 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12kkdt_72c4f55e-7867-4831-b766-d0feea542b63/pull/0.log" Apr 20 07:40:56.177007 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:56.176974 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-9bqnr_90691418-d1c6-4312-8fb7-c22dd32b73a8/manager/0.log" Apr 20 07:40:56.354143 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:56.354107 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-xs49n_cad24b6d-df1f-40e5-9f09-d321d0ae46e6/manager/0.log" Apr 20 07:40:56.474947 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:56.474864 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-9zjjc_0482519c-872b-48e1-bb33-9e50309c461e/manager/0.log" Apr 20 07:40:58.173144 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:58.173007 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-9pj7p_afe6f307-ed98-475b-8fa0-8a49de31999c/cluster-monitoring-operator/0.log" Apr 20 07:40:58.413264 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:58.413216 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ph9n4_83b47ba2-db8b-4e97-a3b0-1722dcd7d468/node-exporter/0.log" Apr 20 07:40:58.431690 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:58.431591 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ph9n4_83b47ba2-db8b-4e97-a3b0-1722dcd7d468/kube-rbac-proxy/0.log" Apr 20 07:40:58.451615 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:58.451579 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ph9n4_83b47ba2-db8b-4e97-a3b0-1722dcd7d468/init-textfile/0.log" Apr 20 07:40:58.845588 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:40:58.845545 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-x2822_ad660d5f-22ed-44cf-8958-0db2957fe5fb/prometheus-operator-admission-webhook/0.log" Apr 20 07:41:00.282501 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.282467 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-fbft2_58ffde8a-6f80-4470-a18a-15c64f4d0ce1/networking-console-plugin/0.log" Apr 20 07:41:00.754971 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.754941 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq"] Apr 20 07:41:00.755368 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.755356 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55e5fc91-09ea-432d-ae11-4d4d3720bd82" containerName="cleanup" Apr 20 07:41:00.755422 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.755370 
2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e5fc91-09ea-432d-ae11-4d4d3720bd82" containerName="cleanup" Apr 20 07:41:00.755422 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.755389 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55e5fc91-09ea-432d-ae11-4d4d3720bd82" containerName="cleanup" Apr 20 07:41:00.755422 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.755394 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e5fc91-09ea-432d-ae11-4d4d3720bd82" containerName="cleanup" Apr 20 07:41:00.755536 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.755458 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="55e5fc91-09ea-432d-ae11-4d4d3720bd82" containerName="cleanup" Apr 20 07:41:00.755536 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.755467 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="55e5fc91-09ea-432d-ae11-4d4d3720bd82" containerName="cleanup" Apr 20 07:41:00.755536 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.755473 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="55e5fc91-09ea-432d-ae11-4d4d3720bd82" containerName="cleanup" Apr 20 07:41:00.758548 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.758531 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:00.762226 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.762203 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xfwg2\"/\"kube-root-ca.crt\"" Apr 20 07:41:00.763396 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.763376 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xfwg2\"/\"default-dockercfg-df242\"" Apr 20 07:41:00.763562 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.763543 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xfwg2\"/\"openshift-service-ca.crt\"" Apr 20 07:41:00.771794 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.771773 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq"] Apr 20 07:41:00.862869 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.862836 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/1.log" Apr 20 07:41:00.871853 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.871830 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2tj5v_20140ecd-e12f-48c7-b496-5b64ba19279d/console-operator/2.log" Apr 20 07:41:00.918513 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.918480 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsftq\" (UniqueName: \"kubernetes.io/projected/779dc687-5541-4e92-b8c9-113f8c5eca5f-kube-api-access-jsftq\") pod \"perf-node-gather-daemonset-b6rcq\" (UID: \"779dc687-5541-4e92-b8c9-113f8c5eca5f\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:00.918678 ip-10-0-142-100 
kubenswrapper[2566]: I0420 07:41:00.918524 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/779dc687-5541-4e92-b8c9-113f8c5eca5f-podres\") pod \"perf-node-gather-daemonset-b6rcq\" (UID: \"779dc687-5541-4e92-b8c9-113f8c5eca5f\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:00.918678 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.918549 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/779dc687-5541-4e92-b8c9-113f8c5eca5f-sys\") pod \"perf-node-gather-daemonset-b6rcq\" (UID: \"779dc687-5541-4e92-b8c9-113f8c5eca5f\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:00.918678 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.918612 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/779dc687-5541-4e92-b8c9-113f8c5eca5f-proc\") pod \"perf-node-gather-daemonset-b6rcq\" (UID: \"779dc687-5541-4e92-b8c9-113f8c5eca5f\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:00.918678 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:00.918639 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/779dc687-5541-4e92-b8c9-113f8c5eca5f-lib-modules\") pod \"perf-node-gather-daemonset-b6rcq\" (UID: \"779dc687-5541-4e92-b8c9-113f8c5eca5f\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:01.019720 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:01.019639 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsftq\" (UniqueName: \"kubernetes.io/projected/779dc687-5541-4e92-b8c9-113f8c5eca5f-kube-api-access-jsftq\") 
pod \"perf-node-gather-daemonset-b6rcq\" (UID: \"779dc687-5541-4e92-b8c9-113f8c5eca5f\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:01.019720 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:01.019683 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/779dc687-5541-4e92-b8c9-113f8c5eca5f-podres\") pod \"perf-node-gather-daemonset-b6rcq\" (UID: \"779dc687-5541-4e92-b8c9-113f8c5eca5f\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:01.019910 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:01.019792 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/779dc687-5541-4e92-b8c9-113f8c5eca5f-podres\") pod \"perf-node-gather-daemonset-b6rcq\" (UID: \"779dc687-5541-4e92-b8c9-113f8c5eca5f\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:01.019910 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:01.019819 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/779dc687-5541-4e92-b8c9-113f8c5eca5f-sys\") pod \"perf-node-gather-daemonset-b6rcq\" (UID: \"779dc687-5541-4e92-b8c9-113f8c5eca5f\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:01.019910 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:01.019890 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/779dc687-5541-4e92-b8c9-113f8c5eca5f-proc\") pod \"perf-node-gather-daemonset-b6rcq\" (UID: \"779dc687-5541-4e92-b8c9-113f8c5eca5f\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:01.020007 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:01.019914 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/779dc687-5541-4e92-b8c9-113f8c5eca5f-sys\") pod \"perf-node-gather-daemonset-b6rcq\" (UID: \"779dc687-5541-4e92-b8c9-113f8c5eca5f\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:01.020007 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:01.019922 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/779dc687-5541-4e92-b8c9-113f8c5eca5f-lib-modules\") pod \"perf-node-gather-daemonset-b6rcq\" (UID: \"779dc687-5541-4e92-b8c9-113f8c5eca5f\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:01.020007 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:01.019984 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/779dc687-5541-4e92-b8c9-113f8c5eca5f-proc\") pod \"perf-node-gather-daemonset-b6rcq\" (UID: \"779dc687-5541-4e92-b8c9-113f8c5eca5f\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:01.020135 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:01.020030 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/779dc687-5541-4e92-b8c9-113f8c5eca5f-lib-modules\") pod \"perf-node-gather-daemonset-b6rcq\" (UID: \"779dc687-5541-4e92-b8c9-113f8c5eca5f\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:01.027809 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:01.027779 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsftq\" (UniqueName: \"kubernetes.io/projected/779dc687-5541-4e92-b8c9-113f8c5eca5f-kube-api-access-jsftq\") pod \"perf-node-gather-daemonset-b6rcq\" (UID: \"779dc687-5541-4e92-b8c9-113f8c5eca5f\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:01.068941 ip-10-0-142-100 kubenswrapper[2566]: I0420 
07:41:01.068904 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:01.201363 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:01.201333 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq"] Apr 20 07:41:01.202914 ip-10-0-142-100 kubenswrapper[2566]: W0420 07:41:01.202885 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod779dc687_5541_4e92_b8c9_113f8c5eca5f.slice/crio-edd844a106c6921919239d513de2ac201cd47f309d9797d9d88a525974f6fc44 WatchSource:0}: Error finding container edd844a106c6921919239d513de2ac201cd47f309d9797d9d88a525974f6fc44: Status 404 returned error can't find the container with id edd844a106c6921919239d513de2ac201cd47f309d9797d9d88a525974f6fc44 Apr 20 07:41:01.204583 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:01.204563 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 07:41:01.381563 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:01.381534 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5776bd786f-mftdz_18482489-ee72-4ec2-be2a-db333374296c/console/0.log" Apr 20 07:41:01.434683 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:01.434655 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-mwk27_7cc62a1d-19e0-48f5-b52d-9344c373a4eb/download-server/0.log" Apr 20 07:41:01.976556 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:01.976525 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-4pmwr_b328de84-b319-40a6-968c-43ec62190177/volume-data-source-validator/0.log" Apr 20 07:41:02.193604 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:02.193571 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" event={"ID":"779dc687-5541-4e92-b8c9-113f8c5eca5f","Type":"ContainerStarted","Data":"78ab1c787da322a8c1c68dd5062dc3923407824555cd410fa5c111126ee2a303"} Apr 20 07:41:02.193604 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:02.193607 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" event={"ID":"779dc687-5541-4e92-b8c9-113f8c5eca5f","Type":"ContainerStarted","Data":"edd844a106c6921919239d513de2ac201cd47f309d9797d9d88a525974f6fc44"} Apr 20 07:41:02.193926 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:02.193643 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" Apr 20 07:41:02.209891 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:02.209840 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq" podStartSLOduration=2.209824771 podStartE2EDuration="2.209824771s" podCreationTimestamp="2026-04-20 07:41:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:41:02.207426503 +0000 UTC m=+2304.850735652" watchObservedRunningTime="2026-04-20 07:41:02.209824771 +0000 UTC m=+2304.853133920" Apr 20 07:41:02.804385 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:02.804350 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tfcdm_0134f121-e1b5-45c9-9b45-fc3777f00742/dns/0.log" Apr 20 07:41:02.824779 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:02.824745 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tfcdm_0134f121-e1b5-45c9-9b45-fc3777f00742/kube-rbac-proxy/0.log" Apr 20 07:41:02.931765 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:02.931740 2566 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-dns_node-resolver-znmbh_13612f1f-99a2-46b0-abc6-8e800feca8e9/dns-node-resolver/0.log" Apr 20 07:41:03.441380 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:03.441344 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pqdp9_4151df80-e101-4d3f-a89d-f0d2c0217a27/node-ca/0.log" Apr 20 07:41:04.419484 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:04.419437 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-mfzf5_83244e9c-7094-4f0b-93a3-67dd81ac0ad6/discovery/0.log" Apr 20 07:41:04.438124 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:04.438095 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6bbb6d54d8-28vx7_c7cb3dae-237e-4f51-9470-b5f9f01164df/kube-auth-proxy/0.log" Apr 20 07:41:04.538801 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:04.538767 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-69d77dd9d6-68vvp_1dbfa2d1-c6ce-45c6-9b1d-cbd35a6ffaff/router/0.log" Apr 20 07:41:05.039038 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:05.039006 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7kxww_7d3acfea-9550-4513-83ec-6e37b2e131a5/serve-healthcheck-canary/0.log" Apr 20 07:41:05.550095 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:05.550044 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-jlbv6_31731af1-46d3-416a-98dc-3db15ceaab73/insights-operator/0.log" Apr 20 07:41:05.553326 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:05.553302 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-jlbv6_31731af1-46d3-416a-98dc-3db15ceaab73/insights-operator/1.log" Apr 20 07:41:05.694311 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:05.694281 2566 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fscsr_eae346ec-981a-4e25-9e27-86b42cb2f0f1/kube-rbac-proxy/0.log"
Apr 20 07:41:05.713459 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:05.713430 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fscsr_eae346ec-981a-4e25-9e27-86b42cb2f0f1/exporter/0.log"
Apr 20 07:41:05.733201 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:05.733171 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fscsr_eae346ec-981a-4e25-9e27-86b42cb2f0f1/extractor/0.log"
Apr 20 07:41:07.619518 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:07.619466 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-ntjtp_7be0292d-114f-4565-a66f-a94b5341413c/manager/0.log"
Apr 20 07:41:07.757655 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:07.757608 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7b46859976-xmcdb_a03cae53-2156-4dc1-92e5-7fe1452491d9/manager/0.log"
Apr 20 07:41:07.784421 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:07.784394 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-dj5xb_45a6d0d6-b302-4436-9926-60275a4b48db/manager/1.log"
Apr 20 07:41:07.810434 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:07.810399 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-dj5xb_45a6d0d6-b302-4436-9926-60275a4b48db/manager/2.log"
Apr 20 07:41:07.898630 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:07.898543 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6d65d76454-wtqxb_da92fd91-eb57-4de3-811b-ae2d4ce3c365/manager/0.log"
Apr 20 07:41:07.921744 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:07.921718 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-lhn74_34e9cd18-02fc-4d57-aca9-ce9a644d737c/postgres/0.log"
Apr 20 07:41:08.209939 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:08.209874 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-b6rcq"
Apr 20 07:41:13.864206 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:13.864169 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-ttgx9_de3d8741-a673-4d3c-9bd2-323788b79b5a/kube-storage-version-migrator-operator/1.log"
Apr 20 07:41:13.865705 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:13.865673 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-ttgx9_de3d8741-a673-4d3c-9bd2-323788b79b5a/kube-storage-version-migrator-operator/0.log"
Apr 20 07:41:15.167965 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:15.167930 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vtmbx_ade86a8b-7468-4238-b3ff-728e885f0d78/kube-multus-additional-cni-plugins/0.log"
Apr 20 07:41:15.188381 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:15.188349 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vtmbx_ade86a8b-7468-4238-b3ff-728e885f0d78/egress-router-binary-copy/0.log"
Apr 20 07:41:15.209077 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:15.209040 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vtmbx_ade86a8b-7468-4238-b3ff-728e885f0d78/cni-plugins/0.log"
Apr 20 07:41:15.228142 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:15.228113 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vtmbx_ade86a8b-7468-4238-b3ff-728e885f0d78/bond-cni-plugin/0.log"
Apr 20 07:41:15.246883 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:15.246859 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vtmbx_ade86a8b-7468-4238-b3ff-728e885f0d78/routeoverride-cni/0.log"
Apr 20 07:41:15.267517 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:15.267491 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vtmbx_ade86a8b-7468-4238-b3ff-728e885f0d78/whereabouts-cni-bincopy/0.log"
Apr 20 07:41:15.285815 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:15.285784 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vtmbx_ade86a8b-7468-4238-b3ff-728e885f0d78/whereabouts-cni/0.log"
Apr 20 07:41:15.323692 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:15.323659 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c66hd_b253a09a-eb71-4d38-961f-3acd58a8ed07/kube-multus/0.log"
Apr 20 07:41:15.396473 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:15.396431 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h8v9g_fc07195a-cdd1-494d-b741-97e9b77b3f6d/network-metrics-daemon/0.log"
Apr 20 07:41:15.412078 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:15.412042 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h8v9g_fc07195a-cdd1-494d-b741-97e9b77b3f6d/kube-rbac-proxy/0.log"
Apr 20 07:41:16.852616 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:16.852586 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-controller/0.log"
Apr 20 07:41:16.868767 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:16.868739 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/0.log"
Apr 20 07:41:16.890431 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:16.890405 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovn-acl-logging/1.log"
Apr 20 07:41:16.909869 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:16.909833 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/kube-rbac-proxy-node/0.log"
Apr 20 07:41:16.929772 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:16.929742 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 07:41:16.945392 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:16.945361 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/northd/0.log"
Apr 20 07:41:16.965381 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:16.965350 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/nbdb/0.log"
Apr 20 07:41:16.984453 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:16.984416 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/sbdb/0.log"
Apr 20 07:41:17.164390 ip-10-0-142-100 kubenswrapper[2566]: I0420 07:41:17.164306 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vz4hl_3c81fb54-953c-4feb-8550-bcf26cec6a9e/ovnkube-controller/0.log"