Apr 20 16:20:37.308649 ip-10-0-135-200 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 16:20:37.308660 ip-10-0-135-200 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 16:20:37.308667 ip-10-0-135-200 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 16:20:37.308880 ip-10-0-135-200 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 16:20:47.545958 ip-10-0-135-200 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 16:20:47.545978 ip-10-0-135-200 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 81c472844c904a6fbb2ed135a4b9c524 --
Apr 20 16:23:15.296941 ip-10-0-135-200 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 16:23:15.756815 ip-10-0-135-200 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 16:23:15.756815 ip-10-0-135-200 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 16:23:15.756815 ip-10-0-135-200 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 16:23:15.756815 ip-10-0-135-200 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 16:23:15.756815 ip-10-0-135-200 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 16:23:15.760016 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.759930 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 16:23:15.763797 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763783 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 16:23:15.763797 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763797 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763801 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763804 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763808 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763812 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763814 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763817 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763820 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763823 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763825 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763828 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763835 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763838 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763840 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763843 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763845 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763848 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763850 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763853 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763855 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 16:23:15.763861 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763858 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763860 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763863 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763866 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763868 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763871 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763874 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763877 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763879 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763882 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763884 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763886 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763889 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763891 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763894 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763896 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763899 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763902 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763904 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 16:23:15.764324 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763907 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763910 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763912 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763914 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763917 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763920 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763923 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763925 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763928 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763930 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763933 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763935 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763938 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763940 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763942 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763946 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763949 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763951 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763954 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763956 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 16:23:15.764836 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763959 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763962 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763964 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763967 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763970 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763972 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763981 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763984 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763986 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763989 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763991 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763994 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.763997 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764000 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764003 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764005 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764008 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764011 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764013 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764016 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 16:23:15.765319 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764019 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764023 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764028 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764032 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764035 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764039 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764404 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764410 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764413 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764416 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764420 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764423 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764426 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764428 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764431 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764434 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764436 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764439 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 16:23:15.766000 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764441 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764444 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764447 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764449 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764452 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764454 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764457 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764460 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764463 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764466 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764468 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764471 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764474 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764476 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764479 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764481 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764483 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764486 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764488 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764491 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 16:23:15.766716 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764493 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764496 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764498 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764501 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764505 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764509 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764512 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764514 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764517 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764520 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764522 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764525 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764527 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764529 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764532 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764534 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764537 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764539 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764542 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 16:23:15.767407 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764544 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764547 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764549 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764552 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764554 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764557 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764559 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764561 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764564 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764566 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764569 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764571 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764574 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764577 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764579 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764582 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764584 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764588 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764590 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764593 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 16:23:15.767933 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764595 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764597 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764600 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764603 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764605 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764607 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764610 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764612 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764615 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764617 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764620 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764622 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764625 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764627 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.764630 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765381 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765390 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765396 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765400 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765405 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765408 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 16:23:15.768416 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765412 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765417 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765420 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765423 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765427 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765430 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765433 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765436 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765439 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765442 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765445 2577 flags.go:64] FLAG: --cloud-config=""
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765448 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765451 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765455 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765458 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765461 2577 flags.go:64] FLAG: --config-dir=""
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765464 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765468 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765472 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765475 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765478 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765481 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765484 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765487 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 16:23:15.768941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765490 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765493 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765527 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765570 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765574 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765577 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765580 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765584 2577 flags.go:64] FLAG: --enable-server="true"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765588 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765592 2577 flags.go:64] FLAG: --event-burst="100"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765596 2577 flags.go:64] FLAG: --event-qps="50"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765599 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765606 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765610 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765614 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765617 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765620 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765623 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765628 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765632 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765638 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765642 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.765645 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767723 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767847 2577 flags.go:64] FLAG: --feature-gates=""
Apr 20 16:23:15.769600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767855 2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767861 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767866 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767872 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420
16:23:15.767877 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767882 2577 flags.go:64] FLAG: --help="false" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767887 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-135-200.ec2.internal" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767892 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767897 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767901 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767908 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767914 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767918 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767922 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767927 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767931 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767936 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767940 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 
16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767945 2577 flags.go:64] FLAG: --kube-reserved="" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767950 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767954 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767959 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767963 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767967 2577 flags.go:64] FLAG: --lock-file="" Apr 20 16:23:15.770207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767972 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767976 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767981 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.767999 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768004 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768008 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768013 2577 flags.go:64] FLAG: --logging-format="text" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768019 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768024 2577 
flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768029 2577 flags.go:64] FLAG: --manifest-url="" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768032 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768037 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768040 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768044 2577 flags.go:64] FLAG: --max-pods="110" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768047 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768050 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768053 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768056 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768059 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768063 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768066 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768074 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768077 2577 flags.go:64] FLAG: 
--node-status-update-frequency="10s" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768080 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 16:23:15.770785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768083 2577 flags.go:64] FLAG: --pod-cidr="" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768086 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768092 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768094 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768098 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768101 2577 flags.go:64] FLAG: --port="10250" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768104 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768107 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0cce4b2400325d273" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768110 2577 flags.go:64] FLAG: --qos-reserved="" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768113 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768116 2577 flags.go:64] FLAG: --register-node="true" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768118 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768121 2577 flags.go:64] FLAG: --register-with-taints="" Apr 
20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768125 2577 flags.go:64] FLAG: --registry-burst="10" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768128 2577 flags.go:64] FLAG: --registry-qps="5" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768131 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768135 2577 flags.go:64] FLAG: --reserved-memory="" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768139 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768142 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768145 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768148 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768151 2577 flags.go:64] FLAG: --runonce="false" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768154 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768157 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768160 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 20 16:23:15.771347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768163 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768166 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768169 2577 flags.go:64] FLAG: 
--storage-driver-db="cadvisor" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768172 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768175 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768178 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768181 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768183 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768186 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768189 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768192 2577 flags.go:64] FLAG: --system-cgroups="" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768195 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768200 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768203 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768206 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768211 2577 flags.go:64] FLAG: --tls-min-version="" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768213 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 16:23:15.771961 ip-10-0-135-200 
kubenswrapper[2577]: I0420 16:23:15.768216 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768219 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768222 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768225 2577 flags.go:64] FLAG: --v="2" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768229 2577 flags.go:64] FLAG: --version="false" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768234 2577 flags.go:64] FLAG: --vmodule="" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768239 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.768242 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 16:23:15.771961 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768333 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768337 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768342 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768346 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768349 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768352 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768354 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768357 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768359 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768362 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768365 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768367 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768370 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768372 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768375 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768377 2577 feature_gate.go:328] 
unrecognized feature gate: IrreconcilableMachineConfig Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768380 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768382 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768384 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 16:23:15.772535 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768387 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768389 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768392 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768394 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768397 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768400 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768402 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768405 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768408 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 16:23:15.773001 ip-10-0-135-200 
kubenswrapper[2577]: W0420 16:23:15.768410 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768413 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768416 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768419 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768421 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768424 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768427 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768429 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768432 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768435 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768438 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 16:23:15.773001 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768440 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768443 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 
16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768445 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768448 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768450 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768452 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768455 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768457 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768460 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768462 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768465 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768467 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768469 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768472 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768474 2577 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768477 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768479 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768482 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768484 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768487 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 16:23:15.773536 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768490 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768492 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768495 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768498 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768501 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768503 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768506 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 
16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768508 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768511 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768513 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768516 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768518 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768520 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768523 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768525 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768527 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768530 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768532 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768535 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768537 2577 feature_gate.go:328] unrecognized feature 
gate: IngressControllerLBSubnetsAWS
Apr 20 16:23:15.774045 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768540 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 16:23:15.774532 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768543 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 16:23:15.774532 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768546 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 16:23:15.774532 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768549 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 16:23:15.774532 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768552 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 16:23:15.774532 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768554 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 16:23:15.774532 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.768557 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 16:23:15.774532 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.769515 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 16:23:15.776551 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.776533 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 16:23:15.776589 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.776553 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 16:23:15.776616 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776601 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 16:23:15.776616 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776606 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 16:23:15.776616 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776609 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 16:23:15.776616 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776612 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 16:23:15.776616 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776615 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776619 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776621 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776624 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776627 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776629 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776632 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776634 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776637 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776639 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776641 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776644 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776647 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776649 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776652 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776654 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776657 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776659 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776661 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 16:23:15.776773 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776664 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776666 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776669 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776672 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776688 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776690 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776693 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776695 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776698 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776701 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776703 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776706 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776708 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776711 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776713 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776716 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776719 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776722 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776725 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776727 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 16:23:15.777241 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776730 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776733 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776735 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776738 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776741 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776744 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776746 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776750 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776755 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776758 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776760 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776763 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776766 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776769 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776772 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776775 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776779 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776782 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776785 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 16:23:15.777734 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776788 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776790 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776793 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776796 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776799 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776802 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776804 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776807 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776810 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776812 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776815 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776817 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776819 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776822 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776824 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776827 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776829 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776832 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776834 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776836 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 16:23:15.778185 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776839 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 16:23:15.778659 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776841 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 16:23:15.778659 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776844 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 16:23:15.778659 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776846 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 16:23:15.778659 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.776851 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 16:23:15.778659 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776949 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 16:23:15.778659 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776955 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 16:23:15.778659 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776957 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 16:23:15.778659 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776961 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 16:23:15.778659 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776963 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 16:23:15.778659 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776966 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 16:23:15.778659 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776969 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 16:23:15.778659 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776971 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 16:23:15.778659 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776974 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 16:23:15.778659 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776976 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 16:23:15.778659 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776979 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776982 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776984 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776987 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776989 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776993 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.776996 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777000 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777003 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777006 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777008 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777012 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777014 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777016 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777019 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777021 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777024 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777026 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777029 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777031 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 16:23:15.779087 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777033 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777036 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777038 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777041 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777043 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777045 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777048 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777050 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777053 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777055 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777057 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777060 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777062 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777065 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777068 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777071 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777073 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777076 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777078 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777081 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 16:23:15.779556 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777083 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777085 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777088 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777090 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777093 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777095 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777098 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777100 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777103 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777105 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777107 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777110 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777112 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777115 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777117 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777120 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777122 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777124 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777127 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 16:23:15.780047 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777129 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 16:23:15.780493 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777132 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 16:23:15.780493 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777134 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 16:23:15.780493 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777136 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 16:23:15.780493 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777139 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 16:23:15.780493 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777141 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 16:23:15.780493 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777143 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 16:23:15.780493 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777146 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 16:23:15.780493 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777149 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 16:23:15.780493 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777151 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 16:23:15.780493 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777154 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 16:23:15.780493 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777156 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 16:23:15.780493 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777159 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 16:23:15.780493 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777161 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 16:23:15.780493 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777164 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 16:23:15.780493 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777166 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 16:23:15.780493 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:15.777169 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 16:23:15.781012 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.777173 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 16:23:15.781012 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.778068 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 16:23:15.781012 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.780931 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 16:23:15.781927 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.781915 2577 server.go:1019] "Starting client certificate rotation"
Apr 20 16:23:15.782027 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.782011 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 16:23:15.782056 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.782047 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 16:23:15.808926 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.808907 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 16:23:15.811161 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.811143 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 16:23:15.823387 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.823369 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 20 16:23:15.828940 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.828926 2577 log.go:25] "Validated CRI v1 image API"
Apr 20 16:23:15.830130 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.830117 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 16:23:15.834408 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.834390 2577 fs.go:135] Filesystem UUIDs: map[53df7f19-dfc3-4174-939b-10cadaf2a49d:/dev/nvme0n1p4 61a50dfa-7237-4a1f-9e08-de197b8de26f:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 20 16:23:15.834457 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.834408 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 16:23:15.839335 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.839233 2577 manager.go:217] Machine: {Timestamp:2026-04-20 16:23:15.838050573 +0000 UTC m=+0.417620580 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3118832 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2079a222cf2524c65ce4e0bc9d51ca SystemUUID:ec2079a2-22cf-2524-c65c-e4e0bc9d51ca BootID:81c47284-4c90-4a6f-bb2e-d135a4b9c524 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:32:70:dc:f1:77 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:32:70:dc:f1:77 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b2:71:fa:a4:68:63 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 16:23:15.839335 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.839328 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 16:23:15.839437 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.839397 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 16:23:15.839855 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.839840 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 16:23:15.840721 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.840701 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 16:23:15.840860 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.840724 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-200.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Perce
ntage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 16:23:15.840901 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.840870 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 16:23:15.840901 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.840879 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 16:23:15.840901 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.840893 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 16:23:15.840978 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.840908 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 16:23:15.842284 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.842274 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 20 16:23:15.842389 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.842380 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 16:23:15.844643 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.844632 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 20 16:23:15.844701 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.844647 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 16:23:15.845257 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.845248 2577 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 20 16:23:15.845290 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.845261 2577 kubelet.go:397] "Adding apiserver pod source" Apr 20 16:23:15.845290 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.845270 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 16:23:15.847116 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.847103 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 16:23:15.847151 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.847130 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 16:23:15.850090 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.850067 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 16:23:15.851338 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.851323 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 16:23:15.854121 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.854099 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 16:23:15.854121 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.854123 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 16:23:15.854298 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.854130 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 16:23:15.854298 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.854137 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 16:23:15.854298 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.854143 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 
16:23:15.854298 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.854153 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 16:23:15.854298 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.854159 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 16:23:15.854298 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.854166 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 16:23:15.854298 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.854174 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 16:23:15.854298 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.854181 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 16:23:15.854298 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.854190 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 16:23:15.854298 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.854199 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 16:23:15.855332 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.855305 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 16:23:15.855399 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.855345 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 16:23:15.856748 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:15.856723 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 16:23:15.856856 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:15.856771 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes 
\"ip-10-0-135-200.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 16:23:15.857842 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.857821 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5lzkz" Apr 20 16:23:15.859060 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.859045 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 16:23:15.859131 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.859087 2577 server.go:1295] "Started kubelet" Apr 20 16:23:15.859177 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.859154 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 16:23:15.859819 ip-10-0-135-200 systemd[1]: Started Kubernetes Kubelet. Apr 20 16:23:15.860044 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.860002 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 16:23:15.860128 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.860061 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 16:23:15.861815 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.861795 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 16:23:15.862583 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.862568 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 20 16:23:15.863057 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.863038 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5lzkz" Apr 20 16:23:15.867308 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.867292 2577 certificate_manager.go:422] "Certificate rotation is 
enabled" logger="kubernetes.io/kubelet-serving" Apr 20 16:23:15.868127 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:15.868025 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 16:23:15.868339 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.868317 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 16:23:15.869075 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.869059 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 16:23:15.869144 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.869077 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 16:23:15.869210 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.869180 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 16:23:15.869263 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.869227 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 20 16:23:15.869263 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.869234 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 20 16:23:15.869353 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:15.869263 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-200.ec2.internal\" not found" Apr 20 16:23:15.869429 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.869414 2577 factory.go:55] Registering systemd factory Apr 20 16:23:15.869478 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.869438 2577 factory.go:223] Registration of the systemd container factory successfully Apr 20 16:23:15.869662 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.869648 2577 factory.go:153] Registering CRI-O factory Apr 20 16:23:15.869740 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.869667 2577 factory.go:223] Registration of the crio container factory successfully Apr 
20 16:23:15.869792 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.869757 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 16:23:15.869792 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.869789 2577 factory.go:103] Registering Raw factory Apr 20 16:23:15.869877 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.869805 2577 manager.go:1196] Started watching for new ooms in manager Apr 20 16:23:15.870237 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.870222 2577 manager.go:319] Starting recovery of all containers Apr 20 16:23:15.871382 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.871364 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 16:23:15.873624 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:15.873581 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-200.ec2.internal\" not found" node="ip-10-0-135-200.ec2.internal" Apr 20 16:23:15.875280 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.875255 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-200.ec2.internal" not found Apr 20 16:23:15.880464 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.880328 2577 manager.go:324] Recovery completed Apr 20 16:23:15.884568 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.884555 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 16:23:15.886871 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.886858 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasSufficientMemory" Apr 20 16:23:15.886931 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.886886 2577 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 16:23:15.886931 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.886896 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasSufficientPID" Apr 20 16:23:15.887357 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.887345 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 16:23:15.887394 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.887356 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 16:23:15.887394 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.887372 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 20 16:23:15.889589 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.889578 2577 policy_none.go:49] "None policy: Start" Apr 20 16:23:15.889633 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.889594 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 16:23:15.889633 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.889603 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 20 16:23:15.893408 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.893392 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-200.ec2.internal" not found Apr 20 16:23:15.931279 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.931264 2577 manager.go:341] "Starting Device Plugin manager" Apr 20 16:23:15.937886 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:15.931291 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 16:23:15.937886 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.931301 2577 server.go:85] "Starting device plugin registration server" Apr 20 16:23:15.937886 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.931508 2577 eviction_manager.go:189] "Eviction 
manager: starting control loop" Apr 20 16:23:15.937886 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.931519 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 16:23:15.937886 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.931583 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 16:23:15.937886 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.931655 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 16:23:15.937886 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.931664 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 16:23:15.937886 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:15.932164 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 16:23:15.937886 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:15.932199 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-200.ec2.internal\" not found" Apr 20 16:23:15.954508 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:15.954284 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-200.ec2.internal" not found Apr 20 16:23:16.001520 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.001490 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 16:23:16.002774 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.002751 2577 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 20 16:23:16.002774 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.002773 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 16:23:16.002908 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.002791 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 20 16:23:16.002908 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.002796 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 16:23:16.002908 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:16.002824 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 16:23:16.007090 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.007050 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 16:23:16.031993 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.031977 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 16:23:16.035052 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.035039 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasSufficientMemory" Apr 20 16:23:16.035108 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.035066 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 16:23:16.035108 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.035076 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasSufficientPID" Apr 20 16:23:16.035108 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.035097 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.056737 ip-10-0-135-200 kubenswrapper[2577]: I0420 
16:23:16.056721 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.056790 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:16.056740 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-200.ec2.internal\": node \"ip-10-0-135-200.ec2.internal\" not found" Apr 20 16:23:16.082001 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:16.081979 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-200.ec2.internal\" not found" Apr 20 16:23:16.103333 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.103308 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-200.ec2.internal"] Apr 20 16:23:16.103413 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.103388 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 16:23:16.104434 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.104419 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasSufficientMemory" Apr 20 16:23:16.104519 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.104446 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 16:23:16.104519 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.104455 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasSufficientPID" Apr 20 16:23:16.105830 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.105819 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 16:23:16.105957 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.105944 2577 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.105989 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.105970 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 16:23:16.106741 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.106724 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasSufficientMemory" Apr 20 16:23:16.106741 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.106739 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasSufficientMemory" Apr 20 16:23:16.106857 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.106754 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 16:23:16.106857 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.106758 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 16:23:16.106857 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.106770 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasSufficientPID" Apr 20 16:23:16.106857 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.106771 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasSufficientPID" Apr 20 16:23:16.110270 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.110253 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.110324 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.110287 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 16:23:16.111061 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.111043 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasSufficientMemory" Apr 20 16:23:16.111146 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.111076 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 16:23:16.111146 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.111091 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeHasSufficientPID" Apr 20 16:23:16.136221 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:16.136200 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-200.ec2.internal\" not found" node="ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.140512 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:16.140497 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-200.ec2.internal\" not found" node="ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.182538 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:16.182521 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-200.ec2.internal\" not found" Apr 20 16:23:16.270859 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.270769 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9f85b7ca5dcd9ead553393b46e7fd00c-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal\" (UID: \"9f85b7ca5dcd9ead553393b46e7fd00c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.270859 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.270819 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f85b7ca5dcd9ead553393b46e7fd00c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal\" (UID: \"9f85b7ca5dcd9ead553393b46e7fd00c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.270859 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.270848 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dd0fda5d43e36cf7513644c2d57d50e3-config\") pod \"kube-apiserver-proxy-ip-10-0-135-200.ec2.internal\" (UID: \"dd0fda5d43e36cf7513644c2d57d50e3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.282886 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:16.282861 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-200.ec2.internal\" not found" Apr 20 16:23:16.371388 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.371351 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9f85b7ca5dcd9ead553393b46e7fd00c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal\" (UID: \"9f85b7ca5dcd9ead553393b46e7fd00c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.371388 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.371393 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/9f85b7ca5dcd9ead553393b46e7fd00c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal\" (UID: \"9f85b7ca5dcd9ead553393b46e7fd00c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.371562 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.371413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dd0fda5d43e36cf7513644c2d57d50e3-config\") pod \"kube-apiserver-proxy-ip-10-0-135-200.ec2.internal\" (UID: \"dd0fda5d43e36cf7513644c2d57d50e3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.371562 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.371443 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dd0fda5d43e36cf7513644c2d57d50e3-config\") pod \"kube-apiserver-proxy-ip-10-0-135-200.ec2.internal\" (UID: \"dd0fda5d43e36cf7513644c2d57d50e3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.371562 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.371455 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f85b7ca5dcd9ead553393b46e7fd00c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal\" (UID: \"9f85b7ca5dcd9ead553393b46e7fd00c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.371562 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.371457 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9f85b7ca5dcd9ead553393b46e7fd00c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal\" (UID: \"9f85b7ca5dcd9ead553393b46e7fd00c\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.383457 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:16.383437 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-200.ec2.internal\" not found" Apr 20 16:23:16.437571 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.437546 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.443714 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.443695 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.484472 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:16.484444 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-200.ec2.internal\" not found" Apr 20 16:23:16.584992 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:16.584913 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-200.ec2.internal\" not found" Apr 20 16:23:16.685415 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:16.685378 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-200.ec2.internal\" not found" Apr 20 16:23:16.781886 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.781862 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 16:23:16.782401 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.782008 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 
16:23:16.782401 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.782035 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 16:23:16.786004 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:16.785988 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-200.ec2.internal\" not found" Apr 20 16:23:16.865648 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.865580 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 16:18:15 +0000 UTC" deadline="2027-10-31 08:24:43.796126734 +0000 UTC" Apr 20 16:23:16.865648 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.865617 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13408h1m26.930514792s" Apr 20 16:23:16.867744 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.867733 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 16:23:16.879785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.879747 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 16:23:16.886752 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.886733 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 16:23:16.907778 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.907761 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-mnjjp" Apr 20 16:23:16.915765 
ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.915748 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-mnjjp" Apr 20 16:23:16.970556 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.970527 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.982240 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.982219 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 16:23:16.983950 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.983938 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-200.ec2.internal" Apr 20 16:23:16.991768 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:16.991742 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 16:23:17.003517 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:17.003481 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd0fda5d43e36cf7513644c2d57d50e3.slice/crio-7ee40a09f72cc3951c3bbf7421221434cca5447069cfabfa7281082b641d6ab4 WatchSource:0}: Error finding container 7ee40a09f72cc3951c3bbf7421221434cca5447069cfabfa7281082b641d6ab4: Status 404 returned error can't find the container with id 7ee40a09f72cc3951c3bbf7421221434cca5447069cfabfa7281082b641d6ab4 Apr 20 16:23:17.003859 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:17.003835 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f85b7ca5dcd9ead553393b46e7fd00c.slice/crio-928510f916ab7010defeaf858dae410221ff9a902c510256eba4af5088892c0a WatchSource:0}: Error finding container 928510f916ab7010defeaf858dae410221ff9a902c510256eba4af5088892c0a: Status 404 returned error can't find the container with id 928510f916ab7010defeaf858dae410221ff9a902c510256eba4af5088892c0a Apr 20 16:23:17.008446 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.008433 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 16:23:17.174601 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.174534 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 16:23:17.656103 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.656025 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 16:23:17.847250 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.847219 2577 apiserver.go:52] "Watching apiserver" Apr 20 16:23:17.853124 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.853102 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 16:23:17.853519 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.853497 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-node-tuning-operator/tuned-z6dg9","openshift-dns/node-resolver-pzfs2","openshift-multus/multus-7rgb7","openshift-ovn-kubernetes/ovnkube-node-n8s5n","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542","openshift-image-registry/node-ca-jw965","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal","openshift-multus/multus-additional-cni-plugins-fjxjp","openshift-multus/network-metrics-daemon-mpvsq","openshift-network-diagnostics/network-check-target-twdg6","openshift-network-operator/iptables-alerter-qh4fl","kube-system/global-pull-secret-syncer-9cnb7","kube-system/konnectivity-agent-62c6k","kube-system/kube-apiserver-proxy-ip-10-0-135-200.ec2.internal"] Apr 20 16:23:17.856489 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.855907 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-62c6k" Apr 20 16:23:17.858488 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.858466 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 16:23:17.858611 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.858539 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 16:23:17.858611 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.858580 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-n4wl7\"" Apr 20 16:23:17.858936 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.858898 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.859093 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.859074 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.860393 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.860377 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542" Apr 20 16:23:17.861413 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.861378 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-q9l2q\"" Apr 20 16:23:17.861528 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.861434 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 16:23:17.861697 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.861665 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jw965" Apr 20 16:23:17.862200 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.862177 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-9ln2x\"" Apr 20 16:23:17.862289 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.862233 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 16:23:17.862470 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.862450 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 16:23:17.862543 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.862500 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 16:23:17.862580 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.862545 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 16:23:17.862580 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.862505 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 16:23:17.862653 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.862548 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 16:23:17.862653 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.862451 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 16:23:17.862653 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.862554 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 16:23:17.862653 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.862546 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 16:23:17.868872 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.868851 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 16:23:17.869122 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.869104 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 16:23:17.869359 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.869341 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 16:23:17.869496 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.869473 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-dbrc9\"" Apr 20 16:23:17.869599 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.869579 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 16:23:17.870451 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.869803 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 16:23:17.870451 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.869829 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7czc8\"" Apr 20 16:23:17.870451 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.870015 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 16:23:17.871485 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.871434 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:23:17.871756 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.871506 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.872174 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:17.871813 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4" Apr 20 16:23:17.873263 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.873244 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pzfs2" Apr 20 16:23:17.874988 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.874969 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6" Apr 20 16:23:17.875088 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:17.875033 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5" Apr 20 16:23:17.875526 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.875508 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 16:23:17.875737 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.875616 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 16:23:17.875830 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.875814 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 16:23:17.875903 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.875887 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 16:23:17.876031 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.876015 2577 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vgjrd\"" Apr 20 16:23:17.876086 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.876067 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4bn2x\"" Apr 20 16:23:17.877329 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.877310 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qh4fl" Apr 20 16:23:17.878855 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.878838 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-socket-dir\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542" Apr 20 16:23:17.878960 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.878863 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-device-dir\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542" Apr 20 16:23:17.878960 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.878879 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/039d415e-4ed7-4e94-8a34-f5f605b30b1d-ovnkube-script-lib\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.878960 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.878903 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-run-netns\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.878960 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.878934 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-registration-dir\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542" Apr 20 16:23:17.878960 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.878954 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a8c37028-00dc-4ae4-9e33-7af134c543da-serviceca\") pod \"node-ca-jw965\" (UID: \"a8c37028-00dc-4ae4-9e33-7af134c543da\") " pod="openshift-image-registry/node-ca-jw965" Apr 20 16:23:17.879207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.878974 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-run-ovn\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.879207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.878988 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjs4c\" (UniqueName: \"kubernetes.io/projected/039d415e-4ed7-4e94-8a34-f5f605b30b1d-kube-api-access-rjs4c\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 
16:23:17.879207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879005 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-sysctl-d\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.879207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879027 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-run-systemd\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.879207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879049 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-etc-openvswitch\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.879207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879067 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-kubelet\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.879207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879100 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-cni-netd\") pod \"ovnkube-node-n8s5n\" (UID: 
\"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.879207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879124 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28dbq\" (UniqueName: \"kubernetes.io/projected/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-kube-api-access-28dbq\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.879207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879146 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-systemd\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.879207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879169 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twtf7\" (UniqueName: \"kubernetes.io/projected/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-kube-api-access-twtf7\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.879207 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879190 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-hostroot\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.879777 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879213 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-lib-modules\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.879777 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879253 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fc3d9e83-ce35-494d-b14c-0fe862e9fe53-konnectivity-ca\") pod \"konnectivity-agent-62c6k\" (UID: \"fc3d9e83-ce35-494d-b14c-0fe862e9fe53\") " pod="kube-system/konnectivity-agent-62c6k" Apr 20 16:23:17.879777 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879279 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-cnibin\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.879777 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879328 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-run\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.879777 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879363 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-var-lib-openvswitch\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.879777 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879408 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-log-socket\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.879777 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879431 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-multus-cni-dir\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.879777 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879479 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-multus-conf-dir\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.879777 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879507 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-modprobe-d\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.879777 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879532 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fc3d9e83-ce35-494d-b14c-0fe862e9fe53-agent-certs\") pod \"konnectivity-agent-62c6k\" (UID: \"fc3d9e83-ce35-494d-b14c-0fe862e9fe53\") " pod="kube-system/konnectivity-agent-62c6k" Apr 20 16:23:17.879777 ip-10-0-135-200 
kubenswrapper[2577]: I0420 16:23:17.879563 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.879777 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879590 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-multus-socket-dir-parent\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.879777 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879609 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j65k7\" (UniqueName: \"kubernetes.io/projected/6ef3ad4b-df20-4972-844a-22faf15284d6-kube-api-access-j65k7\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542" Apr 20 16:23:17.879777 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879638 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dldfl\" (UniqueName: \"kubernetes.io/projected/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-kube-api-access-dldfl\") pod \"network-metrics-daemon-mpvsq\" (UID: \"ec42a0e4-ff1e-48d5-8b45-fab851d223a4\") " pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:23:17.879777 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879651 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-sys\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.879777 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879665 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-run-openvswitch\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879726 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/039d415e-4ed7-4e94-8a34-f5f605b30b1d-ovnkube-config\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-kubernetes\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879820 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-slash\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879844 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-run-ovn-kubernetes\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879870 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-os-release\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879895 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-sys-fs\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879921 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-var-lib-kubelet\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879937 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.879977 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-run-netns\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880002 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-var-lib-cni-bin\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880053 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-etc-kubernetes\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880111 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880123 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-r7p46\"" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880110 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8c37028-00dc-4ae4-9e33-7af134c543da-host\") pod \"node-ca-jw965\" (UID: \"a8c37028-00dc-4ae4-9e33-7af134c543da\") " pod="openshift-image-registry/node-ca-jw965" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880244 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/039d415e-4ed7-4e94-8a34-f5f605b30b1d-env-overrides\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880269 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/039d415e-4ed7-4e94-8a34-f5f605b30b1d-ovn-node-metrics-cert\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880295 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-run-multus-certs\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880318 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-etc-selinux\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542" Apr 20 16:23:17.880514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880333 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 16:23:17.881521 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880352 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs\") pod \"network-metrics-daemon-mpvsq\" (UID: \"ec42a0e4-ff1e-48d5-8b45-fab851d223a4\") " pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:23:17.881521 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880392 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-systemd-units\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.881521 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880430 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-system-cni-dir\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.881521 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880453 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-run-k8s-cni-cncf-io\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.881521 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880475 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-multus-daemon-config\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.881521 
ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880496 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542" Apr 20 16:23:17.881521 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880519 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbgvz\" (UniqueName: \"kubernetes.io/projected/a8c37028-00dc-4ae4-9e33-7af134c543da-kube-api-access-cbgvz\") pod \"node-ca-jw965\" (UID: \"a8c37028-00dc-4ae4-9e33-7af134c543da\") " pod="openshift-image-registry/node-ca-jw965" Apr 20 16:23:17.881521 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880533 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-sysctl-conf\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.881521 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880551 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-host\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.881521 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880572 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-var-lib-cni-multus\") pod 
\"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.881521 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880593 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-tuned\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.881521 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880615 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-cni-bin\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.881521 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880644 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-cni-binary-copy\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.881521 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880660 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-sysconfig\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.881521 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880698 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-tmp\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.881521 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880721 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-node-log\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.881521 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.880755 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-var-lib-kubelet\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.882337 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.881366 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:17.882337 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:17.881414 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cnb7" podUID="99c0543c-05e3-470f-a780-aa8b7b3fca39" Apr 20 16:23:17.882337 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.881434 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:17.884316 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.884288 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 16:23:17.884476 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.884461 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 16:23:17.884521 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.884480 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vkvxh\"" Apr 20 16:23:17.916706 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.916660 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 16:18:16 +0000 UTC" deadline="2027-09-27 15:11:09.24258489 +0000 UTC" Apr 20 16:23:17.916706 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.916705 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12598h47m51.325883406s" Apr 20 16:23:17.950845 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.950822 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 16:23:17.971152 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.971126 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 16:23:17.981268 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981248 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-cnibin\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " 
pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.981363 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981276 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-run\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.981363 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981292 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-var-lib-openvswitch\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.981363 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981306 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-log-socket\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.981363 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981335 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7be8427f-1eea-4919-adcf-00cd843532e2-system-cni-dir\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:17.981576 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981363 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x99hz\" (UniqueName: \"kubernetes.io/projected/7be8427f-1eea-4919-adcf-00cd843532e2-kube-api-access-x99hz\") pod 
\"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:17.981576 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981386 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-run\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.981576 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981390 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-multus-cni-dir\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.981576 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981440 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-var-lib-openvswitch\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.981576 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981450 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-multus-cni-dir\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.981576 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-multus-conf-dir\") pod \"multus-7rgb7\" (UID: 
\"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.981576 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981472 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-multus-conf-dir\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.981576 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981477 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-log-socket\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.981576 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981390 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-cnibin\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.981576 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981496 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-modprobe-d\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.981576 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981522 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fc3d9e83-ce35-494d-b14c-0fe862e9fe53-agent-certs\") pod \"konnectivity-agent-62c6k\" (UID: \"fc3d9e83-ce35-494d-b14c-0fe862e9fe53\") " pod="kube-system/konnectivity-agent-62c6k" 
Apr 20 16:23:17.981576 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.981576 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981570 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-modprobe-d\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.981576 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981577 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-multus-socket-dir-parent\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.982173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981614 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.982173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981631 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-multus-socket-dir-parent\") pod 
\"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.982173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981750 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j65k7\" (UniqueName: \"kubernetes.io/projected/6ef3ad4b-df20-4972-844a-22faf15284d6-kube-api-access-j65k7\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542" Apr 20 16:23:17.982173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981785 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dldfl\" (UniqueName: \"kubernetes.io/projected/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-kube-api-access-dldfl\") pod \"network-metrics-daemon-mpvsq\" (UID: \"ec42a0e4-ff1e-48d5-8b45-fab851d223a4\") " pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:23:17.982173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981809 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-sys\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.982173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981835 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-run-openvswitch\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.982173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981860 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/039d415e-4ed7-4e94-8a34-f5f605b30b1d-ovnkube-config\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.982173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981888 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7be8427f-1eea-4919-adcf-00cd843532e2-os-release\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:17.982173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981914 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7be8427f-1eea-4919-adcf-00cd843532e2-cni-binary-copy\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:17.982173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981936 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-sys\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.982173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981943 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-kubernetes\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.982173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981990 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-kubernetes\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.982173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981997 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-slash\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.982173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981981 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 16:23:17.982173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.981996 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-run-openvswitch\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.982173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982028 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-run-ovn-kubernetes\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.982173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982057 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-os-release\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982060 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-slash\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982081 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-sys-fs\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542" Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982093 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-run-ovn-kubernetes\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982106 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-var-lib-kubelet\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-sys-fs\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542" Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982215 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-var-lib-kubelet\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982245 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-os-release\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982259 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-run-netns\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982289 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d0e43e8c-48cf-454e-aed4-cf091e904570-host-slash\") pod \"iptables-alerter-qh4fl\" (UID: \"d0e43e8c-48cf-454e-aed4-cf091e904570\") " pod="openshift-network-operator/iptables-alerter-qh4fl" Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982336 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-var-lib-cni-bin\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982364 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-var-lib-cni-bin\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982364 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-run-netns\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982361 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-etc-kubernetes\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982400 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-etc-kubernetes\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982412 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8c37028-00dc-4ae4-9e33-7af134c543da-host\") pod \"node-ca-jw965\" (UID: \"a8c37028-00dc-4ae4-9e33-7af134c543da\") " pod="openshift-image-registry/node-ca-jw965"
Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982437 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/039d415e-4ed7-4e94-8a34-f5f605b30b1d-env-overrides\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982462 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8c37028-00dc-4ae4-9e33-7af134c543da-host\") pod \"node-ca-jw965\" (UID: \"a8c37028-00dc-4ae4-9e33-7af134c543da\") " pod="openshift-image-registry/node-ca-jw965"
Apr 20 16:23:17.982962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982462 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/039d415e-4ed7-4e94-8a34-f5f605b30b1d-ovn-node-metrics-cert\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.983793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982489 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret\") pod \"global-pull-secret-syncer-9cnb7\" (UID: \"99c0543c-05e3-470f-a780-aa8b7b3fca39\") " pod="kube-system/global-pull-secret-syncer-9cnb7"
Apr 20 16:23:17.983793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982522 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljsfj\" (UniqueName: \"kubernetes.io/projected/e8733069-fadf-4af4-a36d-4e7f085cc317-kube-api-access-ljsfj\") pod \"node-resolver-pzfs2\" (UID: \"e8733069-fadf-4af4-a36d-4e7f085cc317\") " pod="openshift-dns/node-resolver-pzfs2"
Apr 20 16:23:17.983793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982528 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/039d415e-4ed7-4e94-8a34-f5f605b30b1d-ovnkube-config\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.983793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982553 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-run-multus-certs\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.983793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982574 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-etc-selinux\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542"
Apr 20 16:23:17.983793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982598 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs\") pod \"network-metrics-daemon-mpvsq\" (UID: \"ec42a0e4-ff1e-48d5-8b45-fab851d223a4\") " pod="openshift-multus/network-metrics-daemon-mpvsq"
Apr 20 16:23:17.983793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982607 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-run-multus-certs\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.983793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982615 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-systemd-units\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.983793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982640 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-systemd-units\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.983793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982700 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-etc-selinux\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542"
Apr 20 16:23:17.983793 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:17.982723 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 16:23:17.983793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982757 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-system-cni-dir\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.983793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982725 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-system-cni-dir\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.983793 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:17.982801 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs podName:ec42a0e4-ff1e-48d5-8b45-fab851d223a4 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:18.482771134 +0000 UTC m=+3.062341126 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs") pod "network-metrics-daemon-mpvsq" (UID: "ec42a0e4-ff1e-48d5-8b45-fab851d223a4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 16:23:17.983793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982822 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-run-k8s-cni-cncf-io\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.983793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982847 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-multus-daemon-config\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.983793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982865 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/039d415e-4ed7-4e94-8a34-f5f605b30b1d-env-overrides\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.984555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982872 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542"
Apr 20 16:23:17.984555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982897 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbgvz\" (UniqueName: \"kubernetes.io/projected/a8c37028-00dc-4ae4-9e33-7af134c543da-kube-api-access-cbgvz\") pod \"node-ca-jw965\" (UID: \"a8c37028-00dc-4ae4-9e33-7af134c543da\") " pod="openshift-image-registry/node-ca-jw965"
Apr 20 16:23:17.984555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982929 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-sysctl-conf\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.984555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982952 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-host\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.984555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982955 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542"
Apr 20 16:23:17.984555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982975 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-var-lib-cni-multus\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.984555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983001 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-tuned\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.984555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983024 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-cni-bin\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.984555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.982952 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-run-k8s-cni-cncf-io\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.984555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983046 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e8733069-fadf-4af4-a36d-4e7f085cc317-tmp-dir\") pod \"node-resolver-pzfs2\" (UID: \"e8733069-fadf-4af4-a36d-4e7f085cc317\") " pod="openshift-dns/node-resolver-pzfs2"
Apr 20 16:23:17.984555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983062 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7be8427f-1eea-4919-adcf-00cd843532e2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp"
Apr 20 16:23:17.984555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983081 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-cni-binary-copy\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.984555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983097 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-sysconfig\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.984555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983108 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-cni-bin\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.984555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983120 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-tmp\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.984555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983143 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-node-log\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.984555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983162 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-sysctl-conf\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.985544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983192 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-var-lib-cni-multus\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.985544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983197 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-host\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.985544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983196 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-sysconfig\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.985544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983225 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e8733069-fadf-4af4-a36d-4e7f085cc317-hosts-file\") pod \"node-resolver-pzfs2\" (UID: \"e8733069-fadf-4af4-a36d-4e7f085cc317\") " pod="openshift-dns/node-resolver-pzfs2"
Apr 20 16:23:17.985544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-node-log\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.985544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983252 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-var-lib-kubelet\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.985544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983276 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-socket-dir\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542"
Apr 20 16:23:17.985544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-device-dir\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542"
Apr 20 16:23:17.985544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983336 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-var-lib-kubelet\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.985544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983345 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/039d415e-4ed7-4e94-8a34-f5f605b30b1d-ovnkube-script-lib\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.985544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983374 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/99c0543c-05e3-470f-a780-aa8b7b3fca39-dbus\") pod \"global-pull-secret-syncer-9cnb7\" (UID: \"99c0543c-05e3-470f-a780-aa8b7b3fca39\") " pod="kube-system/global-pull-secret-syncer-9cnb7"
Apr 20 16:23:17.985544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983400 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d0e43e8c-48cf-454e-aed4-cf091e904570-iptables-alerter-script\") pod \"iptables-alerter-qh4fl\" (UID: \"d0e43e8c-48cf-454e-aed4-cf091e904570\") " pod="openshift-network-operator/iptables-alerter-qh4fl"
Apr 20 16:23:17.985544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983424 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7be8427f-1eea-4919-adcf-00cd843532e2-cnibin\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp"
Apr 20 16:23:17.985544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983450 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-run-netns\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.985544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983472 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-registration-dir\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542"
Apr 20 16:23:17.985544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983496 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a8c37028-00dc-4ae4-9e33-7af134c543da-serviceca\") pod \"node-ca-jw965\" (UID: \"a8c37028-00dc-4ae4-9e33-7af134c543da\") " pod="openshift-image-registry/node-ca-jw965"
Apr 20 16:23:17.985544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983500 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-socket-dir\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542"
Apr 20 16:23:17.986378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983507 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-multus-daemon-config\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.986378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983402 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-device-dir\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542"
Apr 20 16:23:17.986378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983519 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-run-ovn\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.986378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983551 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjs4c\" (UniqueName: \"kubernetes.io/projected/039d415e-4ed7-4e94-8a34-f5f605b30b1d-kube-api-access-rjs4c\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.986378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983645 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-host-run-netns\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.986378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983714 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-run-ovn\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.986378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983716 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-cni-binary-copy\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.986378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983750 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-sysctl-d\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.986378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983780 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-run-systemd\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.986378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983804 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-etc-openvswitch\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.986378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983834 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr44h\" (UniqueName: \"kubernetes.io/projected/d0e43e8c-48cf-454e-aed4-cf091e904570-kube-api-access-pr44h\") pod \"iptables-alerter-qh4fl\" (UID: \"d0e43e8c-48cf-454e-aed4-cf091e904570\") " pod="openshift-network-operator/iptables-alerter-qh4fl"
Apr 20 16:23:17.986378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983861 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-kubelet\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.986378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983885 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-cni-netd\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.986378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983912 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7be8427f-1eea-4919-adcf-00cd843532e2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp"
Apr 20 16:23:17.986378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983599 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6ef3ad4b-df20-4972-844a-22faf15284d6-registration-dir\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542"
Apr 20 16:23:17.986378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28dbq\" (UniqueName: \"kubernetes.io/projected/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-kube-api-access-28dbq\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.986378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983967 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-systemd\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.987087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.983996 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twtf7\" (UniqueName: \"kubernetes.io/projected/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-kube-api-access-twtf7\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.987087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.984021 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/99c0543c-05e3-470f-a780-aa8b7b3fca39-kubelet-config\") pod \"global-pull-secret-syncer-9cnb7\" (UID: \"99c0543c-05e3-470f-a780-aa8b7b3fca39\") " pod="kube-system/global-pull-secret-syncer-9cnb7"
Apr 20 16:23:17.987087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.984024 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-sysctl-d\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.987087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.984061 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn85t\" (UniqueName: \"kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t\") pod \"network-check-target-twdg6\" (UID: \"d9f54062-202f-4820-a4d0-ec110704a2f5\") " pod="openshift-network-diagnostics/network-check-target-twdg6"
Apr 20 16:23:17.987087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.984093 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7be8427f-1eea-4919-adcf-00cd843532e2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp"
Apr 20 16:23:17.987087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.984097 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-kubelet\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.987087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.984109 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a8c37028-00dc-4ae4-9e33-7af134c543da-serviceca\") pod \"node-ca-jw965\" (UID: \"a8c37028-00dc-4ae4-9e33-7af134c543da\") " pod="openshift-image-registry/node-ca-jw965"
Apr 20 16:23:17.987087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.984144 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-host-cni-netd\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.987087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.984122 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-hostroot\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.987087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.984185 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-hostroot\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") " pod="openshift-multus/multus-7rgb7"
Apr 20 16:23:17.987087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.984227 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-run-systemd\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.987087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.984238 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-systemd\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.987087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.984368 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-lib-modules\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.987087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.984530 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fc3d9e83-ce35-494d-b14c-0fe862e9fe53-konnectivity-ca\") pod \"konnectivity-agent-62c6k\" (UID: \"fc3d9e83-ce35-494d-b14c-0fe862e9fe53\") " pod="kube-system/konnectivity-agent-62c6k"
Apr 20 16:23:17.987087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.984389 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/039d415e-4ed7-4e94-8a34-f5f605b30b1d-etc-openvswitch\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.987087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.984477 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-lib-modules\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.987087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.985049 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fc3d9e83-ce35-494d-b14c-0fe862e9fe53-konnectivity-ca\") pod \"konnectivity-agent-62c6k\" (UID: \"fc3d9e83-ce35-494d-b14c-0fe862e9fe53\") " pod="kube-system/konnectivity-agent-62c6k"
Apr 20 16:23:17.987656 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.985094 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/039d415e-4ed7-4e94-8a34-f5f605b30b1d-ovnkube-script-lib\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.987656 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.985922 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-etc-tuned\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.987656 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.985970 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-tmp\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.987656 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.986434 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fc3d9e83-ce35-494d-b14c-0fe862e9fe53-agent-certs\") pod \"konnectivity-agent-62c6k\" (UID: \"fc3d9e83-ce35-494d-b14c-0fe862e9fe53\") " pod="kube-system/konnectivity-agent-62c6k"
Apr 20 16:23:17.987656 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.986714 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/039d415e-4ed7-4e94-8a34-f5f605b30b1d-ovn-node-metrics-cert\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:17.992945 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.992922 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dldfl\" (UniqueName: \"kubernetes.io/projected/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-kube-api-access-dldfl\") pod \"network-metrics-daemon-mpvsq\" (UID: \"ec42a0e4-ff1e-48d5-8b45-fab851d223a4\") " pod="openshift-multus/network-metrics-daemon-mpvsq"
Apr 20 16:23:17.996588 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.996564 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j65k7\" (UniqueName: \"kubernetes.io/projected/6ef3ad4b-df20-4972-844a-22faf15284d6-kube-api-access-j65k7\") pod \"aws-ebs-csi-driver-node-tn542\" (UID: \"6ef3ad4b-df20-4972-844a-22faf15284d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542"
Apr 20 16:23:17.996801 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.996763 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbgvz\" (UniqueName: \"kubernetes.io/projected/a8c37028-00dc-4ae4-9e33-7af134c543da-kube-api-access-cbgvz\") pod \"node-ca-jw965\" (UID: \"a8c37028-00dc-4ae4-9e33-7af134c543da\") " pod="openshift-image-registry/node-ca-jw965"
Apr 20 16:23:17.996886 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.996797 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twtf7\" (UniqueName: \"kubernetes.io/projected/16cbcca0-8f4d-4342-ab24-712cdd0c0b5e-kube-api-access-twtf7\") pod \"tuned-z6dg9\" (UID: \"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e\") " pod="openshift-cluster-node-tuning-operator/tuned-z6dg9"
Apr 20 16:23:17.997771 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.997751 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28dbq\" (UniqueName: \"kubernetes.io/projected/65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c-kube-api-access-28dbq\") pod \"multus-7rgb7\" (UID: \"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c\") "
pod="openshift-multus/multus-7rgb7" Apr 20 16:23:17.998428 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:17.998396 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjs4c\" (UniqueName: \"kubernetes.io/projected/039d415e-4ed7-4e94-8a34-f5f605b30b1d-kube-api-access-rjs4c\") pod \"ovnkube-node-n8s5n\" (UID: \"039d415e-4ed7-4e94-8a34-f5f605b30b1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:18.005982 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.005934 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-200.ec2.internal" event={"ID":"dd0fda5d43e36cf7513644c2d57d50e3","Type":"ContainerStarted","Data":"7ee40a09f72cc3951c3bbf7421221434cca5447069cfabfa7281082b641d6ab4"} Apr 20 16:23:18.006908 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.006888 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal" event={"ID":"9f85b7ca5dcd9ead553393b46e7fd00c","Type":"ContainerStarted","Data":"928510f916ab7010defeaf858dae410221ff9a902c510256eba4af5088892c0a"} Apr 20 16:23:18.085371 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085331 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d0e43e8c-48cf-454e-aed4-cf091e904570-host-slash\") pod \"iptables-alerter-qh4fl\" (UID: \"d0e43e8c-48cf-454e-aed4-cf091e904570\") " pod="openshift-network-operator/iptables-alerter-qh4fl" Apr 20 16:23:18.085371 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085374 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret\") pod \"global-pull-secret-syncer-9cnb7\" (UID: \"99c0543c-05e3-470f-a780-aa8b7b3fca39\") " 
pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:18.085591 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085396 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljsfj\" (UniqueName: \"kubernetes.io/projected/e8733069-fadf-4af4-a36d-4e7f085cc317-kube-api-access-ljsfj\") pod \"node-resolver-pzfs2\" (UID: \"e8733069-fadf-4af4-a36d-4e7f085cc317\") " pod="openshift-dns/node-resolver-pzfs2" Apr 20 16:23:18.085591 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e8733069-fadf-4af4-a36d-4e7f085cc317-tmp-dir\") pod \"node-resolver-pzfs2\" (UID: \"e8733069-fadf-4af4-a36d-4e7f085cc317\") " pod="openshift-dns/node-resolver-pzfs2" Apr 20 16:23:18.085591 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085459 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7be8427f-1eea-4919-adcf-00cd843532e2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:18.085591 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085466 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d0e43e8c-48cf-454e-aed4-cf091e904570-host-slash\") pod \"iptables-alerter-qh4fl\" (UID: \"d0e43e8c-48cf-454e-aed4-cf091e904570\") " pod="openshift-network-operator/iptables-alerter-qh4fl" Apr 20 16:23:18.085591 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085484 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e8733069-fadf-4af4-a36d-4e7f085cc317-hosts-file\") pod \"node-resolver-pzfs2\" (UID: 
\"e8733069-fadf-4af4-a36d-4e7f085cc317\") " pod="openshift-dns/node-resolver-pzfs2" Apr 20 16:23:18.085591 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085509 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/99c0543c-05e3-470f-a780-aa8b7b3fca39-dbus\") pod \"global-pull-secret-syncer-9cnb7\" (UID: \"99c0543c-05e3-470f-a780-aa8b7b3fca39\") " pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:18.085591 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085531 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d0e43e8c-48cf-454e-aed4-cf091e904570-iptables-alerter-script\") pod \"iptables-alerter-qh4fl\" (UID: \"d0e43e8c-48cf-454e-aed4-cf091e904570\") " pod="openshift-network-operator/iptables-alerter-qh4fl" Apr 20 16:23:18.085591 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085555 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7be8427f-1eea-4919-adcf-00cd843532e2-cnibin\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:18.085591 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:18.085578 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:18.085591 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pr44h\" (UniqueName: \"kubernetes.io/projected/d0e43e8c-48cf-454e-aed4-cf091e904570-kube-api-access-pr44h\") pod \"iptables-alerter-qh4fl\" (UID: \"d0e43e8c-48cf-454e-aed4-cf091e904570\") " pod="openshift-network-operator/iptables-alerter-qh4fl" Apr 20 16:23:18.086080 
ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085618 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7be8427f-1eea-4919-adcf-00cd843532e2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:18.086080 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:18.085642 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret podName:99c0543c-05e3-470f-a780-aa8b7b3fca39 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:18.585624985 +0000 UTC m=+3.165194992 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret") pod "global-pull-secret-syncer-9cnb7" (UID: "99c0543c-05e3-470f-a780-aa8b7b3fca39") : object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:18.086080 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085695 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/99c0543c-05e3-470f-a780-aa8b7b3fca39-kubelet-config\") pod \"global-pull-secret-syncer-9cnb7\" (UID: \"99c0543c-05e3-470f-a780-aa8b7b3fca39\") " pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:18.086080 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085725 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn85t\" (UniqueName: \"kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t\") pod \"network-check-target-twdg6\" (UID: \"d9f54062-202f-4820-a4d0-ec110704a2f5\") " pod="openshift-network-diagnostics/network-check-target-twdg6" Apr 20 16:23:18.086080 
ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085750 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7be8427f-1eea-4919-adcf-00cd843532e2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:18.086080 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085787 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7be8427f-1eea-4919-adcf-00cd843532e2-system-cni-dir\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:18.086080 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085807 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x99hz\" (UniqueName: \"kubernetes.io/projected/7be8427f-1eea-4919-adcf-00cd843532e2-kube-api-access-x99hz\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:18.086080 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085819 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7be8427f-1eea-4919-adcf-00cd843532e2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:18.086080 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085845 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7be8427f-1eea-4919-adcf-00cd843532e2-os-release\") pod 
\"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:18.086080 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.085867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7be8427f-1eea-4919-adcf-00cd843532e2-cni-binary-copy\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:18.086544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.086099 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e8733069-fadf-4af4-a36d-4e7f085cc317-tmp-dir\") pod \"node-resolver-pzfs2\" (UID: \"e8733069-fadf-4af4-a36d-4e7f085cc317\") " pod="openshift-dns/node-resolver-pzfs2" Apr 20 16:23:18.086544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.086381 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7be8427f-1eea-4919-adcf-00cd843532e2-cni-binary-copy\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:18.086636 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.086538 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7be8427f-1eea-4919-adcf-00cd843532e2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:18.086709 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.086647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/e8733069-fadf-4af4-a36d-4e7f085cc317-hosts-file\") pod \"node-resolver-pzfs2\" (UID: \"e8733069-fadf-4af4-a36d-4e7f085cc317\") " pod="openshift-dns/node-resolver-pzfs2" Apr 20 16:23:18.086767 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.086720 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/99c0543c-05e3-470f-a780-aa8b7b3fca39-kubelet-config\") pod \"global-pull-secret-syncer-9cnb7\" (UID: \"99c0543c-05e3-470f-a780-aa8b7b3fca39\") " pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:18.086865 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.086850 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/99c0543c-05e3-470f-a780-aa8b7b3fca39-dbus\") pod \"global-pull-secret-syncer-9cnb7\" (UID: \"99c0543c-05e3-470f-a780-aa8b7b3fca39\") " pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:18.087193 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.087168 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7be8427f-1eea-4919-adcf-00cd843532e2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:18.087316 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.087296 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d0e43e8c-48cf-454e-aed4-cf091e904570-iptables-alerter-script\") pod \"iptables-alerter-qh4fl\" (UID: \"d0e43e8c-48cf-454e-aed4-cf091e904570\") " pod="openshift-network-operator/iptables-alerter-qh4fl" Apr 20 16:23:18.087379 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.087360 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7be8427f-1eea-4919-adcf-00cd843532e2-cnibin\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:18.087433 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.087383 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7be8427f-1eea-4919-adcf-00cd843532e2-system-cni-dir\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:18.087578 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.087555 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7be8427f-1eea-4919-adcf-00cd843532e2-os-release\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:18.094901 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:18.094876 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 16:23:18.094901 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:18.094900 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 16:23:18.094901 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:18.094910 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dn85t for pod openshift-network-diagnostics/network-check-target-twdg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Apr 20 16:23:18.095108 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:18.094969 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t podName:d9f54062-202f-4820-a4d0-ec110704a2f5 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:18.5949555 +0000 UTC m=+3.174525509 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dn85t" (UniqueName: "kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t") pod "network-check-target-twdg6" (UID: "d9f54062-202f-4820-a4d0-ec110704a2f5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:18.098352 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.098328 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x99hz\" (UniqueName: \"kubernetes.io/projected/7be8427f-1eea-4919-adcf-00cd843532e2-kube-api-access-x99hz\") pod \"multus-additional-cni-plugins-fjxjp\" (UID: \"7be8427f-1eea-4919-adcf-00cd843532e2\") " pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:18.099245 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.099227 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr44h\" (UniqueName: \"kubernetes.io/projected/d0e43e8c-48cf-454e-aed4-cf091e904570-kube-api-access-pr44h\") pod \"iptables-alerter-qh4fl\" (UID: \"d0e43e8c-48cf-454e-aed4-cf091e904570\") " pod="openshift-network-operator/iptables-alerter-qh4fl" Apr 20 16:23:18.099346 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.099296 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljsfj\" (UniqueName: \"kubernetes.io/projected/e8733069-fadf-4af4-a36d-4e7f085cc317-kube-api-access-ljsfj\") pod \"node-resolver-pzfs2\" (UID: 
\"e8733069-fadf-4af4-a36d-4e7f085cc317\") " pod="openshift-dns/node-resolver-pzfs2" Apr 20 16:23:18.174484 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.174407 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-62c6k" Apr 20 16:23:18.184219 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.184194 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7rgb7" Apr 20 16:23:18.201015 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.200993 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:18.206720 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.206698 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542" Apr 20 16:23:18.213512 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.213490 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jw965" Apr 20 16:23:18.219026 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.219004 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" Apr 20 16:23:18.226534 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.226517 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pzfs2" Apr 20 16:23:18.233061 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.233043 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qh4fl" Apr 20 16:23:18.237619 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.237602 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fjxjp" Apr 20 16:23:18.488468 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.488395 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs\") pod \"network-metrics-daemon-mpvsq\" (UID: \"ec42a0e4-ff1e-48d5-8b45-fab851d223a4\") " pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:23:18.488636 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:18.488556 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:18.488636 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:18.488626 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs podName:ec42a0e4-ff1e-48d5-8b45-fab851d223a4 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:19.488608463 +0000 UTC m=+4.068178458 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs") pod "network-metrics-daemon-mpvsq" (UID: "ec42a0e4-ff1e-48d5-8b45-fab851d223a4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:18.589619 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.589585 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret\") pod \"global-pull-secret-syncer-9cnb7\" (UID: \"99c0543c-05e3-470f-a780-aa8b7b3fca39\") " pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:18.589806 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:18.589773 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:18.589880 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:18.589848 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret podName:99c0543c-05e3-470f-a780-aa8b7b3fca39 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:19.589825496 +0000 UTC m=+4.169395499 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret") pod "global-pull-secret-syncer-9cnb7" (UID: "99c0543c-05e3-470f-a780-aa8b7b3fca39") : object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:18.675690 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:18.675646 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0e43e8c_48cf_454e_aed4_cf091e904570.slice/crio-ad06329ec61abd6e83e2d7f952b119b3a7191844348b580d5c6105417e0a78b3 WatchSource:0}: Error finding container ad06329ec61abd6e83e2d7f952b119b3a7191844348b580d5c6105417e0a78b3: Status 404 returned error can't find the container with id ad06329ec61abd6e83e2d7f952b119b3a7191844348b580d5c6105417e0a78b3 Apr 20 16:23:18.685330 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:18.685289 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65d069e3_fe1b_4b3c_a5c6_f5a1c7d1a50c.slice/crio-8469b677a7361e7b27d855848909f4cc61c1cb3e6ae3dd8c4fdd057a3d22e617 WatchSource:0}: Error finding container 8469b677a7361e7b27d855848909f4cc61c1cb3e6ae3dd8c4fdd057a3d22e617: Status 404 returned error can't find the container with id 8469b677a7361e7b27d855848909f4cc61c1cb3e6ae3dd8c4fdd057a3d22e617 Apr 20 16:23:18.686881 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:18.686848 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7be8427f_1eea_4919_adcf_00cd843532e2.slice/crio-bf2c5c8c530231817cb85e0465bf939a20fb3b1d9b3b36703c5fd27f181c8587 WatchSource:0}: Error finding container bf2c5c8c530231817cb85e0465bf939a20fb3b1d9b3b36703c5fd27f181c8587: Status 404 returned error can't find the container with id bf2c5c8c530231817cb85e0465bf939a20fb3b1d9b3b36703c5fd27f181c8587 Apr 20 
16:23:18.687438 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:18.687416 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8c37028_00dc_4ae4_9e33_7af134c543da.slice/crio-f8d75152257603b3bd9d45510cc1de2e3f5081365a85e870df900c488d3ae7b5 WatchSource:0}: Error finding container f8d75152257603b3bd9d45510cc1de2e3f5081365a85e870df900c488d3ae7b5: Status 404 returned error can't find the container with id f8d75152257603b3bd9d45510cc1de2e3f5081365a85e870df900c488d3ae7b5 Apr 20 16:23:18.688147 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:18.688039 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ef3ad4b_df20_4972_844a_22faf15284d6.slice/crio-e62ab554eb3c9093e6e1f6563f0fa9531661eaeeb0bf9f293ff8e1937815af4f WatchSource:0}: Error finding container e62ab554eb3c9093e6e1f6563f0fa9531661eaeeb0bf9f293ff8e1937815af4f: Status 404 returned error can't find the container with id e62ab554eb3c9093e6e1f6563f0fa9531661eaeeb0bf9f293ff8e1937815af4f Apr 20 16:23:18.688819 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:18.688768 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16cbcca0_8f4d_4342_ab24_712cdd0c0b5e.slice/crio-d279ac833387725ae9020b2adce0f14df43d143316810e74de880f2145aa25b4 WatchSource:0}: Error finding container d279ac833387725ae9020b2adce0f14df43d143316810e74de880f2145aa25b4: Status 404 returned error can't find the container with id d279ac833387725ae9020b2adce0f14df43d143316810e74de880f2145aa25b4 Apr 20 16:23:18.690039 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.689975 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn85t\" (UniqueName: \"kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t\") pod \"network-check-target-twdg6\" (UID: 
\"d9f54062-202f-4820-a4d0-ec110704a2f5\") " pod="openshift-network-diagnostics/network-check-target-twdg6" Apr 20 16:23:18.690128 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:18.690109 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 16:23:18.690186 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:18.690135 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 16:23:18.690186 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:18.690149 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dn85t for pod openshift-network-diagnostics/network-check-target-twdg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:18.690458 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:18.690207 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t podName:d9f54062-202f-4820-a4d0-ec110704a2f5 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:19.69018753 +0000 UTC m=+4.269757541 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dn85t" (UniqueName: "kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t") pod "network-check-target-twdg6" (UID: "d9f54062-202f-4820-a4d0-ec110704a2f5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:18.691317 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:18.691245 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc3d9e83_ce35_494d_b14c_0fe862e9fe53.slice/crio-a443b42064eaf0450bfa267b1bac64f962b6e8af901cffd8db7168d8fb67ebf1 WatchSource:0}: Error finding container a443b42064eaf0450bfa267b1bac64f962b6e8af901cffd8db7168d8fb67ebf1: Status 404 returned error can't find the container with id a443b42064eaf0450bfa267b1bac64f962b6e8af901cffd8db7168d8fb67ebf1 Apr 20 16:23:18.694746 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:23:18.694718 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod039d415e_4ed7_4e94_8a34_f5f605b30b1d.slice/crio-b33c4aac7908aeeca0b6c7a4133eb42ab58fe1150a717bccfc6f5a676a369911 WatchSource:0}: Error finding container b33c4aac7908aeeca0b6c7a4133eb42ab58fe1150a717bccfc6f5a676a369911: Status 404 returned error can't find the container with id b33c4aac7908aeeca0b6c7a4133eb42ab58fe1150a717bccfc6f5a676a369911 Apr 20 16:23:18.917080 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.917030 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 16:18:16 +0000 UTC" deadline="2027-10-31 03:45:46.868421302 +0000 UTC" Apr 20 16:23:18.917080 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:18.917066 2577 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="13403h22m27.951359704s" Apr 20 16:23:19.003534 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:19.003463 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6" Apr 20 16:23:19.003714 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:19.003568 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5" Apr 20 16:23:19.011301 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:19.011270 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" event={"ID":"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e","Type":"ContainerStarted","Data":"d279ac833387725ae9020b2adce0f14df43d143316810e74de880f2145aa25b4"} Apr 20 16:23:19.012186 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:19.012167 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542" event={"ID":"6ef3ad4b-df20-4972-844a-22faf15284d6","Type":"ContainerStarted","Data":"e62ab554eb3c9093e6e1f6563f0fa9531661eaeeb0bf9f293ff8e1937815af4f"} Apr 20 16:23:19.013090 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:19.013072 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jw965" event={"ID":"a8c37028-00dc-4ae4-9e33-7af134c543da","Type":"ContainerStarted","Data":"f8d75152257603b3bd9d45510cc1de2e3f5081365a85e870df900c488d3ae7b5"} Apr 20 16:23:19.013976 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:19.013957 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7rgb7" 
event={"ID":"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c","Type":"ContainerStarted","Data":"8469b677a7361e7b27d855848909f4cc61c1cb3e6ae3dd8c4fdd057a3d22e617"} Apr 20 16:23:19.014869 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:19.014832 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qh4fl" event={"ID":"d0e43e8c-48cf-454e-aed4-cf091e904570","Type":"ContainerStarted","Data":"ad06329ec61abd6e83e2d7f952b119b3a7191844348b580d5c6105417e0a78b3"} Apr 20 16:23:19.016542 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:19.016518 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-200.ec2.internal" event={"ID":"dd0fda5d43e36cf7513644c2d57d50e3","Type":"ContainerStarted","Data":"ce2931b0fd86b593d3aaf87e31ffd125e1b5fd58dd1220b541e2fcd1c327d906"} Apr 20 16:23:19.018226 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:19.018195 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" event={"ID":"039d415e-4ed7-4e94-8a34-f5f605b30b1d","Type":"ContainerStarted","Data":"b33c4aac7908aeeca0b6c7a4133eb42ab58fe1150a717bccfc6f5a676a369911"} Apr 20 16:23:19.019288 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:19.019268 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fjxjp" event={"ID":"7be8427f-1eea-4919-adcf-00cd843532e2","Type":"ContainerStarted","Data":"bf2c5c8c530231817cb85e0465bf939a20fb3b1d9b3b36703c5fd27f181c8587"} Apr 20 16:23:19.020425 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:19.020407 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pzfs2" event={"ID":"e8733069-fadf-4af4-a36d-4e7f085cc317","Type":"ContainerStarted","Data":"3772be28c932e7a905453c6b7ff59fe36ded3281ffef31621a6c8c69e02dbe7c"} Apr 20 16:23:19.023083 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:19.022928 2577 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kube-system/konnectivity-agent-62c6k" event={"ID":"fc3d9e83-ce35-494d-b14c-0fe862e9fe53","Type":"ContainerStarted","Data":"a443b42064eaf0450bfa267b1bac64f962b6e8af901cffd8db7168d8fb67ebf1"} Apr 20 16:23:19.499641 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:19.499609 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs\") pod \"network-metrics-daemon-mpvsq\" (UID: \"ec42a0e4-ff1e-48d5-8b45-fab851d223a4\") " pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:23:19.499814 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:19.499796 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:19.499875 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:19.499861 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs podName:ec42a0e4-ff1e-48d5-8b45-fab851d223a4 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:21.499842887 +0000 UTC m=+6.079412882 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs") pod "network-metrics-daemon-mpvsq" (UID: "ec42a0e4-ff1e-48d5-8b45-fab851d223a4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:19.600068 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:19.600035 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret\") pod \"global-pull-secret-syncer-9cnb7\" (UID: \"99c0543c-05e3-470f-a780-aa8b7b3fca39\") " pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:19.600280 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:19.600258 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:19.600347 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:19.600324 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret podName:99c0543c-05e3-470f-a780-aa8b7b3fca39 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:21.600307153 +0000 UTC m=+6.179877150 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret") pod "global-pull-secret-syncer-9cnb7" (UID: "99c0543c-05e3-470f-a780-aa8b7b3fca39") : object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:19.701261 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:19.700494 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn85t\" (UniqueName: \"kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t\") pod \"network-check-target-twdg6\" (UID: \"d9f54062-202f-4820-a4d0-ec110704a2f5\") " pod="openshift-network-diagnostics/network-check-target-twdg6" Apr 20 16:23:19.701261 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:19.700749 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 16:23:19.701261 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:19.700771 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 16:23:19.701261 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:19.700785 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dn85t for pod openshift-network-diagnostics/network-check-target-twdg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:19.701261 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:19.700848 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t podName:d9f54062-202f-4820-a4d0-ec110704a2f5 nodeName:}" failed. 
No retries permitted until 2026-04-20 16:23:21.700829916 +0000 UTC m=+6.280399925 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dn85t" (UniqueName: "kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t") pod "network-check-target-twdg6" (UID: "d9f54062-202f-4820-a4d0-ec110704a2f5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:20.006116 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:20.006030 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:23:20.006628 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:20.006165 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4" Apr 20 16:23:20.006628 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:20.006537 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:20.006773 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:20.006622 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9cnb7" podUID="99c0543c-05e3-470f-a780-aa8b7b3fca39" Apr 20 16:23:20.046012 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:20.045979 2577 generic.go:358] "Generic (PLEG): container finished" podID="9f85b7ca5dcd9ead553393b46e7fd00c" containerID="666456be98fc9133db59cae6a85baaa9cc2f6038964e8a2be88eea036a0b0ec9" exitCode=0 Apr 20 16:23:20.046531 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:20.046511 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal" event={"ID":"9f85b7ca5dcd9ead553393b46e7fd00c","Type":"ContainerDied","Data":"666456be98fc9133db59cae6a85baaa9cc2f6038964e8a2be88eea036a0b0ec9"} Apr 20 16:23:20.059891 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:20.059833 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-200.ec2.internal" podStartSLOduration=4.059815675 podStartE2EDuration="4.059815675s" podCreationTimestamp="2026-04-20 16:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:23:19.027792125 +0000 UTC m=+3.607362162" watchObservedRunningTime="2026-04-20 16:23:20.059815675 +0000 UTC m=+4.639385694" Apr 20 16:23:21.003765 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:21.003732 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6" Apr 20 16:23:21.003985 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:21.003862 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5" Apr 20 16:23:21.055255 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:21.054346 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal" event={"ID":"9f85b7ca5dcd9ead553393b46e7fd00c","Type":"ContainerStarted","Data":"40fdc4bee7937253902765368e905559ce244e1c8a811ca7a80b21fbc34183fe"} Apr 20 16:23:21.516436 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:21.516402 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs\") pod \"network-metrics-daemon-mpvsq\" (UID: \"ec42a0e4-ff1e-48d5-8b45-fab851d223a4\") " pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:23:21.516620 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:21.516555 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:21.516620 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:21.516617 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs podName:ec42a0e4-ff1e-48d5-8b45-fab851d223a4 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:25.516599047 +0000 UTC m=+10.096169044 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs") pod "network-metrics-daemon-mpvsq" (UID: "ec42a0e4-ff1e-48d5-8b45-fab851d223a4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:21.617182 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:21.617147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret\") pod \"global-pull-secret-syncer-9cnb7\" (UID: \"99c0543c-05e3-470f-a780-aa8b7b3fca39\") " pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:21.617357 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:21.617336 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:21.617410 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:21.617394 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret podName:99c0543c-05e3-470f-a780-aa8b7b3fca39 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:25.617376667 +0000 UTC m=+10.196946663 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret") pod "global-pull-secret-syncer-9cnb7" (UID: "99c0543c-05e3-470f-a780-aa8b7b3fca39") : object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:21.717872 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:21.717837 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn85t\" (UniqueName: \"kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t\") pod \"network-check-target-twdg6\" (UID: \"d9f54062-202f-4820-a4d0-ec110704a2f5\") " pod="openshift-network-diagnostics/network-check-target-twdg6" Apr 20 16:23:21.718036 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:21.718022 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 16:23:21.718098 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:21.718039 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 16:23:21.718098 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:21.718049 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dn85t for pod openshift-network-diagnostics/network-check-target-twdg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:21.718098 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:21.718089 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t podName:d9f54062-202f-4820-a4d0-ec110704a2f5 nodeName:}" failed. 
No retries permitted until 2026-04-20 16:23:25.718076997 +0000 UTC m=+10.297646993 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dn85t" (UniqueName: "kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t") pod "network-check-target-twdg6" (UID: "d9f54062-202f-4820-a4d0-ec110704a2f5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:22.004079 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:22.004002 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:22.004202 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:22.004130 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cnb7" podUID="99c0543c-05e3-470f-a780-aa8b7b3fca39" Apr 20 16:23:22.004553 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:22.004530 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:23:22.004660 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:22.004637 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4" Apr 20 16:23:23.003044 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:23.003008 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6" Apr 20 16:23:23.003477 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:23.003134 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5" Apr 20 16:23:24.003458 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:24.003424 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:23:24.003923 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:24.003439 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:24.003923 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:24.003591 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4" Apr 20 16:23:24.003923 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:24.003719 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cnb7" podUID="99c0543c-05e3-470f-a780-aa8b7b3fca39" Apr 20 16:23:25.003476 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:25.003361 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6" Apr 20 16:23:25.004056 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:25.003496 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5" Apr 20 16:23:25.546902 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:25.546832 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs\") pod \"network-metrics-daemon-mpvsq\" (UID: \"ec42a0e4-ff1e-48d5-8b45-fab851d223a4\") " pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:23:25.547086 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:25.546986 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:25.547086 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:25.547054 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs podName:ec42a0e4-ff1e-48d5-8b45-fab851d223a4 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:33.547038709 +0000 UTC m=+18.126608704 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs") pod "network-metrics-daemon-mpvsq" (UID: "ec42a0e4-ff1e-48d5-8b45-fab851d223a4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:25.647927 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:25.647891 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret\") pod \"global-pull-secret-syncer-9cnb7\" (UID: \"99c0543c-05e3-470f-a780-aa8b7b3fca39\") " pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:25.648097 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:25.648081 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:25.648169 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:25.648141 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret podName:99c0543c-05e3-470f-a780-aa8b7b3fca39 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:33.648123528 +0000 UTC m=+18.227693533 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret") pod "global-pull-secret-syncer-9cnb7" (UID: "99c0543c-05e3-470f-a780-aa8b7b3fca39") : object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:25.749855 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:25.749171 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn85t\" (UniqueName: \"kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t\") pod \"network-check-target-twdg6\" (UID: \"d9f54062-202f-4820-a4d0-ec110704a2f5\") " pod="openshift-network-diagnostics/network-check-target-twdg6" Apr 20 16:23:25.749855 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:25.749373 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 16:23:25.749855 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:25.749396 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 16:23:25.749855 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:25.749411 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dn85t for pod openshift-network-diagnostics/network-check-target-twdg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:25.749855 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:25.749473 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t podName:d9f54062-202f-4820-a4d0-ec110704a2f5 nodeName:}" failed. 
No retries permitted until 2026-04-20 16:23:33.749454734 +0000 UTC m=+18.329024738 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-dn85t" (UniqueName: "kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t") pod "network-check-target-twdg6" (UID: "d9f54062-202f-4820-a4d0-ec110704a2f5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:26.006202 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:26.005672 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:26.006202 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:26.005805 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cnb7" podUID="99c0543c-05e3-470f-a780-aa8b7b3fca39" Apr 20 16:23:26.006202 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:26.006015 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:23:26.006202 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:26.006150 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4" Apr 20 16:23:27.003010 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:27.002972 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6" Apr 20 16:23:27.003184 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:27.003113 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5" Apr 20 16:23:28.006570 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:28.006538 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:23:28.006951 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:28.006540 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:28.006951 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:28.006703 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4"
Apr 20 16:23:28.006951 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:28.006765 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cnb7" podUID="99c0543c-05e3-470f-a780-aa8b7b3fca39"
Apr 20 16:23:29.003293 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:29.003256 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6"
Apr 20 16:23:29.003452 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:29.003371 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5"
Apr 20 16:23:30.006085 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:30.006043 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7"
Apr 20 16:23:30.006486 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:30.006058 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq"
Apr 20 16:23:30.006486 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:30.006160 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cnb7" podUID="99c0543c-05e3-470f-a780-aa8b7b3fca39"
Apr 20 16:23:30.006486 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:30.006274 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4"
Apr 20 16:23:31.003198 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:31.003163 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6"
Apr 20 16:23:31.003362 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:31.003278 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5"
Apr 20 16:23:32.003181 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:32.003150 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7"
Apr 20 16:23:32.003181 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:32.003162 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq"
Apr 20 16:23:32.003669 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:32.003269 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cnb7" podUID="99c0543c-05e3-470f-a780-aa8b7b3fca39"
Apr 20 16:23:32.003669 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:32.003401 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4"
Apr 20 16:23:33.003943 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:33.003913 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6"
Apr 20 16:23:33.004419 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:33.004012 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5"
Apr 20 16:23:33.606739 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:33.606703 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs\") pod \"network-metrics-daemon-mpvsq\" (UID: \"ec42a0e4-ff1e-48d5-8b45-fab851d223a4\") " pod="openshift-multus/network-metrics-daemon-mpvsq"
Apr 20 16:23:33.606923 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:33.606863 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 16:23:33.606991 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:33.606949 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs podName:ec42a0e4-ff1e-48d5-8b45-fab851d223a4 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:49.606927102 +0000 UTC m=+34.186497095 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs") pod "network-metrics-daemon-mpvsq" (UID: "ec42a0e4-ff1e-48d5-8b45-fab851d223a4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 16:23:33.708024 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:33.707991 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret\") pod \"global-pull-secret-syncer-9cnb7\" (UID: \"99c0543c-05e3-470f-a780-aa8b7b3fca39\") " pod="kube-system/global-pull-secret-syncer-9cnb7"
Apr 20 16:23:33.708207 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:33.708138 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 16:23:33.708207 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:33.708201 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret podName:99c0543c-05e3-470f-a780-aa8b7b3fca39 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:49.708187926 +0000 UTC m=+34.287757923 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret") pod "global-pull-secret-syncer-9cnb7" (UID: "99c0543c-05e3-470f-a780-aa8b7b3fca39") : object "kube-system"/"original-pull-secret" not registered
Apr 20 16:23:33.809074 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:33.809037 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn85t\" (UniqueName: \"kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t\") pod \"network-check-target-twdg6\" (UID: \"d9f54062-202f-4820-a4d0-ec110704a2f5\") " pod="openshift-network-diagnostics/network-check-target-twdg6"
Apr 20 16:23:33.809251 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:33.809176 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 16:23:33.809251 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:33.809201 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 16:23:33.809251 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:33.809214 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dn85t for pod openshift-network-diagnostics/network-check-target-twdg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 16:23:33.809386 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:33.809280 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t podName:d9f54062-202f-4820-a4d0-ec110704a2f5 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:49.809259223 +0000 UTC m=+34.388829230 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dn85t" (UniqueName: "kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t") pod "network-check-target-twdg6" (UID: "d9f54062-202f-4820-a4d0-ec110704a2f5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 16:23:34.003998 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:34.003913 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7"
Apr 20 16:23:34.004440 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:34.003915 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq"
Apr 20 16:23:34.004440 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:34.004013 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cnb7" podUID="99c0543c-05e3-470f-a780-aa8b7b3fca39"
Apr 20 16:23:34.004440 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:34.004108 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4"
Apr 20 16:23:35.003349 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:35.003312 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6"
Apr 20 16:23:35.003640 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:35.003428 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5"
Apr 20 16:23:36.004321 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:36.004296 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq"
Apr 20 16:23:36.004745 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:36.004452 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4"
Apr 20 16:23:36.004745 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:36.004584 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7"
Apr 20 16:23:36.004745 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:36.004705 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cnb7" podUID="99c0543c-05e3-470f-a780-aa8b7b3fca39"
Apr 20 16:23:37.003828 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.003544 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6"
Apr 20 16:23:37.003937 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:37.003905 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5"
Apr 20 16:23:37.086538 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.086495 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" event={"ID":"16cbcca0-8f4d-4342-ab24-712cdd0c0b5e","Type":"ContainerStarted","Data":"ed5ff3f846cea03ebfec204cff081156127126d2517f318e29f72b8aa32d6591"}
Apr 20 16:23:37.087825 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.087801 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542" event={"ID":"6ef3ad4b-df20-4972-844a-22faf15284d6","Type":"ContainerStarted","Data":"30da44e8e81f9570d8f90774bee9367fe1a5be28c021fed1c9faf5ba6685bfbb"}
Apr 20 16:23:37.088852 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.088821 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jw965" event={"ID":"a8c37028-00dc-4ae4-9e33-7af134c543da","Type":"ContainerStarted","Data":"70857934ce07612c97cc778a1479fc45b454356cc4baac08e72464b8743baee1"}
Apr 20 16:23:37.089970 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.089947 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7rgb7" event={"ID":"65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c","Type":"ContainerStarted","Data":"3cd7eae8788f9d2977bdb508529fdd65eaa2123c577606532e79ef4113184b79"}
Apr 20 16:23:37.092176 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.092161 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/ovn-acl-logging/0.log"
Apr 20 16:23:37.092460 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.092442 2577 generic.go:358] "Generic (PLEG): container finished" podID="039d415e-4ed7-4e94-8a34-f5f605b30b1d" containerID="8f38fa39e431ef302729ab9b2ebba2a8ada59c945ced964a3332a9ba38d046f5" exitCode=1
Apr 20 16:23:37.092514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.092496 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" event={"ID":"039d415e-4ed7-4e94-8a34-f5f605b30b1d","Type":"ContainerStarted","Data":"d72509ad87826c7c6dcfab76cf093b0219b8354801cb1c201fe626e226b9718d"}
Apr 20 16:23:37.092514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.092511 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" event={"ID":"039d415e-4ed7-4e94-8a34-f5f605b30b1d","Type":"ContainerStarted","Data":"68dc24d5b5731c39117fe445516b8d3ab901b45c6988389133e4ccc9b9cb859d"}
Apr 20 16:23:37.092619 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.092520 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" event={"ID":"039d415e-4ed7-4e94-8a34-f5f605b30b1d","Type":"ContainerStarted","Data":"cac9a2bc0797627e15fcdfd9adb7376a467df050fd0cd91e67dff7dce71d76c6"}
Apr 20 16:23:37.092619 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.092527 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" event={"ID":"039d415e-4ed7-4e94-8a34-f5f605b30b1d","Type":"ContainerStarted","Data":"7aa3b17e1e97ad242af47a0ff898f0da4cad4f082545755fbcf4240f672fb31f"}
Apr 20 16:23:37.092619 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.092535 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" event={"ID":"039d415e-4ed7-4e94-8a34-f5f605b30b1d","Type":"ContainerDied","Data":"8f38fa39e431ef302729ab9b2ebba2a8ada59c945ced964a3332a9ba38d046f5"}
Apr 20 16:23:37.092619 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.092549 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" event={"ID":"039d415e-4ed7-4e94-8a34-f5f605b30b1d","Type":"ContainerStarted","Data":"f846a44d5c72c8b0d3124e47548ac6de9577a24f8c2bb2b09d517ece2daeaa8d"}
Apr 20 16:23:37.093731 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.093712 2577 generic.go:358] "Generic (PLEG): container finished" podID="7be8427f-1eea-4919-adcf-00cd843532e2" containerID="ffb400651eb677763e755d66c7e8c369d11f49210f2ed41e2a3cf4fc94ced292" exitCode=0
Apr 20 16:23:37.093789 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.093772 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fjxjp" event={"ID":"7be8427f-1eea-4919-adcf-00cd843532e2","Type":"ContainerDied","Data":"ffb400651eb677763e755d66c7e8c369d11f49210f2ed41e2a3cf4fc94ced292"}
Apr 20 16:23:37.094872 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.094853 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pzfs2" event={"ID":"e8733069-fadf-4af4-a36d-4e7f085cc317","Type":"ContainerStarted","Data":"d9cdbc2e5be8eaa2bd98d9a5bd80208c85c99dad75c22589631a67b42c623e38"}
Apr 20 16:23:37.096066 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.096045 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-62c6k" event={"ID":"fc3d9e83-ce35-494d-b14c-0fe862e9fe53","Type":"ContainerStarted","Data":"efc310e4916c4ebac7bcb794e641356d1b05ec9b0772b4e38518d23794348800"}
Apr 20 16:23:37.104728 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.104672 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-z6dg9" podStartSLOduration=3.844443788 podStartE2EDuration="21.104654846s" podCreationTimestamp="2026-04-20 16:23:16 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.691039823 +0000 UTC m=+3.270609830" lastFinishedPulling="2026-04-20 16:23:35.951250891 +0000 UTC m=+20.530820888" observedRunningTime="2026-04-20 16:23:37.104439768 +0000 UTC m=+21.684009780" watchObservedRunningTime="2026-04-20 16:23:37.104654846 +0000 UTC m=+21.684224861"
Apr 20 16:23:37.105063 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.105042 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-200.ec2.internal" podStartSLOduration=21.105036592 podStartE2EDuration="21.105036592s" podCreationTimestamp="2026-04-20 16:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:23:21.069652809 +0000 UTC m=+5.649222852" watchObservedRunningTime="2026-04-20 16:23:37.105036592 +0000 UTC m=+21.684606607"
Apr 20 16:23:37.120815 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.120773 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7rgb7" podStartSLOduration=3.847308898 podStartE2EDuration="21.12076297s" podCreationTimestamp="2026-04-20 16:23:16 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.687032966 +0000 UTC m=+3.266602959" lastFinishedPulling="2026-04-20 16:23:35.960487035 +0000 UTC m=+20.540057031" observedRunningTime="2026-04-20 16:23:37.120506061 +0000 UTC m=+21.700076076" watchObservedRunningTime="2026-04-20 16:23:37.12076297 +0000 UTC m=+21.700332984"
Apr 20 16:23:37.134000 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.133957 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pzfs2" podStartSLOduration=3.903596349 podStartE2EDuration="21.133946817s" podCreationTimestamp="2026-04-20 16:23:16 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.692360747 +0000 UTC m=+3.271930740" lastFinishedPulling="2026-04-20 16:23:35.922711214 +0000 UTC m=+20.502281208" observedRunningTime="2026-04-20 16:23:37.133510477 +0000 UTC m=+21.713080482" watchObservedRunningTime="2026-04-20 16:23:37.133946817 +0000 UTC m=+21.713516831"
Apr 20 16:23:37.171174 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.171137 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-62c6k" podStartSLOduration=3.9161504689999997 podStartE2EDuration="21.171125418s" podCreationTimestamp="2026-04-20 16:23:16 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.696088 +0000 UTC m=+3.275658006" lastFinishedPulling="2026-04-20 16:23:35.951062956 +0000 UTC m=+20.530632955" observedRunningTime="2026-04-20 16:23:37.170655493 +0000 UTC m=+21.750225505" watchObservedRunningTime="2026-04-20 16:23:37.171125418 +0000 UTC m=+21.750695424"
Apr 20 16:23:37.184691 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.184640 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jw965" podStartSLOduration=11.949984122 podStartE2EDuration="21.184630967s" podCreationTimestamp="2026-04-20 16:23:16 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.689647595 +0000 UTC m=+3.269217591" lastFinishedPulling="2026-04-20 16:23:27.924294425 +0000 UTC m=+12.503864436" observedRunningTime="2026-04-20 16:23:37.184485415 +0000 UTC m=+21.764055440" watchObservedRunningTime="2026-04-20 16:23:37.184630967 +0000 UTC m=+21.764200981"
Apr 20 16:23:37.487102 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.487078 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 16:23:37.943083 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.942944 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T16:23:37.487095228Z","UUID":"d46a0bec-bde2-42e2-b812-dda73d848d75","Handler":null,"Name":"","Endpoint":""}
Apr 20 16:23:37.944821 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.944799 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 16:23:37.944942 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:37.944828 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 16:23:38.004855 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:38.004831 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq"
Apr 20 16:23:38.005043 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:38.004998 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4"
Apr 20 16:23:38.005304 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:38.005164 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7"
Apr 20 16:23:38.005304 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:38.005261 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cnb7" podUID="99c0543c-05e3-470f-a780-aa8b7b3fca39"
Apr 20 16:23:38.099802 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:38.099752 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542" event={"ID":"6ef3ad4b-df20-4972-844a-22faf15284d6","Type":"ContainerStarted","Data":"544a776d51b754728d914b0242d33e55b2269928f51805a9dfc31991c2a1eb8c"}
Apr 20 16:23:38.101261 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:38.101229 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qh4fl" event={"ID":"d0e43e8c-48cf-454e-aed4-cf091e904570","Type":"ContainerStarted","Data":"3ff79dcf0f38c80dc6ef73e01230e87b3deb3039ec235c6a3417935ed04a1e57"}
Apr 20 16:23:38.116546 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:38.116493 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qh4fl" podStartSLOduration=4.84458064 podStartE2EDuration="22.116477389s" podCreationTimestamp="2026-04-20 16:23:16 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.678872909 +0000 UTC m=+3.258442906" lastFinishedPulling="2026-04-20 16:23:35.950769658 +0000 UTC m=+20.530339655" observedRunningTime="2026-04-20 16:23:38.115982378 +0000 UTC m=+22.695552400" watchObservedRunningTime="2026-04-20 16:23:38.116477389 +0000 UTC m=+22.696047405"
Apr 20 16:23:39.004159 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:39.003990 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6"
Apr 20 16:23:39.004333 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:39.004224 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5"
Apr 20 16:23:39.108633 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:39.108563 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/ovn-acl-logging/0.log"
Apr 20 16:23:39.109048 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:39.108971 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" event={"ID":"039d415e-4ed7-4e94-8a34-f5f605b30b1d","Type":"ContainerStarted","Data":"2283e73a5df6815e820e4bd648783fc8fd0941f91a49ad06cff1f3a04651b8dc"}
Apr 20 16:23:39.110895 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:39.110867 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542" event={"ID":"6ef3ad4b-df20-4972-844a-22faf15284d6","Type":"ContainerStarted","Data":"1dd0fe98d2bcd715e5d054d69343d83db4d82d54400c1f10d897898eb76a430f"}
Apr 20 16:23:39.131230 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:39.131189 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tn542" podStartSLOduration=3.30633492 podStartE2EDuration="23.131177976s" podCreationTimestamp="2026-04-20 16:23:16 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.689863081 +0000 UTC m=+3.269433080" lastFinishedPulling="2026-04-20 16:23:38.514706141 +0000 UTC m=+23.094276136" observedRunningTime="2026-04-20 16:23:39.130604031 +0000 UTC m=+23.710174047" watchObservedRunningTime="2026-04-20 16:23:39.131177976 +0000 UTC m=+23.710747990"
Apr 20 16:23:40.004053 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:40.004021 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq"
Apr 20 16:23:40.004053 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:40.004045 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7"
Apr 20 16:23:40.004320 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:40.004146 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4"
Apr 20 16:23:40.004320 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:40.004289 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cnb7" podUID="99c0543c-05e3-470f-a780-aa8b7b3fca39"
Apr 20 16:23:41.003132 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:41.003096 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6"
Apr 20 16:23:41.003768 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:41.003201 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5"
Apr 20 16:23:41.825143 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:41.824965 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-62c6k"
Apr 20 16:23:41.825436 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:41.825419 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-62c6k"
Apr 20 16:23:42.003105 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:42.003073 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7"
Apr 20 16:23:42.003105 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:42.003094 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq"
Apr 20 16:23:42.003770 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:42.003186 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cnb7" podUID="99c0543c-05e3-470f-a780-aa8b7b3fca39"
Apr 20 16:23:42.003770 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:42.003324 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4"
Apr 20 16:23:42.119821 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:42.119747 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/ovn-acl-logging/0.log"
Apr 20 16:23:42.120113 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:42.120093 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" event={"ID":"039d415e-4ed7-4e94-8a34-f5f605b30b1d","Type":"ContainerStarted","Data":"71629c11d68390318c62ac8a294d73bb9b4b45209db783df5dd8aa1a9ea4af83"}
Apr 20 16:23:42.120525 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:42.120492 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:42.120525 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:42.120520 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:42.120661 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:42.120611 2577 scope.go:117] "RemoveContainer" containerID="8f38fa39e431ef302729ab9b2ebba2a8ada59c945ced964a3332a9ba38d046f5"
Apr 20 16:23:42.121973 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:42.121947 2577 generic.go:358] "Generic (PLEG): container finished" podID="7be8427f-1eea-4919-adcf-00cd843532e2" containerID="7ef4cbecfb6723c0348d5f1e644a4ff2482dd67333fd6c02bdef6bdb5b31fe46" exitCode=0
Apr 20 16:23:42.122047 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:42.122009 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fjxjp" event={"ID":"7be8427f-1eea-4919-adcf-00cd843532e2","Type":"ContainerDied","Data":"7ef4cbecfb6723c0348d5f1e644a4ff2482dd67333fd6c02bdef6bdb5b31fe46"}
Apr 20 16:23:42.139426 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:42.139405 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:43.007164 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:43.007085 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6"
Apr 20 16:23:43.007494 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:43.007187 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5"
Apr 20 16:23:43.096228 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:43.096203 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9cnb7"]
Apr 20 16:23:43.096341 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:43.096303 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7"
Apr 20 16:23:43.096392 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:43.096375 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cnb7" podUID="99c0543c-05e3-470f-a780-aa8b7b3fca39"
Apr 20 16:23:43.099641 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:43.099617 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-twdg6"]
Apr 20 16:23:43.100132 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:43.100112 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mpvsq"]
Apr 20 16:23:43.100244 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:43.100207 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq"
Apr 20 16:23:43.100302 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:43.100280 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4"
Apr 20 16:23:43.127340 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:43.127313 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/ovn-acl-logging/0.log"
Apr 20 16:23:43.127767 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:43.127735 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" event={"ID":"039d415e-4ed7-4e94-8a34-f5f605b30b1d","Type":"ContainerStarted","Data":"698731ffd55b167dcbb2a5ff212fb641a80d59d056a3490bf68afea138231610"}
Apr 20 16:23:43.127992 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:43.127975 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:23:43.130140 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:43.130114 2577 generic.go:358] "Generic (PLEG): container finished" podID="7be8427f-1eea-4919-adcf-00cd843532e2" containerID="d54efd9b3f778cefb61a6d89f2a994c19c393f6c6c9939a8e9a20c8bf9228d66" exitCode=0
Apr 20 16:23:43.130244 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:43.130183 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6" Apr 20 16:23:43.130244 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:43.130186 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fjxjp" event={"ID":"7be8427f-1eea-4919-adcf-00cd843532e2","Type":"ContainerDied","Data":"d54efd9b3f778cefb61a6d89f2a994c19c393f6c6c9939a8e9a20c8bf9228d66"} Apr 20 16:23:43.130401 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:43.130374 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5" Apr 20 16:23:43.146199 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:43.146177 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" Apr 20 16:23:43.154791 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:43.154748 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n" podStartSLOduration=9.854420515 podStartE2EDuration="27.154735443s" podCreationTimestamp="2026-04-20 16:23:16 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.696539134 +0000 UTC m=+3.276109128" lastFinishedPulling="2026-04-20 16:23:35.996854052 +0000 UTC m=+20.576424056" observedRunningTime="2026-04-20 16:23:43.153227255 +0000 UTC m=+27.732797269" watchObservedRunningTime="2026-04-20 16:23:43.154735443 +0000 UTC m=+27.734305459" Apr 20 16:23:44.133337 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:44.133118 2577 generic.go:358] "Generic (PLEG): container finished" podID="7be8427f-1eea-4919-adcf-00cd843532e2" 
containerID="a1158431e36e62ac3598e1824eef6bfaae10d44561c989959425f8394d3a58f1" exitCode=0 Apr 20 16:23:44.133673 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:44.133209 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fjxjp" event={"ID":"7be8427f-1eea-4919-adcf-00cd843532e2","Type":"ContainerDied","Data":"a1158431e36e62ac3598e1824eef6bfaae10d44561c989959425f8394d3a58f1"} Apr 20 16:23:44.558158 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:44.558070 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-62c6k" Apr 20 16:23:44.558320 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:44.558230 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 16:23:44.558665 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:44.558641 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-62c6k" Apr 20 16:23:45.004037 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:45.004005 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:23:45.004215 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:45.004005 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:45.004215 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:45.004133 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4" Apr 20 16:23:45.004215 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:45.004185 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cnb7" podUID="99c0543c-05e3-470f-a780-aa8b7b3fca39" Apr 20 16:23:45.004215 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:45.004006 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6" Apr 20 16:23:45.004383 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:45.004320 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5" Apr 20 16:23:47.003999 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:47.003969 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:47.004583 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:47.003977 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:23:47.004583 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:47.004077 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cnb7" podUID="99c0543c-05e3-470f-a780-aa8b7b3fca39" Apr 20 16:23:47.004583 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:47.004157 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4" Apr 20 16:23:47.004583 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:47.004211 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6" Apr 20 16:23:47.004583 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:47.004302 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5" Apr 20 16:23:49.004096 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.004009 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:49.004664 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.004009 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:23:49.004664 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:49.004144 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9cnb7" podUID="99c0543c-05e3-470f-a780-aa8b7b3fca39" Apr 20 16:23:49.004664 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:49.004229 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4" Apr 20 16:23:49.004664 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.004009 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6" Apr 20 16:23:49.004664 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:49.004298 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-twdg6" podUID="d9f54062-202f-4820-a4d0-ec110704a2f5" Apr 20 16:23:49.250176 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.250139 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-200.ec2.internal" event="NodeReady" Apr 20 16:23:49.250351 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.250312 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 16:23:49.293160 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.293074 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7j9p9"] Apr 20 16:23:49.313176 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.313151 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xqtvp"] Apr 20 16:23:49.313351 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.313323 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7j9p9" Apr 20 16:23:49.315818 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.315795 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 16:23:49.315986 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.315969 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rqfxd\"" Apr 20 16:23:49.316130 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.316111 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 16:23:49.322559 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.322535 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7j9p9"] Apr 20 16:23:49.322559 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.322558 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-xqtvp"] Apr 20 16:23:49.322735 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.322654 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xqtvp" Apr 20 16:23:49.325513 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.325278 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 16:23:49.325513 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.325287 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 16:23:49.325513 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.325317 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 16:23:49.325513 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.325347 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cmjpg\"" Apr 20 16:23:49.435642 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.435609 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert\") pod \"ingress-canary-xqtvp\" (UID: \"66ddeaa0-b37e-4d1b-8043-b74a8bb883a8\") " pod="openshift-ingress-canary/ingress-canary-xqtvp" Apr 20 16:23:49.435642 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.435647 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pwds\" (UniqueName: \"kubernetes.io/projected/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-kube-api-access-6pwds\") pod \"ingress-canary-xqtvp\" (UID: \"66ddeaa0-b37e-4d1b-8043-b74a8bb883a8\") " pod="openshift-ingress-canary/ingress-canary-xqtvp" Apr 20 16:23:49.435911 
ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.435711 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9" Apr 20 16:23:49.435911 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.435734 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f87ef63b-de21-49e4-89ee-c732444e83a3-config-volume\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9" Apr 20 16:23:49.435911 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.435754 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f87ef63b-de21-49e4-89ee-c732444e83a3-tmp-dir\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9" Apr 20 16:23:49.435911 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.435809 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65dcv\" (UniqueName: \"kubernetes.io/projected/f87ef63b-de21-49e4-89ee-c732444e83a3-kube-api-access-65dcv\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9" Apr 20 16:23:49.537087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.537051 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9" Apr 20 
16:23:49.537087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.537087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f87ef63b-de21-49e4-89ee-c732444e83a3-config-volume\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9" Apr 20 16:23:49.537293 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.537113 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f87ef63b-de21-49e4-89ee-c732444e83a3-tmp-dir\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9" Apr 20 16:23:49.537293 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.537215 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65dcv\" (UniqueName: \"kubernetes.io/projected/f87ef63b-de21-49e4-89ee-c732444e83a3-kube-api-access-65dcv\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9" Apr 20 16:23:49.537356 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:49.537284 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 16:23:49.537356 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.537315 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert\") pod \"ingress-canary-xqtvp\" (UID: \"66ddeaa0-b37e-4d1b-8043-b74a8bb883a8\") " pod="openshift-ingress-canary/ingress-canary-xqtvp" Apr 20 16:23:49.537356 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.537342 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pwds\" (UniqueName: 
\"kubernetes.io/projected/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-kube-api-access-6pwds\") pod \"ingress-canary-xqtvp\" (UID: \"66ddeaa0-b37e-4d1b-8043-b74a8bb883a8\") " pod="openshift-ingress-canary/ingress-canary-xqtvp" Apr 20 16:23:49.537479 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:49.537368 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls podName:f87ef63b-de21-49e4-89ee-c732444e83a3 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:50.037348771 +0000 UTC m=+34.616918763 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls") pod "dns-default-7j9p9" (UID: "f87ef63b-de21-49e4-89ee-c732444e83a3") : secret "dns-default-metrics-tls" not found Apr 20 16:23:49.537479 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.537416 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f87ef63b-de21-49e4-89ee-c732444e83a3-tmp-dir\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9" Apr 20 16:23:49.537479 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:49.537450 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 16:23:49.537594 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:49.537507 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert podName:66ddeaa0-b37e-4d1b-8043-b74a8bb883a8 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:50.037488348 +0000 UTC m=+34.617058341 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert") pod "ingress-canary-xqtvp" (UID: "66ddeaa0-b37e-4d1b-8043-b74a8bb883a8") : secret "canary-serving-cert" not found Apr 20 16:23:49.537644 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.537627 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f87ef63b-de21-49e4-89ee-c732444e83a3-config-volume\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9" Apr 20 16:23:49.547672 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.547622 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65dcv\" (UniqueName: \"kubernetes.io/projected/f87ef63b-de21-49e4-89ee-c732444e83a3-kube-api-access-65dcv\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9" Apr 20 16:23:49.547894 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.547874 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pwds\" (UniqueName: \"kubernetes.io/projected/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-kube-api-access-6pwds\") pod \"ingress-canary-xqtvp\" (UID: \"66ddeaa0-b37e-4d1b-8043-b74a8bb883a8\") " pod="openshift-ingress-canary/ingress-canary-xqtvp" Apr 20 16:23:49.638038 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.638015 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs\") pod \"network-metrics-daemon-mpvsq\" (UID: \"ec42a0e4-ff1e-48d5-8b45-fab851d223a4\") " pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:23:49.638138 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:49.638125 2577 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:49.638189 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:49.638180 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs podName:ec42a0e4-ff1e-48d5-8b45-fab851d223a4 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:21.638167306 +0000 UTC m=+66.217737299 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs") pod "network-metrics-daemon-mpvsq" (UID: "ec42a0e4-ff1e-48d5-8b45-fab851d223a4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:49.739271 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.739240 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret\") pod \"global-pull-secret-syncer-9cnb7\" (UID: \"99c0543c-05e3-470f-a780-aa8b7b3fca39\") " pod="kube-system/global-pull-secret-syncer-9cnb7" Apr 20 16:23:49.739434 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:49.739413 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:49.739506 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:49.739493 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret podName:99c0543c-05e3-470f-a780-aa8b7b3fca39 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:21.739472829 +0000 UTC m=+66.319042837 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret") pod "global-pull-secret-syncer-9cnb7" (UID: "99c0543c-05e3-470f-a780-aa8b7b3fca39") : object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:49.840321 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:49.840249 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn85t\" (UniqueName: \"kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t\") pod \"network-check-target-twdg6\" (UID: \"d9f54062-202f-4820-a4d0-ec110704a2f5\") " pod="openshift-network-diagnostics/network-check-target-twdg6" Apr 20 16:23:49.840454 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:49.840405 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 16:23:49.840454 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:49.840429 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 16:23:49.840454 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:49.840439 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dn85t for pod openshift-network-diagnostics/network-check-target-twdg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:49.840573 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:49.840493 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t podName:d9f54062-202f-4820-a4d0-ec110704a2f5 nodeName:}" failed. 
No retries permitted until 2026-04-20 16:24:21.840476956 +0000 UTC m=+66.420046953 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-dn85t" (UniqueName: "kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t") pod "network-check-target-twdg6" (UID: "d9f54062-202f-4820-a4d0-ec110704a2f5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:50.041705 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:50.041665 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert\") pod \"ingress-canary-xqtvp\" (UID: \"66ddeaa0-b37e-4d1b-8043-b74a8bb883a8\") " pod="openshift-ingress-canary/ingress-canary-xqtvp" Apr 20 16:23:50.042105 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:50.041732 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9" Apr 20 16:23:50.042105 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:50.041801 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 16:23:50.042105 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:50.041820 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 16:23:50.042105 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:50.041879 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert podName:66ddeaa0-b37e-4d1b-8043-b74a8bb883a8 nodeName:}" failed. 
No retries permitted until 2026-04-20 16:23:51.04185983 +0000 UTC m=+35.621429831 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert") pod "ingress-canary-xqtvp" (UID: "66ddeaa0-b37e-4d1b-8043-b74a8bb883a8") : secret "canary-serving-cert" not found
Apr 20 16:23:50.042105 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:50.041900 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls podName:f87ef63b-de21-49e4-89ee-c732444e83a3 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:51.041890082 +0000 UTC m=+35.621460079 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls") pod "dns-default-7j9p9" (UID: "f87ef63b-de21-49e4-89ee-c732444e83a3") : secret "dns-default-metrics-tls" not found
Apr 20 16:23:50.146498 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:50.146330 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fjxjp" event={"ID":"7be8427f-1eea-4919-adcf-00cd843532e2","Type":"ContainerStarted","Data":"f4e106ac6db4ba731be1f24a01a7040bc04802f6ebf36aeec12fcd9232b6dc9d"}
Apr 20 16:23:51.003084 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:51.003042 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq"
Apr 20 16:23:51.003236 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:51.003048 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7"
Apr 20 16:23:51.003236 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:51.003061 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6"
Apr 20 16:23:51.005811 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:51.005787 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 16:23:51.005945 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:51.005928 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 16:23:51.007135 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:51.007109 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 16:23:51.007135 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:51.007123 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hv7wf\""
Apr 20 16:23:51.007239 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:51.007123 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xjxmn\""
Apr 20 16:23:51.007239 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:51.007149 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 16:23:51.049776 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:51.049748 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert\") pod \"ingress-canary-xqtvp\" (UID: \"66ddeaa0-b37e-4d1b-8043-b74a8bb883a8\") " pod="openshift-ingress-canary/ingress-canary-xqtvp"
Apr 20 16:23:51.050061 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:51.049792 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9"
Apr 20 16:23:51.050061 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:51.049883 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 16:23:51.050061 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:51.049885 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 16:23:51.050061 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:51.049931 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls podName:f87ef63b-de21-49e4-89ee-c732444e83a3 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:53.049917517 +0000 UTC m=+37.629487513 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls") pod "dns-default-7j9p9" (UID: "f87ef63b-de21-49e4-89ee-c732444e83a3") : secret "dns-default-metrics-tls" not found
Apr 20 16:23:51.050061 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:51.049942 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert podName:66ddeaa0-b37e-4d1b-8043-b74a8bb883a8 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:53.049936901 +0000 UTC m=+37.629506894 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert") pod "ingress-canary-xqtvp" (UID: "66ddeaa0-b37e-4d1b-8043-b74a8bb883a8") : secret "canary-serving-cert" not found
Apr 20 16:23:51.150491 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:51.150450 2577 generic.go:358] "Generic (PLEG): container finished" podID="7be8427f-1eea-4919-adcf-00cd843532e2" containerID="f4e106ac6db4ba731be1f24a01a7040bc04802f6ebf36aeec12fcd9232b6dc9d" exitCode=0
Apr 20 16:23:51.150611 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:51.150521 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fjxjp" event={"ID":"7be8427f-1eea-4919-adcf-00cd843532e2","Type":"ContainerDied","Data":"f4e106ac6db4ba731be1f24a01a7040bc04802f6ebf36aeec12fcd9232b6dc9d"}
Apr 20 16:23:52.156046 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:52.156011 2577 generic.go:358] "Generic (PLEG): container finished" podID="7be8427f-1eea-4919-adcf-00cd843532e2" containerID="c69577b89e5ce6194da6c6076a350b1cd1309ac05910bdd6d24ce95c2d5d2651" exitCode=0
Apr 20 16:23:52.156460 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:52.156063 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fjxjp" event={"ID":"7be8427f-1eea-4919-adcf-00cd843532e2","Type":"ContainerDied","Data":"c69577b89e5ce6194da6c6076a350b1cd1309ac05910bdd6d24ce95c2d5d2651"}
Apr 20 16:23:53.063656 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:53.063580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert\") pod \"ingress-canary-xqtvp\" (UID: \"66ddeaa0-b37e-4d1b-8043-b74a8bb883a8\") " pod="openshift-ingress-canary/ingress-canary-xqtvp"
Apr 20 16:23:53.063656 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:53.063628 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9"
Apr 20 16:23:53.063834 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:53.063735 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 16:23:53.063834 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:53.063752 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 16:23:53.063834 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:53.063784 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls podName:f87ef63b-de21-49e4-89ee-c732444e83a3 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:57.063771389 +0000 UTC m=+41.643341382 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls") pod "dns-default-7j9p9" (UID: "f87ef63b-de21-49e4-89ee-c732444e83a3") : secret "dns-default-metrics-tls" not found
Apr 20 16:23:53.063834 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:53.063804 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert podName:66ddeaa0-b37e-4d1b-8043-b74a8bb883a8 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:57.06379062 +0000 UTC m=+41.643360613 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert") pod "ingress-canary-xqtvp" (UID: "66ddeaa0-b37e-4d1b-8043-b74a8bb883a8") : secret "canary-serving-cert" not found
Apr 20 16:23:53.160495 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:53.160463 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fjxjp" event={"ID":"7be8427f-1eea-4919-adcf-00cd843532e2","Type":"ContainerStarted","Data":"2a26a9bd9f77dda3cf0f0a8d71d779ba747d4fd7395c2d9c40585f401e37548f"}
Apr 20 16:23:53.183013 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:53.182966 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fjxjp" podStartSLOduration=5.951291159 podStartE2EDuration="37.182952151s" podCreationTimestamp="2026-04-20 16:23:16 +0000 UTC" firstStartedPulling="2026-04-20 16:23:18.68837918 +0000 UTC m=+3.267949180" lastFinishedPulling="2026-04-20 16:23:49.92004018 +0000 UTC m=+34.499610172" observedRunningTime="2026-04-20 16:23:53.182788842 +0000 UTC m=+37.762358856" watchObservedRunningTime="2026-04-20 16:23:53.182952151 +0000 UTC m=+37.762522165"
Apr 20 16:23:57.092562 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:57.092507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9"
Apr 20 16:23:57.092965 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:23:57.092593 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert\") pod \"ingress-canary-xqtvp\" (UID: \"66ddeaa0-b37e-4d1b-8043-b74a8bb883a8\") " pod="openshift-ingress-canary/ingress-canary-xqtvp"
Apr 20 16:23:57.092965 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:57.092652 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 16:23:57.092965 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:57.092672 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 16:23:57.092965 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:57.092741 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls podName:f87ef63b-de21-49e4-89ee-c732444e83a3 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:05.092725017 +0000 UTC m=+49.672295010 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls") pod "dns-default-7j9p9" (UID: "f87ef63b-de21-49e4-89ee-c732444e83a3") : secret "dns-default-metrics-tls" not found
Apr 20 16:23:57.092965 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:23:57.092756 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert podName:66ddeaa0-b37e-4d1b-8043-b74a8bb883a8 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:05.092750216 +0000 UTC m=+49.672320209 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert") pod "ingress-canary-xqtvp" (UID: "66ddeaa0-b37e-4d1b-8043-b74a8bb883a8") : secret "canary-serving-cert" not found
Apr 20 16:24:05.152964 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:05.152927 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert\") pod \"ingress-canary-xqtvp\" (UID: \"66ddeaa0-b37e-4d1b-8043-b74a8bb883a8\") " pod="openshift-ingress-canary/ingress-canary-xqtvp"
Apr 20 16:24:05.153499 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:05.152983 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9"
Apr 20 16:24:05.153499 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:24:05.153068 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 16:24:05.153499 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:24:05.153074 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 16:24:05.153499 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:24:05.153125 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls podName:f87ef63b-de21-49e4-89ee-c732444e83a3 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:21.153109338 +0000 UTC m=+65.732679331 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls") pod "dns-default-7j9p9" (UID: "f87ef63b-de21-49e4-89ee-c732444e83a3") : secret "dns-default-metrics-tls" not found
Apr 20 16:24:05.153499 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:24:05.153139 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert podName:66ddeaa0-b37e-4d1b-8043-b74a8bb883a8 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:21.153131815 +0000 UTC m=+65.732701807 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert") pod "ingress-canary-xqtvp" (UID: "66ddeaa0-b37e-4d1b-8043-b74a8bb883a8") : secret "canary-serving-cert" not found
Apr 20 16:24:15.147289 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:15.147262 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n8s5n"
Apr 20 16:24:21.160912 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:21.160875 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert\") pod \"ingress-canary-xqtvp\" (UID: \"66ddeaa0-b37e-4d1b-8043-b74a8bb883a8\") " pod="openshift-ingress-canary/ingress-canary-xqtvp"
Apr 20 16:24:21.161307 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:21.160925 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9"
Apr 20 16:24:21.161307 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:24:21.161014 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 16:24:21.161307 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:24:21.161027 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 16:24:21.161307 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:24:21.161073 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls podName:f87ef63b-de21-49e4-89ee-c732444e83a3 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:53.161058349 +0000 UTC m=+97.740628342 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls") pod "dns-default-7j9p9" (UID: "f87ef63b-de21-49e4-89ee-c732444e83a3") : secret "dns-default-metrics-tls" not found
Apr 20 16:24:21.161307 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:24:21.161093 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert podName:66ddeaa0-b37e-4d1b-8043-b74a8bb883a8 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:53.161079673 +0000 UTC m=+97.740649666 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert") pod "ingress-canary-xqtvp" (UID: "66ddeaa0-b37e-4d1b-8043-b74a8bb883a8") : secret "canary-serving-cert" not found
Apr 20 16:24:21.663898 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:21.663854 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs\") pod \"network-metrics-daemon-mpvsq\" (UID: \"ec42a0e4-ff1e-48d5-8b45-fab851d223a4\") " pod="openshift-multus/network-metrics-daemon-mpvsq"
Apr 20 16:24:21.666438 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:21.666418 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 16:24:21.674417 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:24:21.674402 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 16:24:21.674497 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:24:21.674453 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs podName:ec42a0e4-ff1e-48d5-8b45-fab851d223a4 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:25.674438363 +0000 UTC m=+130.254008355 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs") pod "network-metrics-daemon-mpvsq" (UID: "ec42a0e4-ff1e-48d5-8b45-fab851d223a4") : secret "metrics-daemon-secret" not found
Apr 20 16:24:21.765074 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:21.765041 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret\") pod \"global-pull-secret-syncer-9cnb7\" (UID: \"99c0543c-05e3-470f-a780-aa8b7b3fca39\") " pod="kube-system/global-pull-secret-syncer-9cnb7"
Apr 20 16:24:21.767807 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:21.767790 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 16:24:21.778911 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:21.778884 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/99c0543c-05e3-470f-a780-aa8b7b3fca39-original-pull-secret\") pod \"global-pull-secret-syncer-9cnb7\" (UID: \"99c0543c-05e3-470f-a780-aa8b7b3fca39\") " pod="kube-system/global-pull-secret-syncer-9cnb7"
Apr 20 16:24:21.865665 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:21.865636 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn85t\" (UniqueName: \"kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t\") pod \"network-check-target-twdg6\" (UID: \"d9f54062-202f-4820-a4d0-ec110704a2f5\") " pod="openshift-network-diagnostics/network-check-target-twdg6"
Apr 20 16:24:21.868486 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:21.868469 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 16:24:21.878745 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:21.878732 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 16:24:21.889047 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:21.889026 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn85t\" (UniqueName: \"kubernetes.io/projected/d9f54062-202f-4820-a4d0-ec110704a2f5-kube-api-access-dn85t\") pod \"network-check-target-twdg6\" (UID: \"d9f54062-202f-4820-a4d0-ec110704a2f5\") " pod="openshift-network-diagnostics/network-check-target-twdg6"
Apr 20 16:24:21.920661 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:21.920608 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9cnb7"
Apr 20 16:24:21.927779 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:21.927758 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hv7wf\""
Apr 20 16:24:21.935943 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:21.935925 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-twdg6"
Apr 20 16:24:22.069999 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:22.069966 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9cnb7"]
Apr 20 16:24:22.075082 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:24:22.075049 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99c0543c_05e3_470f_a780_aa8b7b3fca39.slice/crio-3d30b2cc8d67152de3bf80eaa696e798f8a6fcea996e287991ab721bc4dba305 WatchSource:0}: Error finding container 3d30b2cc8d67152de3bf80eaa696e798f8a6fcea996e287991ab721bc4dba305: Status 404 returned error can't find the container with id 3d30b2cc8d67152de3bf80eaa696e798f8a6fcea996e287991ab721bc4dba305
Apr 20 16:24:22.087047 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:22.087026 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-twdg6"]
Apr 20 16:24:22.090098 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:24:22.090077 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9f54062_202f_4820_a4d0_ec110704a2f5.slice/crio-05955b6f3eafd6c692612131727fda71631c943e9857a280a81df639e57bcadb WatchSource:0}: Error finding container 05955b6f3eafd6c692612131727fda71631c943e9857a280a81df639e57bcadb: Status 404 returned error can't find the container with id 05955b6f3eafd6c692612131727fda71631c943e9857a280a81df639e57bcadb
Apr 20 16:24:22.212600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:22.212511 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-twdg6" event={"ID":"d9f54062-202f-4820-a4d0-ec110704a2f5","Type":"ContainerStarted","Data":"05955b6f3eafd6c692612131727fda71631c943e9857a280a81df639e57bcadb"}
Apr 20 16:24:22.213426 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:22.213405 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9cnb7" event={"ID":"99c0543c-05e3-470f-a780-aa8b7b3fca39","Type":"ContainerStarted","Data":"3d30b2cc8d67152de3bf80eaa696e798f8a6fcea996e287991ab721bc4dba305"}
Apr 20 16:24:27.225999 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:27.225958 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-twdg6" event={"ID":"d9f54062-202f-4820-a4d0-ec110704a2f5","Type":"ContainerStarted","Data":"3f89014008fbb065af20e3433d6149a36d5e4d6486c87ce62ffb4d5c151148fc"}
Apr 20 16:24:27.226440 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:27.226081 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-twdg6"
Apr 20 16:24:27.227307 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:27.227286 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9cnb7" event={"ID":"99c0543c-05e3-470f-a780-aa8b7b3fca39","Type":"ContainerStarted","Data":"521177acaf116977f1f8b0a548cab291b28621d5ade3af2ac61772d9d455ecbe"}
Apr 20 16:24:27.255034 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:27.254945 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9cnb7" podStartSLOduration=65.331362534 podStartE2EDuration="1m10.254922361s" podCreationTimestamp="2026-04-20 16:23:17 +0000 UTC" firstStartedPulling="2026-04-20 16:24:22.076664768 +0000 UTC m=+66.656234761" lastFinishedPulling="2026-04-20 16:24:27.000224563 +0000 UTC m=+71.579794588" observedRunningTime="2026-04-20 16:24:27.254636367 +0000 UTC m=+71.834206383" watchObservedRunningTime="2026-04-20 16:24:27.254922361 +0000 UTC m=+71.834492377"
Apr 20 16:24:27.255209 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:27.255187 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-twdg6" podStartSLOduration=66.345857087 podStartE2EDuration="1m11.255181274s" podCreationTimestamp="2026-04-20 16:23:16 +0000 UTC" firstStartedPulling="2026-04-20 16:24:22.091789111 +0000 UTC m=+66.671359103" lastFinishedPulling="2026-04-20 16:24:27.001113283 +0000 UTC m=+71.580683290" observedRunningTime="2026-04-20 16:24:27.241073108 +0000 UTC m=+71.820643113" watchObservedRunningTime="2026-04-20 16:24:27.255181274 +0000 UTC m=+71.834751266"
Apr 20 16:24:53.176451 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:53.176413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert\") pod \"ingress-canary-xqtvp\" (UID: \"66ddeaa0-b37e-4d1b-8043-b74a8bb883a8\") " pod="openshift-ingress-canary/ingress-canary-xqtvp"
Apr 20 16:24:53.176920 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:53.176475 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9"
Apr 20 16:24:53.176920 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:24:53.176570 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 16:24:53.176920 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:24:53.176604 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 16:24:53.176920 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:24:53.176641 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert podName:66ddeaa0-b37e-4d1b-8043-b74a8bb883a8 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:57.176625797 +0000 UTC m=+161.756195790 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert") pod "ingress-canary-xqtvp" (UID: "66ddeaa0-b37e-4d1b-8043-b74a8bb883a8") : secret "canary-serving-cert" not found
Apr 20 16:24:53.176920 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:24:53.176666 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls podName:f87ef63b-de21-49e4-89ee-c732444e83a3 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:57.176649895 +0000 UTC m=+161.756219888 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls") pod "dns-default-7j9p9" (UID: "f87ef63b-de21-49e4-89ee-c732444e83a3") : secret "dns-default-metrics-tls" not found
Apr 20 16:24:58.232000 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:24:58.231965 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-twdg6"
Apr 20 16:25:25.687388 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:25.687335 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs\") pod \"network-metrics-daemon-mpvsq\" (UID: \"ec42a0e4-ff1e-48d5-8b45-fab851d223a4\") " pod="openshift-multus/network-metrics-daemon-mpvsq"
Apr 20 16:25:25.687995 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:25.687510 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 16:25:25.687995 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:25.687610 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs podName:ec42a0e4-ff1e-48d5-8b45-fab851d223a4 nodeName:}" failed. No retries permitted until 2026-04-20 16:27:27.687588271 +0000 UTC m=+252.267158278 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs") pod "network-metrics-daemon-mpvsq" (UID: "ec42a0e4-ff1e-48d5-8b45-fab851d223a4") : secret "metrics-daemon-secret" not found
Apr 20 16:25:32.740049 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.740018 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-xv6j6"]
Apr 20 16:25:32.742632 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.742615 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6"
Apr 20 16:25:32.745259 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.745238 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 20 16:25:32.745386 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.745269 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 20 16:25:32.745514 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.745498 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 20 16:25:32.746528 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.746506 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-969rg\""
Apr 20 16:25:32.746614 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.746574 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 20 16:25:32.750803 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.750781 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 20 16:25:32.752194 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.752172 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-xv6j6"]
Apr 20 16:25:32.831991 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.831955 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ddd4906-e010-4e9e-89d5-6017138ff6a9-trusted-ca\") pod \"console-operator-9d4b6777b-xv6j6\" (UID: \"1ddd4906-e010-4e9e-89d5-6017138ff6a9\") " pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6"
Apr 20 16:25:32.832177 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.832035 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ddd4906-e010-4e9e-89d5-6017138ff6a9-config\") pod \"console-operator-9d4b6777b-xv6j6\" (UID: \"1ddd4906-e010-4e9e-89d5-6017138ff6a9\") " pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6"
Apr 20 16:25:32.832177 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.832070 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ddd4906-e010-4e9e-89d5-6017138ff6a9-serving-cert\") pod \"console-operator-9d4b6777b-xv6j6\" (UID: \"1ddd4906-e010-4e9e-89d5-6017138ff6a9\") " pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6"
Apr 20 16:25:32.832177 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.832125 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr295\" (UniqueName: \"kubernetes.io/projected/1ddd4906-e010-4e9e-89d5-6017138ff6a9-kube-api-access-xr295\") pod \"console-operator-9d4b6777b-xv6j6\" (UID: \"1ddd4906-e010-4e9e-89d5-6017138ff6a9\") " pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6"
Apr 20 16:25:32.837469 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.837445 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd"]
Apr 20 16:25:32.840064 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.840049 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd"
Apr 20 16:25:32.842591 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.842569 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 20 16:25:32.842784 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.842597 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 20 16:25:32.842784 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.842644 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 20 16:25:32.842784 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.842670 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-9rg8f\""
Apr 20 16:25:32.842784 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.842712 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 20 16:25:32.844440 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.844423 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vwzmz"]
Apr 20 16:25:32.847017 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.846992 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vwzmz"
Apr 20 16:25:32.849155 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.849136 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd"]
Apr 20 16:25:32.849653 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.849636 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-56kfg\""
Apr 20 16:25:32.857694 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.857661 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vwzmz"]
Apr 20 16:25:32.932735 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.932700 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ddd4906-e010-4e9e-89d5-6017138ff6a9-config\") pod \"console-operator-9d4b6777b-xv6j6\" (UID: \"1ddd4906-e010-4e9e-89d5-6017138ff6a9\") " pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6"
Apr 20 16:25:32.932902 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.932744 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ddd4906-e010-4e9e-89d5-6017138ff6a9-serving-cert\") pod \"console-operator-9d4b6777b-xv6j6\" (UID: \"1ddd4906-e010-4e9e-89d5-6017138ff6a9\") " pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6"
Apr 20 16:25:32.932902 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.932775 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\"
(UniqueName: \"kubernetes.io/configmap/f739da69-7df9-40a8-8c4e-cef36ba94452-config\") pod \"service-ca-operator-d6fc45fc5-t2pxd\" (UID: \"f739da69-7df9-40a8-8c4e-cef36ba94452\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd" Apr 20 16:25:32.932902 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.932794 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xr295\" (UniqueName: \"kubernetes.io/projected/1ddd4906-e010-4e9e-89d5-6017138ff6a9-kube-api-access-xr295\") pod \"console-operator-9d4b6777b-xv6j6\" (UID: \"1ddd4906-e010-4e9e-89d5-6017138ff6a9\") " pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6" Apr 20 16:25:32.932902 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.932817 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f739da69-7df9-40a8-8c4e-cef36ba94452-serving-cert\") pod \"service-ca-operator-d6fc45fc5-t2pxd\" (UID: \"f739da69-7df9-40a8-8c4e-cef36ba94452\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd" Apr 20 16:25:32.932902 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.932835 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz98x\" (UniqueName: \"kubernetes.io/projected/f739da69-7df9-40a8-8c4e-cef36ba94452-kube-api-access-sz98x\") pod \"service-ca-operator-d6fc45fc5-t2pxd\" (UID: \"f739da69-7df9-40a8-8c4e-cef36ba94452\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd" Apr 20 16:25:32.933081 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.932951 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ddd4906-e010-4e9e-89d5-6017138ff6a9-trusted-ca\") pod \"console-operator-9d4b6777b-xv6j6\" (UID: \"1ddd4906-e010-4e9e-89d5-6017138ff6a9\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6" Apr 20 16:25:32.933081 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.932975 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk5fr\" (UniqueName: \"kubernetes.io/projected/af014e64-9dcb-485c-b88d-8e87419d5399-kube-api-access-sk5fr\") pod \"network-check-source-8894fc9bd-vwzmz\" (UID: \"af014e64-9dcb-485c-b88d-8e87419d5399\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vwzmz" Apr 20 16:25:32.933540 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.933468 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ddd4906-e010-4e9e-89d5-6017138ff6a9-config\") pod \"console-operator-9d4b6777b-xv6j6\" (UID: \"1ddd4906-e010-4e9e-89d5-6017138ff6a9\") " pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6" Apr 20 16:25:32.933842 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.933824 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ddd4906-e010-4e9e-89d5-6017138ff6a9-trusted-ca\") pod \"console-operator-9d4b6777b-xv6j6\" (UID: \"1ddd4906-e010-4e9e-89d5-6017138ff6a9\") " pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6" Apr 20 16:25:32.935238 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.935214 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ddd4906-e010-4e9e-89d5-6017138ff6a9-serving-cert\") pod \"console-operator-9d4b6777b-xv6j6\" (UID: \"1ddd4906-e010-4e9e-89d5-6017138ff6a9\") " pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6" Apr 20 16:25:32.940268 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.940249 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-k72wm"] Apr 20 
16:25:32.943132 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.943116 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:32.944005 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.943986 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr295\" (UniqueName: \"kubernetes.io/projected/1ddd4906-e010-4e9e-89d5-6017138ff6a9-kube-api-access-xr295\") pod \"console-operator-9d4b6777b-xv6j6\" (UID: \"1ddd4906-e010-4e9e-89d5-6017138ff6a9\") " pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6" Apr 20 16:25:32.945813 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.945795 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 20 16:25:32.945813 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.945807 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 16:25:32.946027 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.945819 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 16:25:32.946027 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.945833 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 20 16:25:32.946027 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.945813 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-88kd2\"" Apr 20 16:25:32.953084 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.953057 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-k72wm"] Apr 20 16:25:32.954696 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:32.954653 
2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 20 16:25:33.033634 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.033546 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f739da69-7df9-40a8-8c4e-cef36ba94452-serving-cert\") pod \"service-ca-operator-d6fc45fc5-t2pxd\" (UID: \"f739da69-7df9-40a8-8c4e-cef36ba94452\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd" Apr 20 16:25:33.033634 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.033581 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sz98x\" (UniqueName: \"kubernetes.io/projected/f739da69-7df9-40a8-8c4e-cef36ba94452-kube-api-access-sz98x\") pod \"service-ca-operator-d6fc45fc5-t2pxd\" (UID: \"f739da69-7df9-40a8-8c4e-cef36ba94452\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd" Apr 20 16:25:33.033634 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.033603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sk5fr\" (UniqueName: \"kubernetes.io/projected/af014e64-9dcb-485c-b88d-8e87419d5399-kube-api-access-sk5fr\") pod \"network-check-source-8894fc9bd-vwzmz\" (UID: \"af014e64-9dcb-485c-b88d-8e87419d5399\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vwzmz" Apr 20 16:25:33.033634 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.033624 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1380f1d3-60b6-4093-a429-bd909b4729e8-tmp\") pod \"insights-operator-585dfdc468-k72wm\" (UID: \"1380f1d3-60b6-4093-a429-bd909b4729e8\") " pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.033922 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.033669 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1380f1d3-60b6-4093-a429-bd909b4729e8-snapshots\") pod \"insights-operator-585dfdc468-k72wm\" (UID: \"1380f1d3-60b6-4093-a429-bd909b4729e8\") " pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.033922 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.033720 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1380f1d3-60b6-4093-a429-bd909b4729e8-service-ca-bundle\") pod \"insights-operator-585dfdc468-k72wm\" (UID: \"1380f1d3-60b6-4093-a429-bd909b4729e8\") " pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.033922 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.033748 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1380f1d3-60b6-4093-a429-bd909b4729e8-serving-cert\") pod \"insights-operator-585dfdc468-k72wm\" (UID: \"1380f1d3-60b6-4093-a429-bd909b4729e8\") " pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.033922 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.033851 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1380f1d3-60b6-4093-a429-bd909b4729e8-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-k72wm\" (UID: \"1380f1d3-60b6-4093-a429-bd909b4729e8\") " pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.033922 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.033905 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwg7f\" (UniqueName: 
\"kubernetes.io/projected/1380f1d3-60b6-4093-a429-bd909b4729e8-kube-api-access-lwg7f\") pod \"insights-operator-585dfdc468-k72wm\" (UID: \"1380f1d3-60b6-4093-a429-bd909b4729e8\") " pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.034081 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.033944 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f739da69-7df9-40a8-8c4e-cef36ba94452-config\") pod \"service-ca-operator-d6fc45fc5-t2pxd\" (UID: \"f739da69-7df9-40a8-8c4e-cef36ba94452\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd" Apr 20 16:25:33.034375 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.034359 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f739da69-7df9-40a8-8c4e-cef36ba94452-config\") pod \"service-ca-operator-d6fc45fc5-t2pxd\" (UID: \"f739da69-7df9-40a8-8c4e-cef36ba94452\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd" Apr 20 16:25:33.035955 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.035936 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f739da69-7df9-40a8-8c4e-cef36ba94452-serving-cert\") pod \"service-ca-operator-d6fc45fc5-t2pxd\" (UID: \"f739da69-7df9-40a8-8c4e-cef36ba94452\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd" Apr 20 16:25:33.041268 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.041245 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-56c6c65dfd-rtc66"] Apr 20 16:25:33.044100 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.044086 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.046699 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.046661 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 16:25:33.046799 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.046711 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-n225r\"" Apr 20 16:25:33.046799 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.046711 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 16:25:33.047054 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.047039 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 16:25:33.048124 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.048096 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk5fr\" (UniqueName: \"kubernetes.io/projected/af014e64-9dcb-485c-b88d-8e87419d5399-kube-api-access-sk5fr\") pod \"network-check-source-8894fc9bd-vwzmz\" (UID: \"af014e64-9dcb-485c-b88d-8e87419d5399\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vwzmz" Apr 20 16:25:33.048381 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.048359 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz98x\" (UniqueName: \"kubernetes.io/projected/f739da69-7df9-40a8-8c4e-cef36ba94452-kube-api-access-sz98x\") pod \"service-ca-operator-d6fc45fc5-t2pxd\" (UID: \"f739da69-7df9-40a8-8c4e-cef36ba94452\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd" Apr 20 16:25:33.051106 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.051089 2577 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6" Apr 20 16:25:33.052082 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.051818 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 16:25:33.055015 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.054987 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56c6c65dfd-rtc66"] Apr 20 16:25:33.134950 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.134916 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1380f1d3-60b6-4093-a429-bd909b4729e8-tmp\") pod \"insights-operator-585dfdc468-k72wm\" (UID: \"1380f1d3-60b6-4093-a429-bd909b4729e8\") " pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.135134 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.135005 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-image-registry-private-configuration\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.135134 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.135037 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-certificates\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.135134 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.135065 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1380f1d3-60b6-4093-a429-bd909b4729e8-service-ca-bundle\") pod \"insights-operator-585dfdc468-k72wm\" (UID: \"1380f1d3-60b6-4093-a429-bd909b4729e8\") " pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.135544 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.135248 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1380f1d3-60b6-4093-a429-bd909b4729e8-serving-cert\") pod \"insights-operator-585dfdc468-k72wm\" (UID: \"1380f1d3-60b6-4093-a429-bd909b4729e8\") " pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.135627 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.135584 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-ca-trust-extracted\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.135669 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.135634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwg7f\" (UniqueName: \"kubernetes.io/projected/1380f1d3-60b6-4093-a429-bd909b4729e8-kube-api-access-lwg7f\") pod \"insights-operator-585dfdc468-k72wm\" (UID: \"1380f1d3-60b6-4093-a429-bd909b4729e8\") " pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.135669 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.135639 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1380f1d3-60b6-4093-a429-bd909b4729e8-service-ca-bundle\") pod \"insights-operator-585dfdc468-k72wm\" (UID: 
\"1380f1d3-60b6-4093-a429-bd909b4729e8\") " pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.135786 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.135697 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-trusted-ca\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.135786 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.135351 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1380f1d3-60b6-4093-a429-bd909b4729e8-tmp\") pod \"insights-operator-585dfdc468-k72wm\" (UID: \"1380f1d3-60b6-4093-a429-bd909b4729e8\") " pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.135786 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.135729 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cnc9\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-kube-api-access-2cnc9\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.135786 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.135759 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1380f1d3-60b6-4093-a429-bd909b4729e8-snapshots\") pod \"insights-operator-585dfdc468-k72wm\" (UID: \"1380f1d3-60b6-4093-a429-bd909b4729e8\") " pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.135980 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.135784 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-installation-pull-secrets\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.135980 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.135841 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1380f1d3-60b6-4093-a429-bd909b4729e8-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-k72wm\" (UID: \"1380f1d3-60b6-4093-a429-bd909b4729e8\") " pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.135980 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.135872 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-bound-sa-token\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.135980 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.135914 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.136323 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.136305 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1380f1d3-60b6-4093-a429-bd909b4729e8-snapshots\") pod \"insights-operator-585dfdc468-k72wm\" (UID: \"1380f1d3-60b6-4093-a429-bd909b4729e8\") " 
pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.136962 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.136938 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1380f1d3-60b6-4093-a429-bd909b4729e8-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-k72wm\" (UID: \"1380f1d3-60b6-4093-a429-bd909b4729e8\") " pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.138732 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.138704 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1380f1d3-60b6-4093-a429-bd909b4729e8-serving-cert\") pod \"insights-operator-585dfdc468-k72wm\" (UID: \"1380f1d3-60b6-4093-a429-bd909b4729e8\") " pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.144416 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.144395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwg7f\" (UniqueName: \"kubernetes.io/projected/1380f1d3-60b6-4093-a429-bd909b4729e8-kube-api-access-lwg7f\") pod \"insights-operator-585dfdc468-k72wm\" (UID: \"1380f1d3-60b6-4093-a429-bd909b4729e8\") " pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.149227 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.149197 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd" Apr 20 16:25:33.156841 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.156818 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vwzmz" Apr 20 16:25:33.171234 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.171198 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-xv6j6"] Apr 20 16:25:33.174098 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:25:33.174069 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ddd4906_e010_4e9e_89d5_6017138ff6a9.slice/crio-5d4e34efc1fe296a35daeeab2dd6ab6b49cf7dbfcc485603bcb05d6221869d2f WatchSource:0}: Error finding container 5d4e34efc1fe296a35daeeab2dd6ab6b49cf7dbfcc485603bcb05d6221869d2f: Status 404 returned error can't find the container with id 5d4e34efc1fe296a35daeeab2dd6ab6b49cf7dbfcc485603bcb05d6221869d2f Apr 20 16:25:33.237041 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.236911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-ca-trust-extracted\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.237041 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.236976 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-trusted-ca\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.237041 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.237007 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cnc9\" (UniqueName: 
\"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-kube-api-access-2cnc9\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.237041 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.237036 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-installation-pull-secrets\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.237309 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.237106 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-bound-sa-token\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.237309 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.237152 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.237309 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.237191 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-image-registry-private-configuration\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " 
pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.237309 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.237217 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-certificates\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.237486 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.237312 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-ca-trust-extracted\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.237930 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:33.237614 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 16:25:33.237930 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:33.237635 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56c6c65dfd-rtc66: secret "image-registry-tls" not found Apr 20 16:25:33.237930 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:33.237728 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls podName:e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:33.737708511 +0000 UTC m=+138.317278521 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls") pod "image-registry-56c6c65dfd-rtc66" (UID: "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74") : secret "image-registry-tls" not found Apr 20 16:25:33.237930 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.237788 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-certificates\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.237930 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.237876 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-trusted-ca\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.240276 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.240255 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-installation-pull-secrets\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.240505 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.240481 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-image-registry-private-configuration\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " 
pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.247093 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.247037 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-bound-sa-token\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.247443 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.247418 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cnc9\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-kube-api-access-2cnc9\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.259900 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.259869 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-k72wm" Apr 20 16:25:33.273022 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.272989 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd"] Apr 20 16:25:33.276274 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:25:33.276249 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf739da69_7df9_40a8_8c4e_cef36ba94452.slice/crio-54574f308ea3f980984e29fa19dd0a92f1295192e4f19159fd2fb2b14ab0c55e WatchSource:0}: Error finding container 54574f308ea3f980984e29fa19dd0a92f1295192e4f19159fd2fb2b14ab0c55e: Status 404 returned error can't find the container with id 54574f308ea3f980984e29fa19dd0a92f1295192e4f19159fd2fb2b14ab0c55e Apr 20 16:25:33.289426 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.289357 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vwzmz"] Apr 20 16:25:33.295431 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:25:33.295399 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf014e64_9dcb_485c_b88d_8e87419d5399.slice/crio-552be08273c1000dc52daf4a79891862fca5ecddd6e02ea0c8dc5472c5cacba7 WatchSource:0}: Error finding container 552be08273c1000dc52daf4a79891862fca5ecddd6e02ea0c8dc5472c5cacba7: Status 404 returned error can't find the container with id 552be08273c1000dc52daf4a79891862fca5ecddd6e02ea0c8dc5472c5cacba7 Apr 20 16:25:33.353809 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.353781 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vwzmz" event={"ID":"af014e64-9dcb-485c-b88d-8e87419d5399","Type":"ContainerStarted","Data":"552be08273c1000dc52daf4a79891862fca5ecddd6e02ea0c8dc5472c5cacba7"} Apr 20 
16:25:33.354816 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.354787 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd" event={"ID":"f739da69-7df9-40a8-8c4e-cef36ba94452","Type":"ContainerStarted","Data":"54574f308ea3f980984e29fa19dd0a92f1295192e4f19159fd2fb2b14ab0c55e"} Apr 20 16:25:33.355866 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.355831 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6" event={"ID":"1ddd4906-e010-4e9e-89d5-6017138ff6a9","Type":"ContainerStarted","Data":"5d4e34efc1fe296a35daeeab2dd6ab6b49cf7dbfcc485603bcb05d6221869d2f"} Apr 20 16:25:33.383973 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.383938 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-k72wm"] Apr 20 16:25:33.387898 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:25:33.387871 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1380f1d3_60b6_4093_a429_bd909b4729e8.slice/crio-0bdc8be98f8f1841b5be0f5a03bed55eb3a799e8a6b65a0e2f05795344c5e58e WatchSource:0}: Error finding container 0bdc8be98f8f1841b5be0f5a03bed55eb3a799e8a6b65a0e2f05795344c5e58e: Status 404 returned error can't find the container with id 0bdc8be98f8f1841b5be0f5a03bed55eb3a799e8a6b65a0e2f05795344c5e58e Apr 20 16:25:33.740667 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:33.740634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:33.741142 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:33.740782 2577 projected.go:264] 
Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 16:25:33.741142 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:33.740803 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56c6c65dfd-rtc66: secret "image-registry-tls" not found Apr 20 16:25:33.741142 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:33.740855 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls podName:e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:34.740840466 +0000 UTC m=+139.320410464 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls") pod "image-registry-56c6c65dfd-rtc66" (UID: "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74") : secret "image-registry-tls" not found Apr 20 16:25:34.359824 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:34.359771 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vwzmz" event={"ID":"af014e64-9dcb-485c-b88d-8e87419d5399","Type":"ContainerStarted","Data":"660edc7980266094e16e6774867bcb5660f66492986146170fb7638bb8f112f7"} Apr 20 16:25:34.361077 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:34.361047 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-k72wm" event={"ID":"1380f1d3-60b6-4093-a429-bd909b4729e8","Type":"ContainerStarted","Data":"0bdc8be98f8f1841b5be0f5a03bed55eb3a799e8a6b65a0e2f05795344c5e58e"} Apr 20 16:25:34.375348 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:34.375286 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vwzmz" 
podStartSLOduration=2.3752699809999998 podStartE2EDuration="2.375269981s" podCreationTimestamp="2026-04-20 16:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:25:34.374393606 +0000 UTC m=+138.953963625" watchObservedRunningTime="2026-04-20 16:25:34.375269981 +0000 UTC m=+138.954840000" Apr 20 16:25:34.749330 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:34.749296 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:34.749955 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:34.749459 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 16:25:34.749955 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:34.749476 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56c6c65dfd-rtc66: secret "image-registry-tls" not found Apr 20 16:25:34.749955 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:34.749541 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls podName:e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:36.749517873 +0000 UTC m=+141.329087888 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls") pod "image-registry-56c6c65dfd-rtc66" (UID: "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74") : secret "image-registry-tls" not found Apr 20 16:25:36.768417 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:36.768319 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:36.768887 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:36.768487 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 16:25:36.768887 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:36.768512 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56c6c65dfd-rtc66: secret "image-registry-tls" not found Apr 20 16:25:36.768887 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:36.768588 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls podName:e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:40.768567077 +0000 UTC m=+145.348137072 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls") pod "image-registry-56c6c65dfd-rtc66" (UID: "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74") : secret "image-registry-tls" not found Apr 20 16:25:37.369579 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:37.369539 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd" event={"ID":"f739da69-7df9-40a8-8c4e-cef36ba94452","Type":"ContainerStarted","Data":"b97eb6efbb6404e9caa81d44eecec04321df2226f29d9a87e972f5fecb9d3b33"} Apr 20 16:25:37.370838 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:37.370814 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-k72wm" event={"ID":"1380f1d3-60b6-4093-a429-bd909b4729e8","Type":"ContainerStarted","Data":"fad147ce1aabe2de6fb068692ddcb5b61e708cf8aef73e80a4b0b18b43680b60"} Apr 20 16:25:37.372288 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:37.372268 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xv6j6_1ddd4906-e010-4e9e-89d5-6017138ff6a9/console-operator/0.log" Apr 20 16:25:37.372385 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:37.372306 2577 generic.go:358] "Generic (PLEG): container finished" podID="1ddd4906-e010-4e9e-89d5-6017138ff6a9" containerID="8da49e0916a34b5c47f1864be1a26cfb325d58d6c2137aeb887453c46c23c76f" exitCode=255 Apr 20 16:25:37.372385 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:37.372333 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6" event={"ID":"1ddd4906-e010-4e9e-89d5-6017138ff6a9","Type":"ContainerDied","Data":"8da49e0916a34b5c47f1864be1a26cfb325d58d6c2137aeb887453c46c23c76f"} Apr 20 16:25:37.372506 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:37.372492 2577 scope.go:117] "RemoveContainer" 
containerID="8da49e0916a34b5c47f1864be1a26cfb325d58d6c2137aeb887453c46c23c76f" Apr 20 16:25:37.387103 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:37.387061 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd" podStartSLOduration=2.206834362 podStartE2EDuration="5.387046756s" podCreationTimestamp="2026-04-20 16:25:32 +0000 UTC" firstStartedPulling="2026-04-20 16:25:33.27851592 +0000 UTC m=+137.858085914" lastFinishedPulling="2026-04-20 16:25:36.458728305 +0000 UTC m=+141.038298308" observedRunningTime="2026-04-20 16:25:37.38606499 +0000 UTC m=+141.965635006" watchObservedRunningTime="2026-04-20 16:25:37.387046756 +0000 UTC m=+141.966616771" Apr 20 16:25:37.423823 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:37.423777 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-k72wm" podStartSLOduration=2.350197169 podStartE2EDuration="5.423762528s" podCreationTimestamp="2026-04-20 16:25:32 +0000 UTC" firstStartedPulling="2026-04-20 16:25:33.389941452 +0000 UTC m=+137.969511445" lastFinishedPulling="2026-04-20 16:25:36.463506811 +0000 UTC m=+141.043076804" observedRunningTime="2026-04-20 16:25:37.423604356 +0000 UTC m=+142.003174372" watchObservedRunningTime="2026-04-20 16:25:37.423762528 +0000 UTC m=+142.003332546" Apr 20 16:25:37.606984 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:37.606947 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-9r2xm"] Apr 20 16:25:37.610399 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:37.610370 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9r2xm" Apr 20 16:25:37.613016 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:37.612996 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 20 16:25:37.613016 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:37.613012 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-dkpjv\"" Apr 20 16:25:37.613285 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:37.613271 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 20 16:25:37.619662 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:37.619612 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-9r2xm"] Apr 20 16:25:37.777199 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:37.777154 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gjn8\" (UniqueName: \"kubernetes.io/projected/88fe29fc-0334-4909-aee2-527d4cd4b89e-kube-api-access-2gjn8\") pod \"migrator-74bb7799d9-9r2xm\" (UID: \"88fe29fc-0334-4909-aee2-527d4cd4b89e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9r2xm" Apr 20 16:25:37.878116 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:37.878089 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gjn8\" (UniqueName: \"kubernetes.io/projected/88fe29fc-0334-4909-aee2-527d4cd4b89e-kube-api-access-2gjn8\") pod \"migrator-74bb7799d9-9r2xm\" (UID: \"88fe29fc-0334-4909-aee2-527d4cd4b89e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9r2xm" Apr 20 16:25:37.887722 ip-10-0-135-200 
kubenswrapper[2577]: I0420 16:25:37.887669 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gjn8\" (UniqueName: \"kubernetes.io/projected/88fe29fc-0334-4909-aee2-527d4cd4b89e-kube-api-access-2gjn8\") pod \"migrator-74bb7799d9-9r2xm\" (UID: \"88fe29fc-0334-4909-aee2-527d4cd4b89e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9r2xm" Apr 20 16:25:37.919621 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:37.919589 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9r2xm" Apr 20 16:25:38.033895 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:38.033868 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-9r2xm"] Apr 20 16:25:38.037310 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:25:38.037282 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88fe29fc_0334_4909_aee2_527d4cd4b89e.slice/crio-91f23834ed9ecb8f2325fc33d780f6ff80a947ab2dafb15c0c22ed8320224f1e WatchSource:0}: Error finding container 91f23834ed9ecb8f2325fc33d780f6ff80a947ab2dafb15c0c22ed8320224f1e: Status 404 returned error can't find the container with id 91f23834ed9ecb8f2325fc33d780f6ff80a947ab2dafb15c0c22ed8320224f1e Apr 20 16:25:38.375983 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:38.375948 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9r2xm" event={"ID":"88fe29fc-0334-4909-aee2-527d4cd4b89e","Type":"ContainerStarted","Data":"91f23834ed9ecb8f2325fc33d780f6ff80a947ab2dafb15c0c22ed8320224f1e"} Apr 20 16:25:38.377092 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:38.377078 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xv6j6_1ddd4906-e010-4e9e-89d5-6017138ff6a9/console-operator/1.log" Apr 20 16:25:38.377420 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:38.377407 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xv6j6_1ddd4906-e010-4e9e-89d5-6017138ff6a9/console-operator/0.log" Apr 20 16:25:38.377473 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:38.377438 2577 generic.go:358] "Generic (PLEG): container finished" podID="1ddd4906-e010-4e9e-89d5-6017138ff6a9" containerID="54cbc01377fd26f6b161fd5fc871af2c6e42c3cf1693abd2c0140bbd3ce85d22" exitCode=255 Apr 20 16:25:38.377520 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:38.377468 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6" event={"ID":"1ddd4906-e010-4e9e-89d5-6017138ff6a9","Type":"ContainerDied","Data":"54cbc01377fd26f6b161fd5fc871af2c6e42c3cf1693abd2c0140bbd3ce85d22"} Apr 20 16:25:38.377520 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:38.377510 2577 scope.go:117] "RemoveContainer" containerID="8da49e0916a34b5c47f1864be1a26cfb325d58d6c2137aeb887453c46c23c76f" Apr 20 16:25:38.377765 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:38.377750 2577 scope.go:117] "RemoveContainer" containerID="54cbc01377fd26f6b161fd5fc871af2c6e42c3cf1693abd2c0140bbd3ce85d22" Apr 20 16:25:38.377971 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:38.377944 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-xv6j6_openshift-console-operator(1ddd4906-e010-4e9e-89d5-6017138ff6a9)\"" pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6" podUID="1ddd4906-e010-4e9e-89d5-6017138ff6a9" Apr 20 16:25:39.381421 ip-10-0-135-200 kubenswrapper[2577]: I0420 
16:25:39.381401 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xv6j6_1ddd4906-e010-4e9e-89d5-6017138ff6a9/console-operator/1.log" Apr 20 16:25:39.381781 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:39.381712 2577 scope.go:117] "RemoveContainer" containerID="54cbc01377fd26f6b161fd5fc871af2c6e42c3cf1693abd2c0140bbd3ce85d22" Apr 20 16:25:39.381901 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:39.381885 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-xv6j6_openshift-console-operator(1ddd4906-e010-4e9e-89d5-6017138ff6a9)\"" pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6" podUID="1ddd4906-e010-4e9e-89d5-6017138ff6a9" Apr 20 16:25:40.386025 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:40.385990 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9r2xm" event={"ID":"88fe29fc-0334-4909-aee2-527d4cd4b89e","Type":"ContainerStarted","Data":"eca14fd399560a0c50f8f6b380c257139eeaf8cca6bfe682d985d37aa540dbbc"} Apr 20 16:25:40.386025 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:40.386028 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9r2xm" event={"ID":"88fe29fc-0334-4909-aee2-527d4cd4b89e","Type":"ContainerStarted","Data":"1587f579574d469f50ad077da5f56dca5561fc3ee1e2229eb176fb9f4f8457ed"} Apr 20 16:25:40.402272 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:40.402225 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9r2xm" podStartSLOduration=2.073162015 podStartE2EDuration="3.402210105s" podCreationTimestamp="2026-04-20 16:25:37 +0000 UTC" firstStartedPulling="2026-04-20 
16:25:38.039598314 +0000 UTC m=+142.619168311" lastFinishedPulling="2026-04-20 16:25:39.368646401 +0000 UTC m=+143.948216401" observedRunningTime="2026-04-20 16:25:40.401385064 +0000 UTC m=+144.980955078" watchObservedRunningTime="2026-04-20 16:25:40.402210105 +0000 UTC m=+144.981780121" Apr 20 16:25:40.626079 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:40.626056 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pzfs2_e8733069-fadf-4af4-a36d-4e7f085cc317/dns-node-resolver/0.log" Apr 20 16:25:40.800980 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:40.800902 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:40.801118 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:40.801048 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 16:25:40.801118 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:40.801064 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56c6c65dfd-rtc66: secret "image-registry-tls" not found Apr 20 16:25:40.801118 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:40.801114 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls podName:e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:48.801100914 +0000 UTC m=+153.380670907 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls") pod "image-registry-56c6c65dfd-rtc66" (UID: "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74") : secret "image-registry-tls" not found Apr 20 16:25:41.626870 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:41.626844 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jw965_a8c37028-00dc-4ae4-9e33-7af134c543da/node-ca/0.log" Apr 20 16:25:42.826748 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:42.826722 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-9r2xm_88fe29fc-0334-4909-aee2-527d4cd4b89e/migrator/0.log" Apr 20 16:25:43.027048 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:43.027016 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-9r2xm_88fe29fc-0334-4909-aee2-527d4cd4b89e/graceful-termination/0.log" Apr 20 16:25:43.052074 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:43.052052 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6" Apr 20 16:25:43.052173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:43.052083 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6" Apr 20 16:25:43.052391 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:43.052381 2577 scope.go:117] "RemoveContainer" containerID="54cbc01377fd26f6b161fd5fc871af2c6e42c3cf1693abd2c0140bbd3ce85d22" Apr 20 16:25:43.052544 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:43.052529 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-xv6j6_openshift-console-operator(1ddd4906-e010-4e9e-89d5-6017138ff6a9)\"" pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6" podUID="1ddd4906-e010-4e9e-89d5-6017138ff6a9" Apr 20 16:25:48.865482 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:48.865452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:48.867882 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:48.867853 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls\") pod \"image-registry-56c6c65dfd-rtc66\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") " pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:48.965971 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:48.965942 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:49.085512 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:49.085487 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56c6c65dfd-rtc66"] Apr 20 16:25:49.087457 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:25:49.087431 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5087d1c_edc4_4d05_9d3d_8d29d7a2bc74.slice/crio-b62273781032ee65f1c9858417746f7820d75d23d36dfc4ed55cfa118b6daec4 WatchSource:0}: Error finding container b62273781032ee65f1c9858417746f7820d75d23d36dfc4ed55cfa118b6daec4: Status 404 returned error can't find the container with id b62273781032ee65f1c9858417746f7820d75d23d36dfc4ed55cfa118b6daec4 Apr 20 16:25:49.410315 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:49.410278 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" event={"ID":"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74","Type":"ContainerStarted","Data":"f1c7b57051955759b31d428bee218b31314b7957bbe10bd83059adb87f5dc1fa"} Apr 20 16:25:49.410315 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:49.410314 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" event={"ID":"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74","Type":"ContainerStarted","Data":"b62273781032ee65f1c9858417746f7820d75d23d36dfc4ed55cfa118b6daec4"} Apr 20 16:25:49.410534 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:49.410402 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:25:49.431969 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:49.431921 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" 
podStartSLOduration=16.431908606 podStartE2EDuration="16.431908606s" podCreationTimestamp="2026-04-20 16:25:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:25:49.43158361 +0000 UTC m=+154.011153627" watchObservedRunningTime="2026-04-20 16:25:49.431908606 +0000 UTC m=+154.011478621"
Apr 20 16:25:52.324096 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:52.324053 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-7j9p9" podUID="f87ef63b-de21-49e4-89ee-c732444e83a3"
Apr 20 16:25:52.333242 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:52.333210 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-xqtvp" podUID="66ddeaa0-b37e-4d1b-8043-b74a8bb883a8"
Apr 20 16:25:52.416664 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:52.416638 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7j9p9"
Apr 20 16:25:54.012763 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:25:54.012728 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-mpvsq" podUID="ec42a0e4-ff1e-48d5-8b45-fab851d223a4"
Apr 20 16:25:56.008929 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:56.008901 2577 scope.go:117] "RemoveContainer" containerID="54cbc01377fd26f6b161fd5fc871af2c6e42c3cf1693abd2c0140bbd3ce85d22"
Apr 20 16:25:56.427216 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:56.427189 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xv6j6_1ddd4906-e010-4e9e-89d5-6017138ff6a9/console-operator/1.log"
Apr 20 16:25:56.427375 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:56.427271 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6" event={"ID":"1ddd4906-e010-4e9e-89d5-6017138ff6a9","Type":"ContainerStarted","Data":"d777cd4ef26c2f9c252027722d3e5dac7e5d56d903dae347b90cc9f4f2a4c596"}
Apr 20 16:25:56.427626 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:56.427601 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6"
Apr 20 16:25:56.444664 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:56.444619 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6" podStartSLOduration=21.159127895 podStartE2EDuration="24.444608034s" podCreationTimestamp="2026-04-20 16:25:32 +0000 UTC" firstStartedPulling="2026-04-20 16:25:33.175912792 +0000 UTC m=+137.755482787" lastFinishedPulling="2026-04-20 16:25:36.461392926 +0000 UTC m=+141.040962926" observedRunningTime="2026-04-20 16:25:56.443745109 +0000 UTC m=+161.023315126" watchObservedRunningTime="2026-04-20 16:25:56.444608034 +0000 UTC m=+161.024178049"
Apr 20 16:25:56.969142 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:56.969114 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-xv6j6"
Apr 20 16:25:57.233718 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:57.233624 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9"
Apr 20 16:25:57.233718 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:57.233692 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert\") pod \"ingress-canary-xqtvp\" (UID: \"66ddeaa0-b37e-4d1b-8043-b74a8bb883a8\") " pod="openshift-ingress-canary/ingress-canary-xqtvp"
Apr 20 16:25:57.236078 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:57.236039 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f87ef63b-de21-49e4-89ee-c732444e83a3-metrics-tls\") pod \"dns-default-7j9p9\" (UID: \"f87ef63b-de21-49e4-89ee-c732444e83a3\") " pod="openshift-dns/dns-default-7j9p9"
Apr 20 16:25:57.236243 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:57.236224 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ddeaa0-b37e-4d1b-8043-b74a8bb883a8-cert\") pod \"ingress-canary-xqtvp\" (UID: \"66ddeaa0-b37e-4d1b-8043-b74a8bb883a8\") " pod="openshift-ingress-canary/ingress-canary-xqtvp"
Apr 20 16:25:57.520461 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:57.520394 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rqfxd\""
Apr 20 16:25:57.528502 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:57.528485 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7j9p9"
Apr 20 16:25:57.643219 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:57.643189 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7j9p9"]
Apr 20 16:25:57.646549 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:25:57.646520 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf87ef63b_de21_49e4_89ee_c732444e83a3.slice/crio-58658490a93c92194de0546e975bfe732f7fdd99073068860fddc376bf210952 WatchSource:0}: Error finding container 58658490a93c92194de0546e975bfe732f7fdd99073068860fddc376bf210952: Status 404 returned error can't find the container with id 58658490a93c92194de0546e975bfe732f7fdd99073068860fddc376bf210952
Apr 20 16:25:58.432838 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:58.432802 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7j9p9" event={"ID":"f87ef63b-de21-49e4-89ee-c732444e83a3","Type":"ContainerStarted","Data":"58658490a93c92194de0546e975bfe732f7fdd99073068860fddc376bf210952"}
Apr 20 16:25:59.440574 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:59.440539 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7j9p9" event={"ID":"f87ef63b-de21-49e4-89ee-c732444e83a3","Type":"ContainerStarted","Data":"f4d26a09894042625b4dc60b2c67dde2a0c0cac45f0b1e0ab7a27c16b84bd64c"}
Apr 20 16:25:59.440574 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:59.440576 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7j9p9" event={"ID":"f87ef63b-de21-49e4-89ee-c732444e83a3","Type":"ContainerStarted","Data":"07490800fbbce9d5ee70d2c88747b3feeb85cd93d9fdeb9c355e488303c953ba"}
Apr 20 16:25:59.440995 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:59.440720 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7j9p9"
Apr 20 16:25:59.457248 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:25:59.457207 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7j9p9" podStartSLOduration=129.2840177 podStartE2EDuration="2m10.457196481s" podCreationTimestamp="2026-04-20 16:23:49 +0000 UTC" firstStartedPulling="2026-04-20 16:25:57.648341999 +0000 UTC m=+162.227911993" lastFinishedPulling="2026-04-20 16:25:58.821520774 +0000 UTC m=+163.401090774" observedRunningTime="2026-04-20 16:25:59.456956577 +0000 UTC m=+164.036526605" watchObservedRunningTime="2026-04-20 16:25:59.457196481 +0000 UTC m=+164.036766496"
Apr 20 16:26:01.177269 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.177235 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-m7xrq"]
Apr 20 16:26:01.180182 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.180167 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7xrq"
Apr 20 16:26:01.184030 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.184008 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 20 16:26:01.184030 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.184026 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-sblmz\""
Apr 20 16:26:01.184275 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.184048 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 20 16:26:01.190625 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.190605 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-m7xrq"]
Apr 20 16:26:01.243922 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.243888 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6b6w7"]
Apr 20 16:26:01.247134 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.247108 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6b6w7"
Apr 20 16:26:01.249891 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.249871 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 16:26:01.250147 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.250133 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 16:26:01.250209 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.250168 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-lgcvb\""
Apr 20 16:26:01.251980 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.251958 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56c6c65dfd-rtc66"]
Apr 20 16:26:01.258758 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.258736 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6b6w7"]
Apr 20 16:26:01.263155 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.263128 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bc489140-89cc-484b-853e-17ce8319d94f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-m7xrq\" (UID: \"bc489140-89cc-484b-853e-17ce8319d94f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7xrq"
Apr 20 16:26:01.263294 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.263257 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bc489140-89cc-484b-853e-17ce8319d94f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m7xrq\" (UID: \"bc489140-89cc-484b-853e-17ce8319d94f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7xrq"
Apr 20 16:26:01.289127 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.289099 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5cbcb584c5-w4vng"]
Apr 20 16:26:01.292213 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.292191 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.302972 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.302951 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cbcb584c5-w4vng"]
Apr 20 16:26:01.363830 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.363802 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/13d1688c-f90d-4062-b1af-16dc32e62dba-image-registry-private-configuration\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.363959 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.363838 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/13d1688c-f90d-4062-b1af-16dc32e62dba-bound-sa-token\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.363959 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.363866 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/13d1688c-f90d-4062-b1af-16dc32e62dba-installation-pull-secrets\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.363959 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.363937 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cd824f8e-bc79-41e3-afab-13765cfa09ae-crio-socket\") pod \"insights-runtime-extractor-6b6w7\" (UID: \"cd824f8e-bc79-41e3-afab-13765cfa09ae\") " pod="openshift-insights/insights-runtime-extractor-6b6w7"
Apr 20 16:26:01.364070 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.363973 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nbwb\" (UniqueName: \"kubernetes.io/projected/cd824f8e-bc79-41e3-afab-13765cfa09ae-kube-api-access-9nbwb\") pod \"insights-runtime-extractor-6b6w7\" (UID: \"cd824f8e-bc79-41e3-afab-13765cfa09ae\") " pod="openshift-insights/insights-runtime-extractor-6b6w7"
Apr 20 16:26:01.364070 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.363995 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz99d\" (UniqueName: \"kubernetes.io/projected/13d1688c-f90d-4062-b1af-16dc32e62dba-kube-api-access-tz99d\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.364070 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.364024 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cd824f8e-bc79-41e3-afab-13765cfa09ae-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6b6w7\" (UID: \"cd824f8e-bc79-41e3-afab-13765cfa09ae\") " pod="openshift-insights/insights-runtime-extractor-6b6w7"
Apr 20 16:26:01.364070 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.364041 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13d1688c-f90d-4062-b1af-16dc32e62dba-trusted-ca\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.364070 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.364058 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/13d1688c-f90d-4062-b1af-16dc32e62dba-ca-trust-extracted\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.364206 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.364104 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cd824f8e-bc79-41e3-afab-13765cfa09ae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6b6w7\" (UID: \"cd824f8e-bc79-41e3-afab-13765cfa09ae\") " pod="openshift-insights/insights-runtime-extractor-6b6w7"
Apr 20 16:26:01.364206 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.364138 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cd824f8e-bc79-41e3-afab-13765cfa09ae-data-volume\") pod \"insights-runtime-extractor-6b6w7\" (UID: \"cd824f8e-bc79-41e3-afab-13765cfa09ae\") " pod="openshift-insights/insights-runtime-extractor-6b6w7"
Apr 20 16:26:01.364206 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.364164 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/13d1688c-f90d-4062-b1af-16dc32e62dba-registry-tls\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.364206 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.364188 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bc489140-89cc-484b-853e-17ce8319d94f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m7xrq\" (UID: \"bc489140-89cc-484b-853e-17ce8319d94f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7xrq"
Apr 20 16:26:01.364316 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.364210 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/13d1688c-f90d-4062-b1af-16dc32e62dba-registry-certificates\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.364316 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.364229 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bc489140-89cc-484b-853e-17ce8319d94f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-m7xrq\" (UID: \"bc489140-89cc-484b-853e-17ce8319d94f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7xrq"
Apr 20 16:26:01.365009 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.364934 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bc489140-89cc-484b-853e-17ce8319d94f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-m7xrq\" (UID: \"bc489140-89cc-484b-853e-17ce8319d94f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7xrq"
Apr 20 16:26:01.366718 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.366699 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bc489140-89cc-484b-853e-17ce8319d94f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m7xrq\" (UID: \"bc489140-89cc-484b-853e-17ce8319d94f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7xrq"
Apr 20 16:26:01.464810 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.464740 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/13d1688c-f90d-4062-b1af-16dc32e62dba-image-registry-private-configuration\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.464810 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.464773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/13d1688c-f90d-4062-b1af-16dc32e62dba-bound-sa-token\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.464810 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.464794 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/13d1688c-f90d-4062-b1af-16dc32e62dba-installation-pull-secrets\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.465041 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.464875 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cd824f8e-bc79-41e3-afab-13765cfa09ae-crio-socket\") pod \"insights-runtime-extractor-6b6w7\" (UID: \"cd824f8e-bc79-41e3-afab-13765cfa09ae\") " pod="openshift-insights/insights-runtime-extractor-6b6w7"
Apr 20 16:26:01.465041 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.464918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nbwb\" (UniqueName: \"kubernetes.io/projected/cd824f8e-bc79-41e3-afab-13765cfa09ae-kube-api-access-9nbwb\") pod \"insights-runtime-extractor-6b6w7\" (UID: \"cd824f8e-bc79-41e3-afab-13765cfa09ae\") " pod="openshift-insights/insights-runtime-extractor-6b6w7"
Apr 20 16:26:01.465041 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.464947 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tz99d\" (UniqueName: \"kubernetes.io/projected/13d1688c-f90d-4062-b1af-16dc32e62dba-kube-api-access-tz99d\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.465041 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.464947 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cd824f8e-bc79-41e3-afab-13765cfa09ae-crio-socket\") pod \"insights-runtime-extractor-6b6w7\" (UID: \"cd824f8e-bc79-41e3-afab-13765cfa09ae\") " pod="openshift-insights/insights-runtime-extractor-6b6w7"
Apr 20 16:26:01.465041 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.464978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cd824f8e-bc79-41e3-afab-13765cfa09ae-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6b6w7\" (UID: \"cd824f8e-bc79-41e3-afab-13765cfa09ae\") " pod="openshift-insights/insights-runtime-extractor-6b6w7"
Apr 20 16:26:01.465041 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.465003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13d1688c-f90d-4062-b1af-16dc32e62dba-trusted-ca\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.465041 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.465034 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/13d1688c-f90d-4062-b1af-16dc32e62dba-ca-trust-extracted\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.465459 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.465168 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cd824f8e-bc79-41e3-afab-13765cfa09ae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6b6w7\" (UID: \"cd824f8e-bc79-41e3-afab-13765cfa09ae\") " pod="openshift-insights/insights-runtime-extractor-6b6w7"
Apr 20 16:26:01.465459 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.465236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cd824f8e-bc79-41e3-afab-13765cfa09ae-data-volume\") pod \"insights-runtime-extractor-6b6w7\" (UID: \"cd824f8e-bc79-41e3-afab-13765cfa09ae\") " pod="openshift-insights/insights-runtime-extractor-6b6w7"
Apr 20 16:26:01.465459 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.465277 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/13d1688c-f90d-4062-b1af-16dc32e62dba-registry-tls\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.465459 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.465321 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/13d1688c-f90d-4062-b1af-16dc32e62dba-registry-certificates\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.465706 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.465658 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cd824f8e-bc79-41e3-afab-13765cfa09ae-data-volume\") pod \"insights-runtime-extractor-6b6w7\" (UID: \"cd824f8e-bc79-41e3-afab-13765cfa09ae\") " pod="openshift-insights/insights-runtime-extractor-6b6w7"
Apr 20 16:26:01.465929 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.465906 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/13d1688c-f90d-4062-b1af-16dc32e62dba-ca-trust-extracted\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.466140 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.466123 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13d1688c-f90d-4062-b1af-16dc32e62dba-trusted-ca\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.466360 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.466315 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/13d1688c-f90d-4062-b1af-16dc32e62dba-registry-certificates\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.466496 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.466473 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cd824f8e-bc79-41e3-afab-13765cfa09ae-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6b6w7\" (UID: \"cd824f8e-bc79-41e3-afab-13765cfa09ae\") " pod="openshift-insights/insights-runtime-extractor-6b6w7"
Apr 20 16:26:01.467460 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.467441 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/13d1688c-f90d-4062-b1af-16dc32e62dba-installation-pull-secrets\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.467552 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.467510 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/13d1688c-f90d-4062-b1af-16dc32e62dba-image-registry-private-configuration\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.467634 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.467612 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cd824f8e-bc79-41e3-afab-13765cfa09ae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6b6w7\" (UID: \"cd824f8e-bc79-41e3-afab-13765cfa09ae\") " pod="openshift-insights/insights-runtime-extractor-6b6w7"
Apr 20 16:26:01.468037 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.468018 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/13d1688c-f90d-4062-b1af-16dc32e62dba-registry-tls\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.472669 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.472644 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/13d1688c-f90d-4062-b1af-16dc32e62dba-bound-sa-token\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.472972 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.472955 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz99d\" (UniqueName: \"kubernetes.io/projected/13d1688c-f90d-4062-b1af-16dc32e62dba-kube-api-access-tz99d\") pod \"image-registry-5cbcb584c5-w4vng\" (UID: \"13d1688c-f90d-4062-b1af-16dc32e62dba\") " pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.473307 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.473283 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nbwb\" (UniqueName: \"kubernetes.io/projected/cd824f8e-bc79-41e3-afab-13765cfa09ae-kube-api-access-9nbwb\") pod \"insights-runtime-extractor-6b6w7\" (UID: \"cd824f8e-bc79-41e3-afab-13765cfa09ae\") " pod="openshift-insights/insights-runtime-extractor-6b6w7"
Apr 20 16:26:01.488770 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.488751 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7xrq"
Apr 20 16:26:01.557070 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.557044 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6b6w7"
Apr 20 16:26:01.600871 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.600844 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:01.609522 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.609456 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-m7xrq"]
Apr 20 16:26:01.613912 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:26:01.613880 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc489140_89cc_484b_853e_17ce8319d94f.slice/crio-02c30b04a3a148d53d001105bd0da0225b545b57774784c66b93472c7b11324a WatchSource:0}: Error finding container 02c30b04a3a148d53d001105bd0da0225b545b57774784c66b93472c7b11324a: Status 404 returned error can't find the container with id 02c30b04a3a148d53d001105bd0da0225b545b57774784c66b93472c7b11324a
Apr 20 16:26:01.685440 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.685389 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6b6w7"]
Apr 20 16:26:01.690221 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:26:01.690178 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd824f8e_bc79_41e3_afab_13765cfa09ae.slice/crio-6851c164084eb6581ca8844c2265274e204a6bc24dfde076dc4aebb4e65ac240 WatchSource:0}: Error finding container 6851c164084eb6581ca8844c2265274e204a6bc24dfde076dc4aebb4e65ac240: Status 404 returned error can't find the container with id 6851c164084eb6581ca8844c2265274e204a6bc24dfde076dc4aebb4e65ac240
Apr 20 16:26:01.733017 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:01.732864 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cbcb584c5-w4vng"]
Apr 20 16:26:01.735958 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:26:01.735935 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13d1688c_f90d_4062_b1af_16dc32e62dba.slice/crio-8348d17c7f78780d2b47109241b64f956d352aa707ef3495930713093027c67c WatchSource:0}: Error finding container 8348d17c7f78780d2b47109241b64f956d352aa707ef3495930713093027c67c: Status 404 returned error can't find the container with id 8348d17c7f78780d2b47109241b64f956d352aa707ef3495930713093027c67c
Apr 20 16:26:02.450542 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:02.450504 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6b6w7" event={"ID":"cd824f8e-bc79-41e3-afab-13765cfa09ae","Type":"ContainerStarted","Data":"758a35e1e1c83f7c30f2dd4ffaa0eeb8ee3691ebacc83c8813af59a8ccfd15dc"}
Apr 20 16:26:02.450542 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:02.450545 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6b6w7" event={"ID":"cd824f8e-bc79-41e3-afab-13765cfa09ae","Type":"ContainerStarted","Data":"6851c164084eb6581ca8844c2265274e204a6bc24dfde076dc4aebb4e65ac240"}
Apr 20 16:26:02.451527 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:02.451502 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7xrq" event={"ID":"bc489140-89cc-484b-853e-17ce8319d94f","Type":"ContainerStarted","Data":"02c30b04a3a148d53d001105bd0da0225b545b57774784c66b93472c7b11324a"}
Apr 20 16:26:02.452858 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:02.452835 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng" event={"ID":"13d1688c-f90d-4062-b1af-16dc32e62dba","Type":"ContainerStarted","Data":"362cc17b347b9626c0e58412c9f06bdb65529548854dca2b337b553bc51fb77f"}
Apr 20 16:26:02.452952 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:02.452866 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng" event={"ID":"13d1688c-f90d-4062-b1af-16dc32e62dba","Type":"ContainerStarted","Data":"8348d17c7f78780d2b47109241b64f956d352aa707ef3495930713093027c67c"}
Apr 20 16:26:02.453037 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:02.453023 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng"
Apr 20 16:26:02.495140 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:02.495088 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng" podStartSLOduration=1.4950687839999999 podStartE2EDuration="1.495068784s" podCreationTimestamp="2026-04-20 16:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:26:02.493463933 +0000 UTC m=+167.073033973" watchObservedRunningTime="2026-04-20 16:26:02.495068784 +0000 UTC m=+167.074638800"
Apr 20 16:26:03.457148 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:03.457111 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6b6w7" event={"ID":"cd824f8e-bc79-41e3-afab-13765cfa09ae","Type":"ContainerStarted","Data":"d387a98dd8b09eb8cb0a38a37874a8d1244c1be2e80aa475ecf68ff93e55f77f"}
Apr 20 16:26:03.458634 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:03.458544 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7xrq" event={"ID":"bc489140-89cc-484b-853e-17ce8319d94f","Type":"ContainerStarted","Data":"592baa979091cb7c98c2fb7e7425758179c57dbe59d9d17d4967bfd56f34e481"}
Apr 20 16:26:03.473887 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:03.473846 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7xrq" podStartSLOduration=1.613488888 podStartE2EDuration="2.473834148s" podCreationTimestamp="2026-04-20 16:26:01 +0000 UTC" firstStartedPulling="2026-04-20 16:26:01.617984777 +0000 UTC m=+166.197554784" lastFinishedPulling="2026-04-20 16:26:02.478330043 +0000 UTC m=+167.057900044" observedRunningTime="2026-04-20 16:26:03.473188558 +0000 UTC m=+168.052758597" watchObservedRunningTime="2026-04-20 16:26:03.473834148 +0000 UTC m=+168.053404162"
Apr 20 16:26:04.463246 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:04.463213 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6b6w7" event={"ID":"cd824f8e-bc79-41e3-afab-13765cfa09ae","Type":"ContainerStarted","Data":"7efcb66ecf68268eb0f1dec7582d96b2591d640646dbb07218a8699f83fb56ae"}
Apr 20 16:26:04.481900 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:04.481854 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6b6w7" podStartSLOduration=1.245914437 podStartE2EDuration="3.481801851s" podCreationTimestamp="2026-04-20 16:26:01 +0000 UTC" firstStartedPulling="2026-04-20 16:26:01.753211499 +0000 UTC m=+166.332781493" lastFinishedPulling="2026-04-20 16:26:03.989098913 +0000 UTC m=+168.568668907" observedRunningTime="2026-04-20 16:26:04.480294761 +0000 UTC m=+169.059864776" watchObservedRunningTime="2026-04-20 16:26:04.481801851 +0000 UTC m=+169.061371869"
Apr 20 16:26:05.003057 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:05.003007 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xqtvp" Apr 20 16:26:05.005900 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:05.005876 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cmjpg\"" Apr 20 16:26:05.013958 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:05.013937 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xqtvp" Apr 20 16:26:05.134310 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:05.134282 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xqtvp"] Apr 20 16:26:05.137534 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:26:05.137507 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66ddeaa0_b37e_4d1b_8043_b74a8bb883a8.slice/crio-3e8a78bbbd06d0b4b3749b1b8965c2ac03fb9a8757a0e09ba4df802f41048a9f WatchSource:0}: Error finding container 3e8a78bbbd06d0b4b3749b1b8965c2ac03fb9a8757a0e09ba4df802f41048a9f: Status 404 returned error can't find the container with id 3e8a78bbbd06d0b4b3749b1b8965c2ac03fb9a8757a0e09ba4df802f41048a9f Apr 20 16:26:05.470769 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:05.470714 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xqtvp" event={"ID":"66ddeaa0-b37e-4d1b-8043-b74a8bb883a8","Type":"ContainerStarted","Data":"3e8a78bbbd06d0b4b3749b1b8965c2ac03fb9a8757a0e09ba4df802f41048a9f"} Apr 20 16:26:07.003076 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:07.003040 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq" Apr 20 16:26:07.477040 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:07.477001 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xqtvp" event={"ID":"66ddeaa0-b37e-4d1b-8043-b74a8bb883a8","Type":"ContainerStarted","Data":"01fc14e10a7bb2debb57e23d600efa354b93e7e08a08b5fab740aa69cc927771"} Apr 20 16:26:09.445876 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:09.445842 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7j9p9" Apr 20 16:26:09.461955 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:09.461898 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xqtvp" podStartSLOduration=138.82364534 podStartE2EDuration="2m20.461880634s" podCreationTimestamp="2026-04-20 16:23:49 +0000 UTC" firstStartedPulling="2026-04-20 16:26:05.139425671 +0000 UTC m=+169.718995669" lastFinishedPulling="2026-04-20 16:26:06.777660967 +0000 UTC m=+171.357230963" observedRunningTime="2026-04-20 16:26:07.493143357 +0000 UTC m=+172.072713385" watchObservedRunningTime="2026-04-20 16:26:09.461880634 +0000 UTC m=+174.041450650" Apr 20 16:26:11.257238 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:11.257209 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" Apr 20 16:26:16.846499 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.846462 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b955679d8-5h54x"] Apr 20 16:26:16.851156 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.851135 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:16.853837 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.853814 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 16:26:16.853922 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.853880 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 16:26:16.853922 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.853906 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 16:26:16.855449 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.855425 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 16:26:16.855536 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.855446 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 16:26:16.855536 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.855527 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-mqch4\"" Apr 20 16:26:16.855618 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.855429 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 16:26:16.855918 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.855906 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 16:26:16.860640 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.860618 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 16:26:16.860815 ip-10-0-135-200 
kubenswrapper[2577]: I0420 16:26:16.860798 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b955679d8-5h54x"] Apr 20 16:26:16.995975 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.995942 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-trusted-ca-bundle\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:16.995975 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.995981 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-service-ca\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:16.996181 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.996004 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-oauth-serving-cert\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:16.996181 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.996095 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-config\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:16.996181 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.996124 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-oauth-config\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:16.996181 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.996155 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j55bq\" (UniqueName: \"kubernetes.io/projected/b3eee4c1-6627-470c-8e02-fa958abe4e96-kube-api-access-j55bq\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:16.996318 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:16.996199 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-serving-cert\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:17.097194 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:17.097126 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-service-ca\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:17.097194 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:17.097162 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-oauth-serving-cert\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " 
pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:17.097379 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:17.097200 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-config\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:17.097379 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:17.097218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-oauth-config\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:17.097379 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:17.097245 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j55bq\" (UniqueName: \"kubernetes.io/projected/b3eee4c1-6627-470c-8e02-fa958abe4e96-kube-api-access-j55bq\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:17.097379 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:17.097264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-serving-cert\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:17.097379 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:17.097297 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-trusted-ca-bundle\") pod 
\"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:17.097906 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:17.097879 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-oauth-serving-cert\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:17.098190 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:17.098017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-config\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:17.098190 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:17.098088 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-trusted-ca-bundle\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:17.098190 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:17.098137 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-service-ca\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:17.099721 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:17.099670 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-oauth-config\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:17.099888 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:17.099868 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-serving-cert\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:17.105671 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:17.105651 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j55bq\" (UniqueName: \"kubernetes.io/projected/b3eee4c1-6627-470c-8e02-fa958abe4e96-kube-api-access-j55bq\") pod \"console-7b955679d8-5h54x\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:17.160587 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:17.160562 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:26:17.276292 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:17.276260 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b955679d8-5h54x"] Apr 20 16:26:17.280570 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:26:17.280542 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3eee4c1_6627_470c_8e02_fa958abe4e96.slice/crio-786604f446b994aa55e6e6b633460367e9bcc2de8640b5c556c9bf71120f1cf3 WatchSource:0}: Error finding container 786604f446b994aa55e6e6b633460367e9bcc2de8640b5c556c9bf71120f1cf3: Status 404 returned error can't find the container with id 786604f446b994aa55e6e6b633460367e9bcc2de8640b5c556c9bf71120f1cf3 Apr 20 16:26:17.502721 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:17.502670 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b955679d8-5h54x" event={"ID":"b3eee4c1-6627-470c-8e02-fa958abe4e96","Type":"ContainerStarted","Data":"786604f446b994aa55e6e6b633460367e9bcc2de8640b5c556c9bf71120f1cf3"} Apr 20 16:26:20.512144 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:20.512108 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b955679d8-5h54x" event={"ID":"b3eee4c1-6627-470c-8e02-fa958abe4e96","Type":"ContainerStarted","Data":"aa2e50eabea9176912d78884afbec722170b8b380db70cc1c38f6d22441d3fae"} Apr 20 16:26:20.529895 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:20.529846 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b955679d8-5h54x" podStartSLOduration=1.872641952 podStartE2EDuration="4.529833179s" podCreationTimestamp="2026-04-20 16:26:16 +0000 UTC" firstStartedPulling="2026-04-20 16:26:17.282508269 +0000 UTC m=+181.862078266" lastFinishedPulling="2026-04-20 16:26:19.939699497 +0000 UTC m=+184.519269493" 
observedRunningTime="2026-04-20 16:26:20.529652744 +0000 UTC m=+185.109222759" watchObservedRunningTime="2026-04-20 16:26:20.529833179 +0000 UTC m=+185.109403191" Apr 20 16:26:21.748477 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.748438 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w"] Apr 20 16:26:21.751762 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.751744 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" Apr 20 16:26:21.754500 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.754472 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 20 16:26:21.754911 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.754888 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 20 16:26:21.755023 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.755006 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 16:26:21.755097 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.755007 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 16:26:21.755197 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.755175 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nbjkl"] Apr 20 16:26:21.755292 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.755277 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 16:26:21.755339 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.755280 2577 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-qpgk9\"" Apr 20 16:26:21.759655 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.759639 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.762171 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.762154 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 16:26:21.762272 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.762257 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 16:26:21.762409 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.762394 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 16:26:21.762576 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.762557 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pd26p\"" Apr 20 16:26:21.763805 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.763786 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w"] Apr 20 16:26:21.837844 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.837811 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-accelerators-collector-config\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.837844 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.837845 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-metrics-client-ca\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.838055 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.837867 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.838055 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.837931 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-root\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.838055 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.837965 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgcq2\" (UniqueName: \"kubernetes.io/projected/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-kube-api-access-tgcq2\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.838055 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.837992 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1fdf2c2e-403d-4950-aec2-06e63346304a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-g4x2w\" (UID: 
\"1fdf2c2e-403d-4950-aec2-06e63346304a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" Apr 20 16:26:21.838055 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.838012 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-tls\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.838055 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.838035 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-wtmp\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.838055 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.838056 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-textfile\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.838339 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.838079 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-sys\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.838339 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.838154 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fnkxn\" (UniqueName: \"kubernetes.io/projected/1fdf2c2e-403d-4950-aec2-06e63346304a-kube-api-access-fnkxn\") pod \"openshift-state-metrics-9d44df66c-g4x2w\" (UID: \"1fdf2c2e-403d-4950-aec2-06e63346304a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" Apr 20 16:26:21.838339 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.838184 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1fdf2c2e-403d-4950-aec2-06e63346304a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-g4x2w\" (UID: \"1fdf2c2e-403d-4950-aec2-06e63346304a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" Apr 20 16:26:21.838339 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.838209 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1fdf2c2e-403d-4950-aec2-06e63346304a-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-g4x2w\" (UID: \"1fdf2c2e-403d-4950-aec2-06e63346304a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" Apr 20 16:26:21.939080 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939044 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnkxn\" (UniqueName: \"kubernetes.io/projected/1fdf2c2e-403d-4950-aec2-06e63346304a-kube-api-access-fnkxn\") pod \"openshift-state-metrics-9d44df66c-g4x2w\" (UID: \"1fdf2c2e-403d-4950-aec2-06e63346304a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" Apr 20 16:26:21.939248 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939089 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/1fdf2c2e-403d-4950-aec2-06e63346304a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-g4x2w\" (UID: \"1fdf2c2e-403d-4950-aec2-06e63346304a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" Apr 20 16:26:21.939248 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939121 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1fdf2c2e-403d-4950-aec2-06e63346304a-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-g4x2w\" (UID: \"1fdf2c2e-403d-4950-aec2-06e63346304a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" Apr 20 16:26:21.939248 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-accelerators-collector-config\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.939248 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939227 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-metrics-client-ca\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.939473 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939262 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " 
pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.939473 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-root\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.939473 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939341 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgcq2\" (UniqueName: \"kubernetes.io/projected/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-kube-api-access-tgcq2\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.939473 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939392 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-root\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.939473 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939434 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1fdf2c2e-403d-4950-aec2-06e63346304a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-g4x2w\" (UID: \"1fdf2c2e-403d-4950-aec2-06e63346304a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" Apr 20 16:26:21.939473 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939468 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-tls\") pod \"node-exporter-nbjkl\" (UID: 
\"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.939822 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-wtmp\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.939822 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-textfile\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.939822 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939554 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-sys\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.939822 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939628 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-sys\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.939822 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:26:21.939716 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 16:26:21.939822 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:26:21.939783 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-tls podName:7a5945ad-40e0-4bf0-9712-79d17a1c8d00 nodeName:}" failed. No retries permitted until 2026-04-20 16:26:22.439763659 +0000 UTC m=+187.019333653 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-tls") pod "node-exporter-nbjkl" (UID: "7a5945ad-40e0-4bf0-9712-79d17a1c8d00") : secret "node-exporter-tls" not found Apr 20 16:26:21.939822 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939796 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-wtmp\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.940099 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939861 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-accelerators-collector-config\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.940099 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-metrics-client-ca\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.940099 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.939929 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1fdf2c2e-403d-4950-aec2-06e63346304a-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-g4x2w\" (UID: \"1fdf2c2e-403d-4950-aec2-06e63346304a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" Apr 20 16:26:21.940099 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.940035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-textfile\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.942044 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.942022 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:21.942387 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.942367 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1fdf2c2e-403d-4950-aec2-06e63346304a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-g4x2w\" (UID: \"1fdf2c2e-403d-4950-aec2-06e63346304a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" Apr 20 16:26:21.942438 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.942424 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1fdf2c2e-403d-4950-aec2-06e63346304a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-g4x2w\" (UID: \"1fdf2c2e-403d-4950-aec2-06e63346304a\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" Apr 20 16:26:21.950305 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.950286 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnkxn\" (UniqueName: \"kubernetes.io/projected/1fdf2c2e-403d-4950-aec2-06e63346304a-kube-api-access-fnkxn\") pod \"openshift-state-metrics-9d44df66c-g4x2w\" (UID: \"1fdf2c2e-403d-4950-aec2-06e63346304a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" Apr 20 16:26:21.955329 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:21.955300 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgcq2\" (UniqueName: \"kubernetes.io/projected/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-kube-api-access-tgcq2\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:22.061630 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:22.061552 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" Apr 20 16:26:22.185655 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:22.185619 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w"] Apr 20 16:26:22.188551 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:26:22.188521 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fdf2c2e_403d_4950_aec2_06e63346304a.slice/crio-8f4dfd3b4542071b67b734736a50fe5033ee9365840a5ab94893d85ab76a5856 WatchSource:0}: Error finding container 8f4dfd3b4542071b67b734736a50fe5033ee9365840a5ab94893d85ab76a5856: Status 404 returned error can't find the container with id 8f4dfd3b4542071b67b734736a50fe5033ee9365840a5ab94893d85ab76a5856 Apr 20 16:26:22.444041 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:22.444009 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-tls\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:22.446306 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:22.446279 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7a5945ad-40e0-4bf0-9712-79d17a1c8d00-node-exporter-tls\") pod \"node-exporter-nbjkl\" (UID: \"7a5945ad-40e0-4bf0-9712-79d17a1c8d00\") " pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:22.519021 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:22.518985 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" 
event={"ID":"1fdf2c2e-403d-4950-aec2-06e63346304a","Type":"ContainerStarted","Data":"f25f0787a049134307d2b2dac5c196e6135b87c5c6fbe49c4804101a7b6a3b62"} Apr 20 16:26:22.519021 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:22.519023 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" event={"ID":"1fdf2c2e-403d-4950-aec2-06e63346304a","Type":"ContainerStarted","Data":"87e440f47ef11f88662cb54aed148b6f4caff4983e614ff7d2e51d59d194f729"} Apr 20 16:26:22.519192 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:22.519042 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" event={"ID":"1fdf2c2e-403d-4950-aec2-06e63346304a","Type":"ContainerStarted","Data":"8f4dfd3b4542071b67b734736a50fe5033ee9365840a5ab94893d85ab76a5856"} Apr 20 16:26:22.668802 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:22.668774 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nbjkl" Apr 20 16:26:22.677769 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:26:22.677741 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a5945ad_40e0_4bf0_9712_79d17a1c8d00.slice/crio-96571697ed01ee33fbc9b4ea56afa6d323bf0a6a426a4c4351e4cab2552aea91 WatchSource:0}: Error finding container 96571697ed01ee33fbc9b4ea56afa6d323bf0a6a426a4c4351e4cab2552aea91: Status 404 returned error can't find the container with id 96571697ed01ee33fbc9b4ea56afa6d323bf0a6a426a4c4351e4cab2552aea91 Apr 20 16:26:23.464737 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:23.464711 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5cbcb584c5-w4vng" Apr 20 16:26:23.529386 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:23.529348 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-nbjkl" event={"ID":"7a5945ad-40e0-4bf0-9712-79d17a1c8d00","Type":"ContainerStarted","Data":"96571697ed01ee33fbc9b4ea56afa6d323bf0a6a426a4c4351e4cab2552aea91"} Apr 20 16:26:24.533632 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.533596 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" event={"ID":"1fdf2c2e-403d-4950-aec2-06e63346304a","Type":"ContainerStarted","Data":"d5c30e137fc9cc5602842eea6f8dfa01b6f8fb703116a8e30f8df020f62f4c49"} Apr 20 16:26:24.534942 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.534918 2577 generic.go:358] "Generic (PLEG): container finished" podID="7a5945ad-40e0-4bf0-9712-79d17a1c8d00" containerID="f6e8c086f293b46bb7b2c5070efd4604f80c5cd9c4475b51daee1bfcfe7555f0" exitCode=0 Apr 20 16:26:24.535060 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.534987 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nbjkl" event={"ID":"7a5945ad-40e0-4bf0-9712-79d17a1c8d00","Type":"ContainerDied","Data":"f6e8c086f293b46bb7b2c5070efd4604f80c5cd9c4475b51daee1bfcfe7555f0"} Apr 20 16:26:24.550993 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.550946 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g4x2w" podStartSLOduration=2.107109947 podStartE2EDuration="3.55093109s" podCreationTimestamp="2026-04-20 16:26:21 +0000 UTC" firstStartedPulling="2026-04-20 16:26:22.316792835 +0000 UTC m=+186.896362830" lastFinishedPulling="2026-04-20 16:26:23.760613978 +0000 UTC m=+188.340183973" observedRunningTime="2026-04-20 16:26:24.548950731 +0000 UTC m=+189.128520747" watchObservedRunningTime="2026-04-20 16:26:24.55093109 +0000 UTC m=+189.130501106" Apr 20 16:26:24.852473 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.852396 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/thanos-querier-7568dc77f9-wmsz7"] Apr 20 16:26:24.856012 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.855994 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:24.858765 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.858737 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 20 16:26:24.858898 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.858769 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 20 16:26:24.858960 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.858947 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-pd8h9\"" Apr 20 16:26:24.859028 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.859013 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 20 16:26:24.859096 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.859043 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-clc648ibdj30m\"" Apr 20 16:26:24.859096 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.859075 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 20 16:26:24.859236 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.859049 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 20 16:26:24.866129 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.866105 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/thanos-querier-7568dc77f9-wmsz7"] Apr 20 16:26:24.965663 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.965624 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:24.965835 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.965697 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-grpc-tls\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:24.965835 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.965723 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:24.965835 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.965741 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " 
pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:24.965835 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.965762 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kg6x\" (UniqueName: \"kubernetes.io/projected/a6118926-0076-4fd9-aa1f-aa052f9810d1-kube-api-access-6kg6x\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:24.965835 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.965815 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-thanos-querier-tls\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:24.966034 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.965879 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:24.966034 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:24.965899 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6118926-0076-4fd9-aa1f-aa052f9810d1-metrics-client-ca\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.067088 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.067042 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.067088 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.067093 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.067339 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.067120 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kg6x\" (UniqueName: \"kubernetes.io/projected/a6118926-0076-4fd9-aa1f-aa052f9810d1-kube-api-access-6kg6x\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.067339 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.067142 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-thanos-querier-tls\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.067339 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.067185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.067339 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.067208 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6118926-0076-4fd9-aa1f-aa052f9810d1-metrics-client-ca\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.067550 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.067341 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.067550 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.067402 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-grpc-tls\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.068746 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.068719 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6118926-0076-4fd9-aa1f-aa052f9810d1-metrics-client-ca\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " 
pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.069874 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.069850 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.070048 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.070029 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-thanos-querier-tls\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.070340 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.070320 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.070408 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.070340 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.070476 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.070461 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.070532 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.070514 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a6118926-0076-4fd9-aa1f-aa052f9810d1-secret-grpc-tls\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.074969 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.074949 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kg6x\" (UniqueName: \"kubernetes.io/projected/a6118926-0076-4fd9-aa1f-aa052f9810d1-kube-api-access-6kg6x\") pod \"thanos-querier-7568dc77f9-wmsz7\" (UID: \"a6118926-0076-4fd9-aa1f-aa052f9810d1\") " pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.165630 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.165605 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:25.293377 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.293344 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7568dc77f9-wmsz7"] Apr 20 16:26:25.298184 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:26:25.298156 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6118926_0076_4fd9_aa1f_aa052f9810d1.slice/crio-290ece8a0996bf71a94b87cbdbe1324a5c157fc863b466ed9d1a5458b3051259 WatchSource:0}: Error finding container 290ece8a0996bf71a94b87cbdbe1324a5c157fc863b466ed9d1a5458b3051259: Status 404 returned error can't find the container with id 290ece8a0996bf71a94b87cbdbe1324a5c157fc863b466ed9d1a5458b3051259 Apr 20 16:26:25.539313 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.539232 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nbjkl" event={"ID":"7a5945ad-40e0-4bf0-9712-79d17a1c8d00","Type":"ContainerStarted","Data":"35e348d3c447c378996d88d93f96e1bac7625f756cd93072cac40e4859567b38"} Apr 20 16:26:25.539313 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.539266 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nbjkl" event={"ID":"7a5945ad-40e0-4bf0-9712-79d17a1c8d00","Type":"ContainerStarted","Data":"f874d99d3194f92d87eaa0d71157893bedb20fbb5d15107a90054003324db076"} Apr 20 16:26:25.540321 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.540300 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" event={"ID":"a6118926-0076-4fd9-aa1f-aa052f9810d1","Type":"ContainerStarted","Data":"290ece8a0996bf71a94b87cbdbe1324a5c157fc863b466ed9d1a5458b3051259"} Apr 20 16:26:25.556779 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:25.556741 2577 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-monitoring/node-exporter-nbjkl" podStartSLOduration=3.476940617 podStartE2EDuration="4.556728777s" podCreationTimestamp="2026-04-20 16:26:21 +0000 UTC" firstStartedPulling="2026-04-20 16:26:22.679300253 +0000 UTC m=+187.258870246" lastFinishedPulling="2026-04-20 16:26:23.759088398 +0000 UTC m=+188.338658406" observedRunningTime="2026-04-20 16:26:25.55516303 +0000 UTC m=+190.134733116" watchObservedRunningTime="2026-04-20 16:26:25.556728777 +0000 UTC m=+190.136298791"
Apr 20 16:26:26.274713 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.274617 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" podUID="e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74" containerName="registry" containerID="cri-o://f1c7b57051955759b31d428bee218b31314b7957bbe10bd83059adb87f5dc1fa" gracePeriod=30
Apr 20 16:26:26.537763 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.534853 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66"
Apr 20 16:26:26.544728 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.541111 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-sn85c"]
Apr 20 16:26:26.544728 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.541502 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74" containerName="registry"
Apr 20 16:26:26.544728 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.541517 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74" containerName="registry"
Apr 20 16:26:26.544728 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.541603 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74" containerName="registry"
Apr 20 16:26:26.550636 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.550612 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sn85c"
Apr 20 16:26:26.552409 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.552382 2577 generic.go:358] "Generic (PLEG): container finished" podID="e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74" containerID="f1c7b57051955759b31d428bee218b31314b7957bbe10bd83059adb87f5dc1fa" exitCode=0
Apr 20 16:26:26.552780 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.552757 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66"
Apr 20 16:26:26.552878 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.552809 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" event={"ID":"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74","Type":"ContainerDied","Data":"f1c7b57051955759b31d428bee218b31314b7957bbe10bd83059adb87f5dc1fa"}
Apr 20 16:26:26.552878 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.552848 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56c6c65dfd-rtc66" event={"ID":"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74","Type":"ContainerDied","Data":"b62273781032ee65f1c9858417746f7820d75d23d36dfc4ed55cfa118b6daec4"}
Apr 20 16:26:26.552878 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.552869 2577 scope.go:117] "RemoveContainer" containerID="f1c7b57051955759b31d428bee218b31314b7957bbe10bd83059adb87f5dc1fa"
Apr 20 16:26:26.553103 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.553087 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 20 16:26:26.553154 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.553089 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-kc2km\""
Apr 20 16:26:26.560890 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.560864 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-sn85c"]
Apr 20 16:26:26.564104 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.564079 2577 scope.go:117] "RemoveContainer" containerID="f1c7b57051955759b31d428bee218b31314b7957bbe10bd83059adb87f5dc1fa"
Apr 20 16:26:26.564509 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:26:26.564483 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1c7b57051955759b31d428bee218b31314b7957bbe10bd83059adb87f5dc1fa\": container with ID starting with f1c7b57051955759b31d428bee218b31314b7957bbe10bd83059adb87f5dc1fa not found: ID does not exist" containerID="f1c7b57051955759b31d428bee218b31314b7957bbe10bd83059adb87f5dc1fa"
Apr 20 16:26:26.564596 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.564522 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1c7b57051955759b31d428bee218b31314b7957bbe10bd83059adb87f5dc1fa"} err="failed to get container status \"f1c7b57051955759b31d428bee218b31314b7957bbe10bd83059adb87f5dc1fa\": rpc error: code = NotFound desc = could not find container \"f1c7b57051955759b31d428bee218b31314b7957bbe10bd83059adb87f5dc1fa\": container with ID starting with f1c7b57051955759b31d428bee218b31314b7957bbe10bd83059adb87f5dc1fa not found: ID does not exist"
Apr 20 16:26:26.681936 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.681899 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-installation-pull-secrets\") pod \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") "
Apr 20 16:26:26.682107 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.681995 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-image-registry-private-configuration\") pod \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") "
Apr 20 16:26:26.682107 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.682025 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls\") pod \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") "
Apr 20 16:26:26.682107 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.682076 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-ca-trust-extracted\") pod \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") "
Apr 20 16:26:26.682263 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.682110 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-trusted-ca\") pod \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") "
Apr 20 16:26:26.682263 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.682144 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cnc9\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-kube-api-access-2cnc9\") pod \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") "
Apr 20 16:26:26.682263 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.682176 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-bound-sa-token\") pod \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") "
Apr 20 16:26:26.682263 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.682214 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-certificates\") pod \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\" (UID: \"e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74\") "
Apr 20 16:26:26.682444 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.682409 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/20bbc6cc-406c-47b9-b3dc-02361db2b16e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-sn85c\" (UID: \"20bbc6cc-406c-47b9-b3dc-02361db2b16e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sn85c"
Apr 20 16:26:26.683106 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.683072 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74" (UID: "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 16:26:26.684137 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.684104 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74" (UID: "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 16:26:26.685008 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.684950 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74" (UID: "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:26:26.685008 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.684992 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74" (UID: "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 16:26:26.685144 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.685076 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74" (UID: "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 16:26:26.685144 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.685089 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-kube-api-access-2cnc9" (OuterVolumeSpecName: "kube-api-access-2cnc9") pod "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74" (UID: "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74"). InnerVolumeSpecName "kube-api-access-2cnc9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:26:26.685881 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.685863 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74" (UID: "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:26:26.694300 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.694243 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74" (UID: "e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 16:26:26.782895 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.782864 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/20bbc6cc-406c-47b9-b3dc-02361db2b16e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-sn85c\" (UID: \"20bbc6cc-406c-47b9-b3dc-02361db2b16e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sn85c"
Apr 20 16:26:26.783049 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.782999 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-image-registry-private-configuration\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:26:26.783049 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:26:26.783005 2577 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 20 16:26:26.783049 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.783018 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-tls\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:26:26.783049 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.783042 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-ca-trust-extracted\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:26:26.783200 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.783058 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-trusted-ca\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:26:26.783200 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:26:26.783072 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20bbc6cc-406c-47b9-b3dc-02361db2b16e-monitoring-plugin-cert podName:20bbc6cc-406c-47b9-b3dc-02361db2b16e nodeName:}" failed. No retries permitted until 2026-04-20 16:26:27.283058159 +0000 UTC m=+191.862628155 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/20bbc6cc-406c-47b9-b3dc-02361db2b16e-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-sn85c" (UID: "20bbc6cc-406c-47b9-b3dc-02361db2b16e") : secret "monitoring-plugin-cert" not found
Apr 20 16:26:26.783200 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.783094 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2cnc9\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-kube-api-access-2cnc9\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:26:26.783200 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.783106 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-bound-sa-token\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:26:26.783200 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.783116 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-registry-certificates\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:26:26.783200 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.783125 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74-installation-pull-secrets\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:26:26.877756 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.877721 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56c6c65dfd-rtc66"]
Apr 20 16:26:26.879470 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:26.879447 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-56c6c65dfd-rtc66"]
Apr 20 16:26:27.160868 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:27.160835 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b955679d8-5h54x"
Apr 20 16:26:27.161066 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:27.160881 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b955679d8-5h54x"
Apr 20 16:26:27.166192 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:27.166171 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b955679d8-5h54x"
Apr 20 16:26:27.287105 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:27.287064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/20bbc6cc-406c-47b9-b3dc-02361db2b16e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-sn85c\" (UID: \"20bbc6cc-406c-47b9-b3dc-02361db2b16e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sn85c"
Apr 20 16:26:27.289840 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:27.289814 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/20bbc6cc-406c-47b9-b3dc-02361db2b16e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-sn85c\" (UID: \"20bbc6cc-406c-47b9-b3dc-02361db2b16e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sn85c"
Apr 20 16:26:27.464886 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:27.464809 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sn85c"
Apr 20 16:26:27.564512 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:27.564484 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b955679d8-5h54x"
Apr 20 16:26:27.687802 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:27.687775 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-sn85c"]
Apr 20 16:26:27.691148 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:26:27.691117 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20bbc6cc_406c_47b9_b3dc_02361db2b16e.slice/crio-a32c22a35e5e283f23b302b7089183ae17c63d84a0c4297fa800ce0c30fe59e2 WatchSource:0}: Error finding container a32c22a35e5e283f23b302b7089183ae17c63d84a0c4297fa800ce0c30fe59e2: Status 404 returned error can't find the container with id a32c22a35e5e283f23b302b7089183ae17c63d84a0c4297fa800ce0c30fe59e2
Apr 20 16:26:28.011087 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.009048 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74" path="/var/lib/kubelet/pods/e5087d1c-edc4-4d05-9d3d-8d29d7a2bc74/volumes"
Apr 20 16:26:28.095005 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.094974 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 16:26:28.098780 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.098763 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.101392 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.101370 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 20 16:26:28.101392 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.101370 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-erdtn5hjl6k93\""
Apr 20 16:26:28.101636 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.101469 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 20 16:26:28.101636 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.101470 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 20 16:26:28.101828 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.101809 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 20 16:26:28.101886 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.101851 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 20 16:26:28.101934 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.101907 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 20 16:26:28.101934 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.101923 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 20 16:26:28.102092 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.102068 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 20 16:26:28.102183 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.102127 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 20 16:26:28.102237 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.102219 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 20 16:26:28.102878 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.102861 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 20 16:26:28.102941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.102882 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-qwxvk\""
Apr 20 16:26:28.105092 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.105071 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 20 16:26:28.107450 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.107434 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 20 16:26:28.112407 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.112388 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 16:26:28.194761 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.194725 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.194761 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.194772 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f7b4c044-a14a-40b8-ba06-13128cd7878a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.195006 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.194850 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.195006 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.194894 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.195006 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.194935 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-config\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.195006 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.194961 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b4c044-a14a-40b8-ba06-13128cd7878a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.195219 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.195010 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7b4c044-a14a-40b8-ba06-13128cd7878a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.195219 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.195074 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7b4c044-a14a-40b8-ba06-13128cd7878a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.195219 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.195158 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f7b4c044-a14a-40b8-ba06-13128cd7878a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.195219 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.195196 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7b4c044-a14a-40b8-ba06-13128cd7878a-config-out\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.195397 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.195221 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-web-config\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.195397 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.195249 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml6d2\" (UniqueName: \"kubernetes.io/projected/f7b4c044-a14a-40b8-ba06-13128cd7878a-kube-api-access-ml6d2\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.195397 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.195280 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b4c044-a14a-40b8-ba06-13128cd7878a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.195397 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.195350 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.195595 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.195407 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b4c044-a14a-40b8-ba06-13128cd7878a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.195595 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.195435 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.195595 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.195486 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.195595 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.195530 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.296919 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.296834 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f7b4c044-a14a-40b8-ba06-13128cd7878a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.296919 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.296884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7b4c044-a14a-40b8-ba06-13128cd7878a-config-out\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.297141 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.297033 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-web-config\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.297141 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.297090 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ml6d2\" (UniqueName: \"kubernetes.io/projected/f7b4c044-a14a-40b8-ba06-13128cd7878a-kube-api-access-ml6d2\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.297141 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.297137 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b4c044-a14a-40b8-ba06-13128cd7878a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.297291 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.297171 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.297291 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.297213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b4c044-a14a-40b8-ba06-13128cd7878a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.297291 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.297241 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.297291 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.297276 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.297487 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.297328 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.297487 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.297368 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.297487 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.297394 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f7b4c044-a14a-40b8-ba06-13128cd7878a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.297487 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.297434 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.297717 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.297524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.297717 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.297553 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-config\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.297717 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.297576 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b4c044-a14a-40b8-ba06-13128cd7878a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.297717 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.297610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7b4c044-a14a-40b8-ba06-13128cd7878a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.297717 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.297635 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7b4c044-a14a-40b8-ba06-13128cd7878a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.299496 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.298529 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7b4c044-a14a-40b8-ba06-13128cd7878a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.299623 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.299587 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b4c044-a14a-40b8-ba06-13128cd7878a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:26:28.299999 ip-10-0-135-200 kubenswrapper[2577]: I0420
16:26:28.299871 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f7b4c044-a14a-40b8-ba06-13128cd7878a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 16:26:28.300749 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.300428 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 16:26:28.300849 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.300769 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-web-config\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 16:26:28.301662 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.301306 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b4c044-a14a-40b8-ba06-13128cd7878a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 16:26:28.301662 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.301411 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7b4c044-a14a-40b8-ba06-13128cd7878a-config-out\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 16:26:28.301662 ip-10-0-135-200 kubenswrapper[2577]: 
I0420 16:26:28.301489 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b4c044-a14a-40b8-ba06-13128cd7878a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 16:26:28.302156 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.302135 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 16:26:28.302346 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.302323 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f7b4c044-a14a-40b8-ba06-13128cd7878a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 16:26:28.302793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.302770 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 16:26:28.303957 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.303916 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 16:26:28.304041 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.303968 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 16:26:28.304041 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.304001 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7b4c044-a14a-40b8-ba06-13128cd7878a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 16:26:28.304270 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.304249 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 16:26:28.305310 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.305289 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-config\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 16:26:28.305438 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.305417 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7b4c044-a14a-40b8-ba06-13128cd7878a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 
16:26:28.307366 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.307342 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml6d2\" (UniqueName: \"kubernetes.io/projected/f7b4c044-a14a-40b8-ba06-13128cd7878a-kube-api-access-ml6d2\") pod \"prometheus-k8s-0\" (UID: \"f7b4c044-a14a-40b8-ba06-13128cd7878a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 16:26:28.409995 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.409961 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 16:26:28.565169 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.565089 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" event={"ID":"a6118926-0076-4fd9-aa1f-aa052f9810d1","Type":"ContainerStarted","Data":"86811a361d4f80f9a680680d90bced74d2dfcbeafbec67ca15263628f0959e7f"} Apr 20 16:26:28.565169 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.565136 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" event={"ID":"a6118926-0076-4fd9-aa1f-aa052f9810d1","Type":"ContainerStarted","Data":"4f70c8299ed313af9bf202bf110c92e284cfda57062dd745eefd570b6d1b9b9b"} Apr 20 16:26:28.565169 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.565150 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" event={"ID":"a6118926-0076-4fd9-aa1f-aa052f9810d1","Type":"ContainerStarted","Data":"f33a92f2f2be1e2173946ea1fa4cc91943b4426197e19aaf312e8f1379c35563"} Apr 20 16:26:28.566606 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.566574 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sn85c" event={"ID":"20bbc6cc-406c-47b9-b3dc-02361db2b16e","Type":"ContainerStarted","Data":"a32c22a35e5e283f23b302b7089183ae17c63d84a0c4297fa800ce0c30fe59e2"} Apr 20 
16:26:28.627807 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:28.627776 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 16:26:28.888023 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:26:28.887992 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7b4c044_a14a_40b8_ba06_13128cd7878a.slice/crio-7945bced12f4775dcdafc9b27efb48d3b863f677a5ffee01dc7d921eb16cde92 WatchSource:0}: Error finding container 7945bced12f4775dcdafc9b27efb48d3b863f677a5ffee01dc7d921eb16cde92: Status 404 returned error can't find the container with id 7945bced12f4775dcdafc9b27efb48d3b863f677a5ffee01dc7d921eb16cde92 Apr 20 16:26:29.575119 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:29.575077 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" event={"ID":"a6118926-0076-4fd9-aa1f-aa052f9810d1","Type":"ContainerStarted","Data":"1fe79f53826f0356155afbc46a1c5943d248df351256c56b98b6bd2f755dc8ab"} Apr 20 16:26:29.575119 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:29.575119 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" event={"ID":"a6118926-0076-4fd9-aa1f-aa052f9810d1","Type":"ContainerStarted","Data":"d6633b9212fcebe215f790bae448dc08d9d9173923b66bdb6e49f4fabedb2c93"} Apr 20 16:26:29.575592 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:29.575133 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" event={"ID":"a6118926-0076-4fd9-aa1f-aa052f9810d1","Type":"ContainerStarted","Data":"720be0af5bc5000d0d5c1a4c12fed6105bba03dad4c6929a06be2ad1660e7304"} Apr 20 16:26:29.575592 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:29.575302 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 
20 16:26:29.577342 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:29.577317 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sn85c" event={"ID":"20bbc6cc-406c-47b9-b3dc-02361db2b16e","Type":"ContainerStarted","Data":"e6deb6ce75d48b331ecfe552d88094d6cbb2a07171ac4ce1498b24153bc9fdab"} Apr 20 16:26:29.577518 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:29.577494 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sn85c" Apr 20 16:26:29.578447 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:29.578404 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7b4c044-a14a-40b8-ba06-13128cd7878a","Type":"ContainerStarted","Data":"7945bced12f4775dcdafc9b27efb48d3b863f677a5ffee01dc7d921eb16cde92"} Apr 20 16:26:29.584153 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:29.584116 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sn85c" Apr 20 16:26:29.599185 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:29.599148 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" podStartSLOduration=2.010414838 podStartE2EDuration="5.59913177s" podCreationTimestamp="2026-04-20 16:26:24 +0000 UTC" firstStartedPulling="2026-04-20 16:26:25.299987955 +0000 UTC m=+189.879557948" lastFinishedPulling="2026-04-20 16:26:28.888704887 +0000 UTC m=+193.468274880" observedRunningTime="2026-04-20 16:26:29.596660451 +0000 UTC m=+194.176230465" watchObservedRunningTime="2026-04-20 16:26:29.59913177 +0000 UTC m=+194.178701805" Apr 20 16:26:29.612186 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:29.612147 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sn85c" 
podStartSLOduration=2.371178076 podStartE2EDuration="3.612135767s" podCreationTimestamp="2026-04-20 16:26:26 +0000 UTC" firstStartedPulling="2026-04-20 16:26:27.69391392 +0000 UTC m=+192.273483916" lastFinishedPulling="2026-04-20 16:26:28.9348716 +0000 UTC m=+193.514441607" observedRunningTime="2026-04-20 16:26:29.610593629 +0000 UTC m=+194.190163644" watchObservedRunningTime="2026-04-20 16:26:29.612135767 +0000 UTC m=+194.191705782" Apr 20 16:26:30.582943 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:30.582902 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7b4c044-a14a-40b8-ba06-13128cd7878a" containerID="51dc27dda0987466bc72df17ae408769e0db2edc3095860e79c2104697852659" exitCode=0 Apr 20 16:26:30.583362 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:30.582986 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7b4c044-a14a-40b8-ba06-13128cd7878a","Type":"ContainerDied","Data":"51dc27dda0987466bc72df17ae408769e0db2edc3095860e79c2104697852659"} Apr 20 16:26:34.599950 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:34.599870 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7b4c044-a14a-40b8-ba06-13128cd7878a","Type":"ContainerStarted","Data":"4a105525fb93245b110320bb0681ea01d30d224298d9ca06f37115a82812259b"} Apr 20 16:26:34.599950 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:34.599906 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7b4c044-a14a-40b8-ba06-13128cd7878a","Type":"ContainerStarted","Data":"bd20be25e770c0084280925ae5536fc7edb7b46f81dd97c31405555518e69620"} Apr 20 16:26:34.599950 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:34.599918 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"f7b4c044-a14a-40b8-ba06-13128cd7878a","Type":"ContainerStarted","Data":"4f7884d51f5c7a975d4b92200027c52e0937be39adbcc547591735a4df60a92f"} Apr 20 16:26:34.599950 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:34.599927 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7b4c044-a14a-40b8-ba06-13128cd7878a","Type":"ContainerStarted","Data":"3ec536123ec8a8805739403cc03768eb4bb6b9ef0f60a7cc3d85ddf2b4204ac6"} Apr 20 16:26:34.599950 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:34.599935 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7b4c044-a14a-40b8-ba06-13128cd7878a","Type":"ContainerStarted","Data":"e0dc1311923947a5e75f3e2f1153435e45c8a2b1e8fa66898e43fd7ac12d7d6c"} Apr 20 16:26:34.599950 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:34.599943 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7b4c044-a14a-40b8-ba06-13128cd7878a","Type":"ContainerStarted","Data":"809d10b35e02d11ca8a1c77253998ae0a9f17bf74ae3a5c447761bca59c101e6"} Apr 20 16:26:34.627755 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:34.627705 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.508142651 podStartE2EDuration="6.627669756s" podCreationTimestamp="2026-04-20 16:26:28 +0000 UTC" firstStartedPulling="2026-04-20 16:26:28.889782203 +0000 UTC m=+193.469352195" lastFinishedPulling="2026-04-20 16:26:34.009309293 +0000 UTC m=+198.588879300" observedRunningTime="2026-04-20 16:26:34.625602915 +0000 UTC m=+199.205172939" watchObservedRunningTime="2026-04-20 16:26:34.627669756 +0000 UTC m=+199.207239771" Apr 20 16:26:35.589004 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:35.588979 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/thanos-querier-7568dc77f9-wmsz7" Apr 20 16:26:38.410262 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:38.410232 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 16:26:43.218082 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:43.218044 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b955679d8-5h54x"] Apr 20 16:26:52.652730 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:52.652697 2577 generic.go:358] "Generic (PLEG): container finished" podID="1380f1d3-60b6-4093-a429-bd909b4729e8" containerID="fad147ce1aabe2de6fb068692ddcb5b61e708cf8aef73e80a4b0b18b43680b60" exitCode=0 Apr 20 16:26:52.653135 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:52.652755 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-k72wm" event={"ID":"1380f1d3-60b6-4093-a429-bd909b4729e8","Type":"ContainerDied","Data":"fad147ce1aabe2de6fb068692ddcb5b61e708cf8aef73e80a4b0b18b43680b60"} Apr 20 16:26:52.653194 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:52.653170 2577 scope.go:117] "RemoveContainer" containerID="fad147ce1aabe2de6fb068692ddcb5b61e708cf8aef73e80a4b0b18b43680b60" Apr 20 16:26:53.656967 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:53.656932 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-k72wm" event={"ID":"1380f1d3-60b6-4093-a429-bd909b4729e8","Type":"ContainerStarted","Data":"206c8a7e61d47af2cb6ebe2fc47147128261581b589a1b4b00a5282d01919ab8"} Apr 20 16:26:57.669011 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:57.668974 2577 generic.go:358] "Generic (PLEG): container finished" podID="f739da69-7df9-40a8-8c4e-cef36ba94452" containerID="b97eb6efbb6404e9caa81d44eecec04321df2226f29d9a87e972f5fecb9d3b33" exitCode=0 Apr 20 16:26:57.669399 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:57.669028 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd" event={"ID":"f739da69-7df9-40a8-8c4e-cef36ba94452","Type":"ContainerDied","Data":"b97eb6efbb6404e9caa81d44eecec04321df2226f29d9a87e972f5fecb9d3b33"} Apr 20 16:26:57.669399 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:57.669296 2577 scope.go:117] "RemoveContainer" containerID="b97eb6efbb6404e9caa81d44eecec04321df2226f29d9a87e972f5fecb9d3b33" Apr 20 16:26:58.673764 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:26:58.673724 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t2pxd" event={"ID":"f739da69-7df9-40a8-8c4e-cef36ba94452","Type":"ContainerStarted","Data":"fa74a1122fc3509423c7b4bb771bbeb257b6ac3b3515ee94d8939373216fb512"} Apr 20 16:27:08.238353 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.238305 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b955679d8-5h54x" podUID="b3eee4c1-6627-470c-8e02-fa958abe4e96" containerName="console" containerID="cri-o://aa2e50eabea9176912d78884afbec722170b8b380db70cc1c38f6d22441d3fae" gracePeriod=15 Apr 20 16:27:08.475201 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.475180 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b955679d8-5h54x_b3eee4c1-6627-470c-8e02-fa958abe4e96/console/0.log" Apr 20 16:27:08.475317 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.475241 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b955679d8-5h54x" Apr 20 16:27:08.549243 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.549160 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-oauth-config\") pod \"b3eee4c1-6627-470c-8e02-fa958abe4e96\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " Apr 20 16:27:08.549243 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.549205 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-trusted-ca-bundle\") pod \"b3eee4c1-6627-470c-8e02-fa958abe4e96\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " Apr 20 16:27:08.549243 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.549223 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-serving-cert\") pod \"b3eee4c1-6627-470c-8e02-fa958abe4e96\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " Apr 20 16:27:08.549491 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.549246 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-oauth-serving-cert\") pod \"b3eee4c1-6627-470c-8e02-fa958abe4e96\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " Apr 20 16:27:08.549491 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.549319 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-service-ca\") pod \"b3eee4c1-6627-470c-8e02-fa958abe4e96\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " Apr 20 16:27:08.549491 
ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.549351 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j55bq\" (UniqueName: \"kubernetes.io/projected/b3eee4c1-6627-470c-8e02-fa958abe4e96-kube-api-access-j55bq\") pod \"b3eee4c1-6627-470c-8e02-fa958abe4e96\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " Apr 20 16:27:08.549491 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.549377 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-config\") pod \"b3eee4c1-6627-470c-8e02-fa958abe4e96\" (UID: \"b3eee4c1-6627-470c-8e02-fa958abe4e96\") " Apr 20 16:27:08.549814 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.549706 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b3eee4c1-6627-470c-8e02-fa958abe4e96" (UID: "b3eee4c1-6627-470c-8e02-fa958abe4e96"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:27:08.549814 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.549793 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b3eee4c1-6627-470c-8e02-fa958abe4e96" (UID: "b3eee4c1-6627-470c-8e02-fa958abe4e96"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:27:08.549961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.549935 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-config" (OuterVolumeSpecName: "console-config") pod "b3eee4c1-6627-470c-8e02-fa958abe4e96" (UID: "b3eee4c1-6627-470c-8e02-fa958abe4e96"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:27:08.550127 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.549977 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-service-ca" (OuterVolumeSpecName: "service-ca") pod "b3eee4c1-6627-470c-8e02-fa958abe4e96" (UID: "b3eee4c1-6627-470c-8e02-fa958abe4e96"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:27:08.551763 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.551730 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3eee4c1-6627-470c-8e02-fa958abe4e96-kube-api-access-j55bq" (OuterVolumeSpecName: "kube-api-access-j55bq") pod "b3eee4c1-6627-470c-8e02-fa958abe4e96" (UID: "b3eee4c1-6627-470c-8e02-fa958abe4e96"). InnerVolumeSpecName "kube-api-access-j55bq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:27:08.551967 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.551947 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b3eee4c1-6627-470c-8e02-fa958abe4e96" (UID: "b3eee4c1-6627-470c-8e02-fa958abe4e96"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 16:27:08.551967 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.551958 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b3eee4c1-6627-470c-8e02-fa958abe4e96" (UID: "b3eee4c1-6627-470c-8e02-fa958abe4e96"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 16:27:08.650250 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.650212 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-oauth-config\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:27:08.650250 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.650246 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-trusted-ca-bundle\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:27:08.650250 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.650256 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-serving-cert\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:27:08.650498 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.650267 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-oauth-serving-cert\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:27:08.650498 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.650276 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-service-ca\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:27:08.650498 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.650285 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j55bq\" (UniqueName: \"kubernetes.io/projected/b3eee4c1-6627-470c-8e02-fa958abe4e96-kube-api-access-j55bq\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:27:08.650498 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.650293 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b3eee4c1-6627-470c-8e02-fa958abe4e96-console-config\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:27:08.706999 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.706975 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b955679d8-5h54x_b3eee4c1-6627-470c-8e02-fa958abe4e96/console/0.log"
Apr 20 16:27:08.707142 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.707011 2577 generic.go:358] "Generic (PLEG): container finished" podID="b3eee4c1-6627-470c-8e02-fa958abe4e96" containerID="aa2e50eabea9176912d78884afbec722170b8b380db70cc1c38f6d22441d3fae" exitCode=2
Apr 20 16:27:08.707142 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.707071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b955679d8-5h54x" event={"ID":"b3eee4c1-6627-470c-8e02-fa958abe4e96","Type":"ContainerDied","Data":"aa2e50eabea9176912d78884afbec722170b8b380db70cc1c38f6d22441d3fae"}
Apr 20 16:27:08.707142 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.707084 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b955679d8-5h54x"
Apr 20 16:27:08.707142 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.707099 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b955679d8-5h54x" event={"ID":"b3eee4c1-6627-470c-8e02-fa958abe4e96","Type":"ContainerDied","Data":"786604f446b994aa55e6e6b633460367e9bcc2de8640b5c556c9bf71120f1cf3"}
Apr 20 16:27:08.707142 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.707115 2577 scope.go:117] "RemoveContainer" containerID="aa2e50eabea9176912d78884afbec722170b8b380db70cc1c38f6d22441d3fae"
Apr 20 16:27:08.715423 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.715405 2577 scope.go:117] "RemoveContainer" containerID="aa2e50eabea9176912d78884afbec722170b8b380db70cc1c38f6d22441d3fae"
Apr 20 16:27:08.715711 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:27:08.715671 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa2e50eabea9176912d78884afbec722170b8b380db70cc1c38f6d22441d3fae\": container with ID starting with aa2e50eabea9176912d78884afbec722170b8b380db70cc1c38f6d22441d3fae not found: ID does not exist" containerID="aa2e50eabea9176912d78884afbec722170b8b380db70cc1c38f6d22441d3fae"
Apr 20 16:27:08.715781 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.715718 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2e50eabea9176912d78884afbec722170b8b380db70cc1c38f6d22441d3fae"} err="failed to get container status \"aa2e50eabea9176912d78884afbec722170b8b380db70cc1c38f6d22441d3fae\": rpc error: code = NotFound desc = could not find container \"aa2e50eabea9176912d78884afbec722170b8b380db70cc1c38f6d22441d3fae\": container with ID starting with aa2e50eabea9176912d78884afbec722170b8b380db70cc1c38f6d22441d3fae not found: ID does not exist"
Apr 20 16:27:08.728050 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.728016 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b955679d8-5h54x"]
Apr 20 16:27:08.731150 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:08.731124 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b955679d8-5h54x"]
Apr 20 16:27:10.007174 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:10.007142 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3eee4c1-6627-470c-8e02-fa958abe4e96" path="/var/lib/kubelet/pods/b3eee4c1-6627-470c-8e02-fa958abe4e96/volumes"
Apr 20 16:27:27.702831 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:27.702799 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs\") pod \"network-metrics-daemon-mpvsq\" (UID: \"ec42a0e4-ff1e-48d5-8b45-fab851d223a4\") " pod="openshift-multus/network-metrics-daemon-mpvsq"
Apr 20 16:27:27.705358 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:27.705332 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec42a0e4-ff1e-48d5-8b45-fab851d223a4-metrics-certs\") pod \"network-metrics-daemon-mpvsq\" (UID: \"ec42a0e4-ff1e-48d5-8b45-fab851d223a4\") " pod="openshift-multus/network-metrics-daemon-mpvsq"
Apr 20 16:27:28.006364 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:28.006283 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xjxmn\""
Apr 20 16:27:28.014509 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:28.014487 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mpvsq"
Apr 20 16:27:28.132370 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:28.132339 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mpvsq"]
Apr 20 16:27:28.137911 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:27:28.137884 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec42a0e4_ff1e_48d5_8b45_fab851d223a4.slice/crio-ba3148d2707b554ea38e6d45225f35f18a6dfdcc7982682d2364e65e599a67f1 WatchSource:0}: Error finding container ba3148d2707b554ea38e6d45225f35f18a6dfdcc7982682d2364e65e599a67f1: Status 404 returned error can't find the container with id ba3148d2707b554ea38e6d45225f35f18a6dfdcc7982682d2364e65e599a67f1
Apr 20 16:27:28.410198 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:28.410162 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:27:28.430024 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:28.430001 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:27:28.764338 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:28.764252 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mpvsq" event={"ID":"ec42a0e4-ff1e-48d5-8b45-fab851d223a4","Type":"ContainerStarted","Data":"ba3148d2707b554ea38e6d45225f35f18a6dfdcc7982682d2364e65e599a67f1"}
Apr 20 16:27:28.779955 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:28.779930 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 16:27:29.769085 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:29.769047 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mpvsq" event={"ID":"ec42a0e4-ff1e-48d5-8b45-fab851d223a4","Type":"ContainerStarted","Data":"d96383c7f9fc2661d97e226efe289fd7b2e5b63abe364e7ae3bdac9cb5bba95f"}
Apr 20 16:27:29.769085 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:29.769085 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mpvsq" event={"ID":"ec42a0e4-ff1e-48d5-8b45-fab851d223a4","Type":"ContainerStarted","Data":"a655da9da1ee281b8ac25aecf5b6b4c3bc3b818d870525c84905fe202b051162"}
Apr 20 16:27:29.783807 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:27:29.783759 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mpvsq" podStartSLOduration=252.864649716 podStartE2EDuration="4m13.783742456s" podCreationTimestamp="2026-04-20 16:23:16 +0000 UTC" firstStartedPulling="2026-04-20 16:27:28.140044947 +0000 UTC m=+252.719614941" lastFinishedPulling="2026-04-20 16:27:29.059137684 +0000 UTC m=+253.638707681" observedRunningTime="2026-04-20 16:27:29.78348281 +0000 UTC m=+254.363052835" watchObservedRunningTime="2026-04-20 16:27:29.783742456 +0000 UTC m=+254.363312472"
Apr 20 16:28:15.886773 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:28:15.886741 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xv6j6_1ddd4906-e010-4e9e-89d5-6017138ff6a9/console-operator/1.log"
Apr 20 16:28:15.887391 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:28:15.887187 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xv6j6_1ddd4906-e010-4e9e-89d5-6017138ff6a9/console-operator/1.log"
Apr 20 16:28:15.890129 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:28:15.890102 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/ovn-acl-logging/0.log"
Apr 20 16:28:15.890546 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:28:15.890524 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/ovn-acl-logging/0.log"
Apr 20 16:28:15.896594 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:28:15.896575 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 16:29:22.356240 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.356164 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-nxx69"]
Apr 20 16:29:22.356636 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.356475 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3eee4c1-6627-470c-8e02-fa958abe4e96" containerName="console"
Apr 20 16:29:22.356636 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.356487 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3eee4c1-6627-470c-8e02-fa958abe4e96" containerName="console"
Apr 20 16:29:22.356636 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.356536 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3eee4c1-6627-470c-8e02-fa958abe4e96" containerName="console"
Apr 20 16:29:22.359485 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.359469 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-nxx69"
Apr 20 16:29:22.361868 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.361851 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 20 16:29:22.361974 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.361852 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 20 16:29:22.361974 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.361939 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-4m49s\""
Apr 20 16:29:22.374149 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.374121 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-nxx69"]
Apr 20 16:29:22.477566 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.477536 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/927af189-fcc0-4edf-8771-e6af9efdee29-bound-sa-token\") pod \"cert-manager-759f64656b-nxx69\" (UID: \"927af189-fcc0-4edf-8771-e6af9efdee29\") " pod="cert-manager/cert-manager-759f64656b-nxx69"
Apr 20 16:29:22.477566 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.477569 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8swjt\" (UniqueName: \"kubernetes.io/projected/927af189-fcc0-4edf-8771-e6af9efdee29-kube-api-access-8swjt\") pod \"cert-manager-759f64656b-nxx69\" (UID: \"927af189-fcc0-4edf-8771-e6af9efdee29\") " pod="cert-manager/cert-manager-759f64656b-nxx69"
Apr 20 16:29:22.578840 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.578805 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/927af189-fcc0-4edf-8771-e6af9efdee29-bound-sa-token\") pod \"cert-manager-759f64656b-nxx69\" (UID: \"927af189-fcc0-4edf-8771-e6af9efdee29\") " pod="cert-manager/cert-manager-759f64656b-nxx69"
Apr 20 16:29:22.578983 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.578849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8swjt\" (UniqueName: \"kubernetes.io/projected/927af189-fcc0-4edf-8771-e6af9efdee29-kube-api-access-8swjt\") pod \"cert-manager-759f64656b-nxx69\" (UID: \"927af189-fcc0-4edf-8771-e6af9efdee29\") " pod="cert-manager/cert-manager-759f64656b-nxx69"
Apr 20 16:29:22.587194 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.587164 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/927af189-fcc0-4edf-8771-e6af9efdee29-bound-sa-token\") pod \"cert-manager-759f64656b-nxx69\" (UID: \"927af189-fcc0-4edf-8771-e6af9efdee29\") " pod="cert-manager/cert-manager-759f64656b-nxx69"
Apr 20 16:29:22.587307 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.587233 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8swjt\" (UniqueName: \"kubernetes.io/projected/927af189-fcc0-4edf-8771-e6af9efdee29-kube-api-access-8swjt\") pod \"cert-manager-759f64656b-nxx69\" (UID: \"927af189-fcc0-4edf-8771-e6af9efdee29\") " pod="cert-manager/cert-manager-759f64656b-nxx69"
Apr 20 16:29:22.683060 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.683034 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-nxx69"
Apr 20 16:29:22.800896 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.800863 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-nxx69"]
Apr 20 16:29:22.805124 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:29:22.805095 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod927af189_fcc0_4edf_8771_e6af9efdee29.slice/crio-68cb5f7c9e51e4903a0209a790ddb5e21b6cc4cb1adff6a05734555c9e4e8676 WatchSource:0}: Error finding container 68cb5f7c9e51e4903a0209a790ddb5e21b6cc4cb1adff6a05734555c9e4e8676: Status 404 returned error can't find the container with id 68cb5f7c9e51e4903a0209a790ddb5e21b6cc4cb1adff6a05734555c9e4e8676
Apr 20 16:29:22.806963 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:22.806945 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 16:29:23.078909 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:23.078837 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-nxx69" event={"ID":"927af189-fcc0-4edf-8771-e6af9efdee29","Type":"ContainerStarted","Data":"68cb5f7c9e51e4903a0209a790ddb5e21b6cc4cb1adff6a05734555c9e4e8676"}
Apr 20 16:29:26.088564 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:26.088527 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-nxx69" event={"ID":"927af189-fcc0-4edf-8771-e6af9efdee29","Type":"ContainerStarted","Data":"49e9cc2bb285ce9a7d8aaeb9fcad5f0590d85118bf71d7f9d128d45d5df6033e"}
Apr 20 16:29:26.104884 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:26.104838 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-nxx69" podStartSLOduration=1.045135749 podStartE2EDuration="4.104825864s" podCreationTimestamp="2026-04-20 16:29:22 +0000 UTC" firstStartedPulling="2026-04-20 16:29:22.807102968 +0000 UTC m=+367.386672962" lastFinishedPulling="2026-04-20 16:29:25.866793083 +0000 UTC m=+370.446363077" observedRunningTime="2026-04-20 16:29:26.103209229 +0000 UTC m=+370.682779238" watchObservedRunningTime="2026-04-20 16:29:26.104825864 +0000 UTC m=+370.684395879"
Apr 20 16:29:34.956135 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:34.956099 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll"]
Apr 20 16:29:34.959413 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:34.959397 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll"
Apr 20 16:29:34.962083 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:34.962062 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 20 16:29:34.962182 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:34.962087 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 20 16:29:34.962542 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:34.962527 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 20 16:29:34.962841 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:34.962821 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 20 16:29:34.962920 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:34.962861 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-kwhtg\""
Apr 20 16:29:34.974354 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:34.974330 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll"]
Apr 20 16:29:35.085511 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:35.085483 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4602079f-9989-439c-b82f-627276b14013-apiservice-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-zpqll\" (UID: \"4602079f-9989-439c-b82f-627276b14013\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll"
Apr 20 16:29:35.085722 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:35.085528 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdcsx\" (UniqueName: \"kubernetes.io/projected/4602079f-9989-439c-b82f-627276b14013-kube-api-access-vdcsx\") pod \"opendatahub-operator-controller-manager-59c64b9875-zpqll\" (UID: \"4602079f-9989-439c-b82f-627276b14013\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll"
Apr 20 16:29:35.085722 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:35.085564 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4602079f-9989-439c-b82f-627276b14013-webhook-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-zpqll\" (UID: \"4602079f-9989-439c-b82f-627276b14013\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll"
Apr 20 16:29:35.186247 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:35.186216 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdcsx\" (UniqueName: \"kubernetes.io/projected/4602079f-9989-439c-b82f-627276b14013-kube-api-access-vdcsx\") pod \"opendatahub-operator-controller-manager-59c64b9875-zpqll\" (UID: \"4602079f-9989-439c-b82f-627276b14013\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll"
Apr 20 16:29:35.186247 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:35.186251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4602079f-9989-439c-b82f-627276b14013-webhook-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-zpqll\" (UID: \"4602079f-9989-439c-b82f-627276b14013\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll"
Apr 20 16:29:35.186415 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:35.186308 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4602079f-9989-439c-b82f-627276b14013-apiservice-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-zpqll\" (UID: \"4602079f-9989-439c-b82f-627276b14013\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll"
Apr 20 16:29:35.188957 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:35.188930 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4602079f-9989-439c-b82f-627276b14013-apiservice-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-zpqll\" (UID: \"4602079f-9989-439c-b82f-627276b14013\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll"
Apr 20 16:29:35.189054 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:35.188993 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4602079f-9989-439c-b82f-627276b14013-webhook-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-zpqll\" (UID: \"4602079f-9989-439c-b82f-627276b14013\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll"
Apr 20 16:29:35.198629 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:35.198612 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdcsx\" (UniqueName: \"kubernetes.io/projected/4602079f-9989-439c-b82f-627276b14013-kube-api-access-vdcsx\") pod \"opendatahub-operator-controller-manager-59c64b9875-zpqll\" (UID: \"4602079f-9989-439c-b82f-627276b14013\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll"
Apr 20 16:29:35.269642 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:35.269579 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll"
Apr 20 16:29:35.393369 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:35.393276 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll"]
Apr 20 16:29:35.396223 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:29:35.396189 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4602079f_9989_439c_b82f_627276b14013.slice/crio-266068bed6192094784050f0dff8a20eb69fe827fa04babb66002181b7dca3fd WatchSource:0}: Error finding container 266068bed6192094784050f0dff8a20eb69fe827fa04babb66002181b7dca3fd: Status 404 returned error can't find the container with id 266068bed6192094784050f0dff8a20eb69fe827fa04babb66002181b7dca3fd
Apr 20 16:29:36.122499 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:36.122455 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll" event={"ID":"4602079f-9989-439c-b82f-627276b14013","Type":"ContainerStarted","Data":"266068bed6192094784050f0dff8a20eb69fe827fa04babb66002181b7dca3fd"}
Apr 20 16:29:38.130258 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:38.130221 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll" event={"ID":"4602079f-9989-439c-b82f-627276b14013","Type":"ContainerStarted","Data":"ecf9a0ddcb3c08e0522f55a43c568013eb9760094391dcdaea31e5b6a7dab282"}
Apr 20 16:29:38.130745 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:38.130280 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll"
Apr 20 16:29:38.154511 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:38.154451 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll" podStartSLOduration=1.7467720610000002 podStartE2EDuration="4.154437151s" podCreationTimestamp="2026-04-20 16:29:34 +0000 UTC" firstStartedPulling="2026-04-20 16:29:35.397839105 +0000 UTC m=+379.977409099" lastFinishedPulling="2026-04-20 16:29:37.805503999 +0000 UTC m=+382.385074189" observedRunningTime="2026-04-20 16:29:38.152277398 +0000 UTC m=+382.731847414" watchObservedRunningTime="2026-04-20 16:29:38.154437151 +0000 UTC m=+382.734007166"
Apr 20 16:29:44.778868 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.778835 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"]
Apr 20 16:29:44.782115 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.782099 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"
Apr 20 16:29:44.785099 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.785076 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 20 16:29:44.786437 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.786419 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 20 16:29:44.786664 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.786647 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 20 16:29:44.786789 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.786663 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-qs2rf\""
Apr 20 16:29:44.786789 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.786663 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 20 16:29:44.786789 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.786710 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 20 16:29:44.798723 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.798704 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"]
Apr 20 16:29:44.867365 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.867332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ed8847c-5faf-4b2b-bb88-57e8bff71d38-cert\") pod \"lws-controller-manager-d6fdb785c-g6pw5\" (UID: \"0ed8847c-5faf-4b2b-bb88-57e8bff71d38\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"
Apr 20 16:29:44.867782 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.867371 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2vn2\" (UniqueName: \"kubernetes.io/projected/0ed8847c-5faf-4b2b-bb88-57e8bff71d38-kube-api-access-h2vn2\") pod \"lws-controller-manager-d6fdb785c-g6pw5\" (UID: \"0ed8847c-5faf-4b2b-bb88-57e8bff71d38\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"
Apr 20 16:29:44.867782 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.867397 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ed8847c-5faf-4b2b-bb88-57e8bff71d38-metrics-cert\") pod \"lws-controller-manager-d6fdb785c-g6pw5\" (UID: \"0ed8847c-5faf-4b2b-bb88-57e8bff71d38\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"
Apr 20 16:29:44.867782 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.867503 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0ed8847c-5faf-4b2b-bb88-57e8bff71d38-manager-config\") pod \"lws-controller-manager-d6fdb785c-g6pw5\" (UID: \"0ed8847c-5faf-4b2b-bb88-57e8bff71d38\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"
Apr 20 16:29:44.968076 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.968038 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ed8847c-5faf-4b2b-bb88-57e8bff71d38-cert\") pod \"lws-controller-manager-d6fdb785c-g6pw5\" (UID: \"0ed8847c-5faf-4b2b-bb88-57e8bff71d38\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"
Apr 20 16:29:44.968226 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.968091 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2vn2\" (UniqueName: \"kubernetes.io/projected/0ed8847c-5faf-4b2b-bb88-57e8bff71d38-kube-api-access-h2vn2\") pod \"lws-controller-manager-d6fdb785c-g6pw5\" (UID: \"0ed8847c-5faf-4b2b-bb88-57e8bff71d38\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"
Apr 20 16:29:44.968226 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.968130 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ed8847c-5faf-4b2b-bb88-57e8bff71d38-metrics-cert\") pod \"lws-controller-manager-d6fdb785c-g6pw5\" (UID: \"0ed8847c-5faf-4b2b-bb88-57e8bff71d38\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"
Apr 20 16:29:44.968340 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.968307 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0ed8847c-5faf-4b2b-bb88-57e8bff71d38-manager-config\") pod \"lws-controller-manager-d6fdb785c-g6pw5\" (UID: \"0ed8847c-5faf-4b2b-bb88-57e8bff71d38\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"
Apr 20 16:29:44.968892 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.968871 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0ed8847c-5faf-4b2b-bb88-57e8bff71d38-manager-config\") pod \"lws-controller-manager-d6fdb785c-g6pw5\" (UID: \"0ed8847c-5faf-4b2b-bb88-57e8bff71d38\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"
Apr 20 16:29:44.970791 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.970763 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ed8847c-5faf-4b2b-bb88-57e8bff71d38-cert\") pod \"lws-controller-manager-d6fdb785c-g6pw5\" (UID: \"0ed8847c-5faf-4b2b-bb88-57e8bff71d38\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"
Apr 20 16:29:44.970876 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.970816 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ed8847c-5faf-4b2b-bb88-57e8bff71d38-metrics-cert\") pod \"lws-controller-manager-d6fdb785c-g6pw5\" (UID: \"0ed8847c-5faf-4b2b-bb88-57e8bff71d38\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"
Apr 20 16:29:44.984996 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:44.984972 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2vn2\" (UniqueName: \"kubernetes.io/projected/0ed8847c-5faf-4b2b-bb88-57e8bff71d38-kube-api-access-h2vn2\") pod \"lws-controller-manager-d6fdb785c-g6pw5\" (UID: \"0ed8847c-5faf-4b2b-bb88-57e8bff71d38\") " pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"
Apr 20 16:29:45.091408 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:45.091326 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"
Apr 20 16:29:45.213253 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:45.213229 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"]
Apr 20 16:29:45.215723 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:29:45.215693 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ed8847c_5faf_4b2b_bb88_57e8bff71d38.slice/crio-fbafb85eaf396f8cec0f9737941965238a0fb86ff1215b514abf40ed4dc2ddeb WatchSource:0}: Error finding container fbafb85eaf396f8cec0f9737941965238a0fb86ff1215b514abf40ed4dc2ddeb: Status 404 returned error can't find the container with id fbafb85eaf396f8cec0f9737941965238a0fb86ff1215b514abf40ed4dc2ddeb
Apr 20 16:29:46.159395 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:46.159358 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5" event={"ID":"0ed8847c-5faf-4b2b-bb88-57e8bff71d38","Type":"ContainerStarted","Data":"fbafb85eaf396f8cec0f9737941965238a0fb86ff1215b514abf40ed4dc2ddeb"}
Apr 20 16:29:48.168379 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:48.168343 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5" event={"ID":"0ed8847c-5faf-4b2b-bb88-57e8bff71d38","Type":"ContainerStarted","Data":"cc54abf5944df899dadb4a21f33443917d651eef9ae0e01871a78fbda2a195ac"}
Apr 20 16:29:48.168787 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:48.168405 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"
Apr 20 16:29:48.185378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:48.185331 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5" podStartSLOduration=1.429085046 podStartE2EDuration="4.185316165s" podCreationTimestamp="2026-04-20 16:29:44 +0000 UTC" firstStartedPulling="2026-04-20 16:29:45.217498158 +0000 UTC m=+389.797068151" lastFinishedPulling="2026-04-20 16:29:47.973729268 +0000 UTC m=+392.553299270" observedRunningTime="2026-04-20 16:29:48.184084246 +0000 UTC m=+392.763654272" watchObservedRunningTime="2026-04-20 16:29:48.185316165 +0000 UTC m=+392.764886180"
Apr 20 16:29:49.136500 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:49.136468 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-zpqll"
Apr 20 16:29:59.173887 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:29:59.173856 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-d6fdb785c-g6pw5"
Apr 20 16:30:42.284026 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.283992 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg"]
Apr 20 16:30:42.287262 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.287240 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg"
Apr 20 16:30:42.290036 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.290003 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 20 16:30:42.290169 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.290087 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-g2nzf\""
Apr 20 16:30:42.290169 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.290123 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 16:30:42.290470 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.290453 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 16:30:42.299550 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.299514 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg"]
Apr 20 16:30:42.423412 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.423364 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/b09bbc22-938c-4dab-af76-b53bae73362a-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg"
Apr 20 16:30:42.423412 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.423417 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName:
\"kubernetes.io/empty-dir/b09bbc22-938c-4dab-af76-b53bae73362a-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.423669 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.423483 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/b09bbc22-938c-4dab-af76-b53bae73362a-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.423669 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.423509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/b09bbc22-938c-4dab-af76-b53bae73362a-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.423669 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.423568 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b09bbc22-938c-4dab-af76-b53bae73362a-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.423669 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.423629 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4rxsv\" (UniqueName: \"kubernetes.io/projected/b09bbc22-938c-4dab-af76-b53bae73362a-kube-api-access-4rxsv\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.423901 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.423727 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/b09bbc22-938c-4dab-af76-b53bae73362a-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.423901 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.423761 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/b09bbc22-938c-4dab-af76-b53bae73362a-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.423901 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.423818 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b09bbc22-938c-4dab-af76-b53bae73362a-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.524649 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.524610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/b09bbc22-938c-4dab-af76-b53bae73362a-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.524649 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.524653 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/b09bbc22-938c-4dab-af76-b53bae73362a-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.524900 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.524702 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b09bbc22-938c-4dab-af76-b53bae73362a-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.524900 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.524736 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rxsv\" (UniqueName: \"kubernetes.io/projected/b09bbc22-938c-4dab-af76-b53bae73362a-kube-api-access-4rxsv\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.524900 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.524797 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/b09bbc22-938c-4dab-af76-b53bae73362a-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.524900 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.524826 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/b09bbc22-938c-4dab-af76-b53bae73362a-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.524900 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.524871 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b09bbc22-938c-4dab-af76-b53bae73362a-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.525139 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.524909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/b09bbc22-938c-4dab-af76-b53bae73362a-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.525139 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.524937 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/b09bbc22-938c-4dab-af76-b53bae73362a-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.525239 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.525195 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/b09bbc22-938c-4dab-af76-b53bae73362a-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.525469 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.525352 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/b09bbc22-938c-4dab-af76-b53bae73362a-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.525743 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.525400 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/b09bbc22-938c-4dab-af76-b53bae73362a-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.525824 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.525471 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/b09bbc22-938c-4dab-af76-b53bae73362a-workload-certs\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.525824 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.525729 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/b09bbc22-938c-4dab-af76-b53bae73362a-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.527189 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.527167 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/b09bbc22-938c-4dab-af76-b53bae73362a-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.527406 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.527385 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b09bbc22-938c-4dab-af76-b53bae73362a-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.532555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.532531 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b09bbc22-938c-4dab-af76-b53bae73362a-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: 
\"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.533074 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.533053 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rxsv\" (UniqueName: \"kubernetes.io/projected/b09bbc22-938c-4dab-af76-b53bae73362a-kube-api-access-4rxsv\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg\" (UID: \"b09bbc22-938c-4dab-af76-b53bae73362a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.599013 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.598928 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:42.727580 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:42.727554 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg"] Apr 20 16:30:42.729902 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:30:42.729875 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb09bbc22_938c_4dab_af76_b53bae73362a.slice/crio-3b6b2d19bc665dc99d812338ec91367c732aadc8ee3a56190f73a915d577bf1a WatchSource:0}: Error finding container 3b6b2d19bc665dc99d812338ec91367c732aadc8ee3a56190f73a915d577bf1a: Status 404 returned error can't find the container with id 3b6b2d19bc665dc99d812338ec91367c732aadc8ee3a56190f73a915d577bf1a Apr 20 16:30:43.344423 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:43.344389 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" 
event={"ID":"b09bbc22-938c-4dab-af76-b53bae73362a","Type":"ContainerStarted","Data":"3b6b2d19bc665dc99d812338ec91367c732aadc8ee3a56190f73a915d577bf1a"} Apr 20 16:30:47.556583 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:47.556546 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 20 16:30:47.556878 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:47.556619 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 20 16:30:47.556878 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:47.556644 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 20 16:30:48.363537 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:48.363506 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" event={"ID":"b09bbc22-938c-4dab-af76-b53bae73362a","Type":"ContainerStarted","Data":"facd2f87075c564c6369f788258f1a7b78f2993c6f53b6919e2c33c698c48868"} Apr 20 16:30:48.384093 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:48.384039 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" podStartSLOduration=1.5595246619999998 podStartE2EDuration="6.384025999s" podCreationTimestamp="2026-04-20 16:30:42 +0000 UTC" firstStartedPulling="2026-04-20 16:30:42.731765094 +0000 UTC m=+447.311335087" lastFinishedPulling="2026-04-20 16:30:47.55626643 +0000 UTC m=+452.135836424" observedRunningTime="2026-04-20 16:30:48.382161642 +0000 UTC m=+452.961731656" watchObservedRunningTime="2026-04-20 16:30:48.384025999 +0000 UTC 
m=+452.963596013" Apr 20 16:30:48.599214 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:48.599180 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:48.603839 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:48.603816 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:49.367289 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:49.367256 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:49.368328 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:49.368306 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg" Apr 20 16:30:52.922767 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:52.922733 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-qfgsb"] Apr 20 16:30:52.926044 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:52.926028 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-qfgsb" Apr 20 16:30:52.928913 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:52.928890 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 16:30:52.929034 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:52.928971 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 16:30:52.930062 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:52.930045 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-5cbxq\"" Apr 20 16:30:52.933879 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:52.933837 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-qfgsb"] Apr 20 16:30:53.014985 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:53.014953 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfrpf\" (UniqueName: \"kubernetes.io/projected/783911b3-d06b-4a3b-9c07-2816d30380c7-kube-api-access-xfrpf\") pod \"kuadrant-operator-catalog-qfgsb\" (UID: \"783911b3-d06b-4a3b-9c07-2816d30380c7\") " pod="kuadrant-system/kuadrant-operator-catalog-qfgsb" Apr 20 16:30:53.116167 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:53.116131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfrpf\" (UniqueName: \"kubernetes.io/projected/783911b3-d06b-4a3b-9c07-2816d30380c7-kube-api-access-xfrpf\") pod \"kuadrant-operator-catalog-qfgsb\" (UID: \"783911b3-d06b-4a3b-9c07-2816d30380c7\") " pod="kuadrant-system/kuadrant-operator-catalog-qfgsb" Apr 20 16:30:53.124793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:53.124756 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfrpf\" (UniqueName: 
\"kubernetes.io/projected/783911b3-d06b-4a3b-9c07-2816d30380c7-kube-api-access-xfrpf\") pod \"kuadrant-operator-catalog-qfgsb\" (UID: \"783911b3-d06b-4a3b-9c07-2816d30380c7\") " pod="kuadrant-system/kuadrant-operator-catalog-qfgsb" Apr 20 16:30:53.237078 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:53.236987 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-qfgsb" Apr 20 16:30:53.289316 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:53.289284 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-qfgsb"] Apr 20 16:30:53.359956 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:53.359877 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-qfgsb"] Apr 20 16:30:53.362946 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:30:53.362915 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod783911b3_d06b_4a3b_9c07_2816d30380c7.slice/crio-18ec405cf0110d9d7ffff3879acac7894e0dec9da5664d0fde075b8e4c7303a3 WatchSource:0}: Error finding container 18ec405cf0110d9d7ffff3879acac7894e0dec9da5664d0fde075b8e4c7303a3: Status 404 returned error can't find the container with id 18ec405cf0110d9d7ffff3879acac7894e0dec9da5664d0fde075b8e4c7303a3 Apr 20 16:30:53.380143 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:53.380108 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-qfgsb" event={"ID":"783911b3-d06b-4a3b-9c07-2816d30380c7","Type":"ContainerStarted","Data":"18ec405cf0110d9d7ffff3879acac7894e0dec9da5664d0fde075b8e4c7303a3"} Apr 20 16:30:53.497959 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:53.497889 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-tfs6s"] Apr 20 16:30:53.502084 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:53.502067 
2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-tfs6s" Apr 20 16:30:53.507867 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:53.507840 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-tfs6s"] Apr 20 16:30:53.621292 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:53.621261 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qdlz\" (UniqueName: \"kubernetes.io/projected/518a24e2-456f-4d66-af29-2e963b6537d8-kube-api-access-5qdlz\") pod \"kuadrant-operator-catalog-tfs6s\" (UID: \"518a24e2-456f-4d66-af29-2e963b6537d8\") " pod="kuadrant-system/kuadrant-operator-catalog-tfs6s" Apr 20 16:30:53.722831 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:53.722795 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qdlz\" (UniqueName: \"kubernetes.io/projected/518a24e2-456f-4d66-af29-2e963b6537d8-kube-api-access-5qdlz\") pod \"kuadrant-operator-catalog-tfs6s\" (UID: \"518a24e2-456f-4d66-af29-2e963b6537d8\") " pod="kuadrant-system/kuadrant-operator-catalog-tfs6s" Apr 20 16:30:53.730809 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:53.730780 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qdlz\" (UniqueName: \"kubernetes.io/projected/518a24e2-456f-4d66-af29-2e963b6537d8-kube-api-access-5qdlz\") pod \"kuadrant-operator-catalog-tfs6s\" (UID: \"518a24e2-456f-4d66-af29-2e963b6537d8\") " pod="kuadrant-system/kuadrant-operator-catalog-tfs6s" Apr 20 16:30:53.812424 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:53.812358 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-tfs6s" Apr 20 16:30:53.930903 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:53.930875 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-tfs6s"] Apr 20 16:30:53.933828 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:30:53.933804 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod518a24e2_456f_4d66_af29_2e963b6537d8.slice/crio-122a8ca2a6de3a8b684093913ee2bcc27c8f53f94e32eb6067e1094fba94af7f WatchSource:0}: Error finding container 122a8ca2a6de3a8b684093913ee2bcc27c8f53f94e32eb6067e1094fba94af7f: Status 404 returned error can't find the container with id 122a8ca2a6de3a8b684093913ee2bcc27c8f53f94e32eb6067e1094fba94af7f Apr 20 16:30:54.385114 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:54.385073 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-tfs6s" event={"ID":"518a24e2-456f-4d66-af29-2e963b6537d8","Type":"ContainerStarted","Data":"122a8ca2a6de3a8b684093913ee2bcc27c8f53f94e32eb6067e1094fba94af7f"} Apr 20 16:30:56.394247 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:56.394202 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-qfgsb" event={"ID":"783911b3-d06b-4a3b-9c07-2816d30380c7","Type":"ContainerStarted","Data":"6062aa3027e86373d00b205e1fe5fd70b5c500883cd5b8b63d2fa2dd111c9efb"} Apr 20 16:30:56.394705 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:56.394346 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-qfgsb" podUID="783911b3-d06b-4a3b-9c07-2816d30380c7" containerName="registry-server" containerID="cri-o://6062aa3027e86373d00b205e1fe5fd70b5c500883cd5b8b63d2fa2dd111c9efb" gracePeriod=2 Apr 20 16:30:56.395792 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:56.395769 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-tfs6s" event={"ID":"518a24e2-456f-4d66-af29-2e963b6537d8","Type":"ContainerStarted","Data":"e0853ace9803979edda839091c2c0812d452f7f3d48359a06364750107eda86d"} Apr 20 16:30:56.412435 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:56.412392 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-qfgsb" podStartSLOduration=1.875842188 podStartE2EDuration="4.412378491s" podCreationTimestamp="2026-04-20 16:30:52 +0000 UTC" firstStartedPulling="2026-04-20 16:30:53.364404615 +0000 UTC m=+457.943974608" lastFinishedPulling="2026-04-20 16:30:55.900940904 +0000 UTC m=+460.480510911" observedRunningTime="2026-04-20 16:30:56.410724324 +0000 UTC m=+460.990294339" watchObservedRunningTime="2026-04-20 16:30:56.412378491 +0000 UTC m=+460.991948505" Apr 20 16:30:56.427974 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:56.427932 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-tfs6s" podStartSLOduration=1.46576554 podStartE2EDuration="3.427920958s" podCreationTimestamp="2026-04-20 16:30:53 +0000 UTC" firstStartedPulling="2026-04-20 16:30:53.935108492 +0000 UTC m=+458.514678485" lastFinishedPulling="2026-04-20 16:30:55.897263893 +0000 UTC m=+460.476833903" observedRunningTime="2026-04-20 16:30:56.425396581 +0000 UTC m=+461.004966607" watchObservedRunningTime="2026-04-20 16:30:56.427920958 +0000 UTC m=+461.007490973" Apr 20 16:30:56.633648 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:56.633627 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-qfgsb" Apr 20 16:30:56.751049 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:56.750968 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfrpf\" (UniqueName: \"kubernetes.io/projected/783911b3-d06b-4a3b-9c07-2816d30380c7-kube-api-access-xfrpf\") pod \"783911b3-d06b-4a3b-9c07-2816d30380c7\" (UID: \"783911b3-d06b-4a3b-9c07-2816d30380c7\") " Apr 20 16:30:56.753181 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:56.753158 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783911b3-d06b-4a3b-9c07-2816d30380c7-kube-api-access-xfrpf" (OuterVolumeSpecName: "kube-api-access-xfrpf") pod "783911b3-d06b-4a3b-9c07-2816d30380c7" (UID: "783911b3-d06b-4a3b-9c07-2816d30380c7"). InnerVolumeSpecName "kube-api-access-xfrpf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:30:56.851949 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:56.851922 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xfrpf\" (UniqueName: \"kubernetes.io/projected/783911b3-d06b-4a3b-9c07-2816d30380c7-kube-api-access-xfrpf\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\"" Apr 20 16:30:57.400394 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:57.400355 2577 generic.go:358] "Generic (PLEG): container finished" podID="783911b3-d06b-4a3b-9c07-2816d30380c7" containerID="6062aa3027e86373d00b205e1fe5fd70b5c500883cd5b8b63d2fa2dd111c9efb" exitCode=0 Apr 20 16:30:57.400820 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:57.400410 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-qfgsb" Apr 20 16:30:57.400820 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:57.400435 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-qfgsb" event={"ID":"783911b3-d06b-4a3b-9c07-2816d30380c7","Type":"ContainerDied","Data":"6062aa3027e86373d00b205e1fe5fd70b5c500883cd5b8b63d2fa2dd111c9efb"} Apr 20 16:30:57.400820 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:57.400470 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-qfgsb" event={"ID":"783911b3-d06b-4a3b-9c07-2816d30380c7","Type":"ContainerDied","Data":"18ec405cf0110d9d7ffff3879acac7894e0dec9da5664d0fde075b8e4c7303a3"} Apr 20 16:30:57.400820 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:57.400485 2577 scope.go:117] "RemoveContainer" containerID="6062aa3027e86373d00b205e1fe5fd70b5c500883cd5b8b63d2fa2dd111c9efb" Apr 20 16:30:57.408845 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:57.408828 2577 scope.go:117] "RemoveContainer" containerID="6062aa3027e86373d00b205e1fe5fd70b5c500883cd5b8b63d2fa2dd111c9efb" Apr 20 16:30:57.409085 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:30:57.409069 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6062aa3027e86373d00b205e1fe5fd70b5c500883cd5b8b63d2fa2dd111c9efb\": container with ID starting with 6062aa3027e86373d00b205e1fe5fd70b5c500883cd5b8b63d2fa2dd111c9efb not found: ID does not exist" containerID="6062aa3027e86373d00b205e1fe5fd70b5c500883cd5b8b63d2fa2dd111c9efb" Apr 20 16:30:57.409145 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:57.409096 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6062aa3027e86373d00b205e1fe5fd70b5c500883cd5b8b63d2fa2dd111c9efb"} err="failed to get container status \"6062aa3027e86373d00b205e1fe5fd70b5c500883cd5b8b63d2fa2dd111c9efb\": rpc 
error: code = NotFound desc = could not find container \"6062aa3027e86373d00b205e1fe5fd70b5c500883cd5b8b63d2fa2dd111c9efb\": container with ID starting with 6062aa3027e86373d00b205e1fe5fd70b5c500883cd5b8b63d2fa2dd111c9efb not found: ID does not exist" Apr 20 16:30:57.419611 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:57.419590 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-qfgsb"] Apr 20 16:30:57.423667 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:57.423650 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-qfgsb"] Apr 20 16:30:58.007767 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:30:58.007739 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783911b3-d06b-4a3b-9c07-2816d30380c7" path="/var/lib/kubelet/pods/783911b3-d06b-4a3b-9c07-2816d30380c7/volumes" Apr 20 16:31:03.812778 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:03.812747 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-tfs6s" Apr 20 16:31:03.812778 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:03.812783 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-tfs6s" Apr 20 16:31:03.833779 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:03.833751 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-tfs6s" Apr 20 16:31:04.444726 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:04.444700 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-tfs6s" Apr 20 16:31:24.286791 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:24.286757 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-4sjtv"] Apr 20 16:31:24.287307 ip-10-0-135-200 
kubenswrapper[2577]: I0420 16:31:24.287151 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="783911b3-d06b-4a3b-9c07-2816d30380c7" containerName="registry-server" Apr 20 16:31:24.287307 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:24.287163 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="783911b3-d06b-4a3b-9c07-2816d30380c7" containerName="registry-server" Apr 20 16:31:24.287307 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:24.287220 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="783911b3-d06b-4a3b-9c07-2816d30380c7" containerName="registry-server" Apr 20 16:31:24.297915 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:24.297890 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-4sjtv" Apr 20 16:31:24.301850 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:24.301762 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 20 16:31:24.302217 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:24.301829 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-v5bzh\"" Apr 20 16:31:24.302563 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:24.302538 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-4sjtv"] Apr 20 16:31:24.375235 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:24.375195 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hw9v\" (UniqueName: \"kubernetes.io/projected/f0a39ad3-9c1c-45dc-a8f1-23acbd821727-kube-api-access-5hw9v\") pod \"dns-operator-controller-manager-648d5c98bc-4sjtv\" (UID: \"f0a39ad3-9c1c-45dc-a8f1-23acbd821727\") " 
pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-4sjtv" Apr 20 16:31:24.476757 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:24.476723 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hw9v\" (UniqueName: \"kubernetes.io/projected/f0a39ad3-9c1c-45dc-a8f1-23acbd821727-kube-api-access-5hw9v\") pod \"dns-operator-controller-manager-648d5c98bc-4sjtv\" (UID: \"f0a39ad3-9c1c-45dc-a8f1-23acbd821727\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-4sjtv" Apr 20 16:31:24.487059 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:24.487029 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hw9v\" (UniqueName: \"kubernetes.io/projected/f0a39ad3-9c1c-45dc-a8f1-23acbd821727-kube-api-access-5hw9v\") pod \"dns-operator-controller-manager-648d5c98bc-4sjtv\" (UID: \"f0a39ad3-9c1c-45dc-a8f1-23acbd821727\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-4sjtv" Apr 20 16:31:24.608866 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:24.608787 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-4sjtv" Apr 20 16:31:24.735624 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:24.735535 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-4sjtv"] Apr 20 16:31:24.738443 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:31:24.738414 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0a39ad3_9c1c_45dc_a8f1_23acbd821727.slice/crio-562568ed8bb4f24369f6c8ed24faa950237b58567ec42e86a9fc346961734534 WatchSource:0}: Error finding container 562568ed8bb4f24369f6c8ed24faa950237b58567ec42e86a9fc346961734534: Status 404 returned error can't find the container with id 562568ed8bb4f24369f6c8ed24faa950237b58567ec42e86a9fc346961734534 Apr 20 16:31:25.497043 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:25.496986 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-4sjtv" event={"ID":"f0a39ad3-9c1c-45dc-a8f1-23acbd821727","Type":"ContainerStarted","Data":"562568ed8bb4f24369f6c8ed24faa950237b58567ec42e86a9fc346961734534"} Apr 20 16:31:27.505889 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:27.505844 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-4sjtv" event={"ID":"f0a39ad3-9c1c-45dc-a8f1-23acbd821727","Type":"ContainerStarted","Data":"9b0d072466a75587df0379e7063e7c97dda6b257eb31bbafbd6a589d772da7e7"} Apr 20 16:31:27.506232 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:27.505966 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-4sjtv" Apr 20 16:31:27.522868 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:27.522785 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-4sjtv" podStartSLOduration=0.995025367 podStartE2EDuration="3.522771742s" podCreationTimestamp="2026-04-20 16:31:24 +0000 UTC" firstStartedPulling="2026-04-20 16:31:24.74042023 +0000 UTC m=+489.319990223" lastFinishedPulling="2026-04-20 16:31:27.268166605 +0000 UTC m=+491.847736598" observedRunningTime="2026-04-20 16:31:27.521116047 +0000 UTC m=+492.100686053" watchObservedRunningTime="2026-04-20 16:31:27.522771742 +0000 UTC m=+492.102341756" Apr 20 16:31:29.616972 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:29.616944 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf"] Apr 20 16:31:29.620324 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:29.620309 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" Apr 20 16:31:29.623033 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:29.623007 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-lk4cg\"" Apr 20 16:31:29.635233 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:29.635209 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf"] Apr 20 16:31:29.718556 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:29.718523 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ljlw\" (UniqueName: \"kubernetes.io/projected/2aeaa71d-91d0-412c-acec-5048003d3c63-kube-api-access-6ljlw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qltjf\" (UID: \"2aeaa71d-91d0-412c-acec-5048003d3c63\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" Apr 20 16:31:29.718723 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:29.718566 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2aeaa71d-91d0-412c-acec-5048003d3c63-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qltjf\" (UID: \"2aeaa71d-91d0-412c-acec-5048003d3c63\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" Apr 20 16:31:29.819373 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:29.819338 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ljlw\" (UniqueName: \"kubernetes.io/projected/2aeaa71d-91d0-412c-acec-5048003d3c63-kube-api-access-6ljlw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qltjf\" (UID: \"2aeaa71d-91d0-412c-acec-5048003d3c63\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" Apr 20 16:31:29.819539 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:29.819387 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2aeaa71d-91d0-412c-acec-5048003d3c63-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qltjf\" (UID: \"2aeaa71d-91d0-412c-acec-5048003d3c63\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" Apr 20 16:31:29.819759 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:29.819742 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2aeaa71d-91d0-412c-acec-5048003d3c63-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qltjf\" (UID: \"2aeaa71d-91d0-412c-acec-5048003d3c63\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" Apr 20 16:31:29.830236 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:29.830210 2577 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-6ljlw\" (UniqueName: \"kubernetes.io/projected/2aeaa71d-91d0-412c-acec-5048003d3c63-kube-api-access-6ljlw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qltjf\" (UID: \"2aeaa71d-91d0-412c-acec-5048003d3c63\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" Apr 20 16:31:29.930878 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:29.930847 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" Apr 20 16:31:30.064439 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:30.064409 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf"] Apr 20 16:31:30.067762 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:31:30.067726 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aeaa71d_91d0_412c_acec_5048003d3c63.slice/crio-19c210fd2cacc293668f0de7784b14e1f4a700fd1286fe4ea942e1ffe46817d9 WatchSource:0}: Error finding container 19c210fd2cacc293668f0de7784b14e1f4a700fd1286fe4ea942e1ffe46817d9: Status 404 returned error can't find the container with id 19c210fd2cacc293668f0de7784b14e1f4a700fd1286fe4ea942e1ffe46817d9 Apr 20 16:31:30.517153 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:30.517120 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" event={"ID":"2aeaa71d-91d0-412c-acec-5048003d3c63","Type":"ContainerStarted","Data":"19c210fd2cacc293668f0de7784b14e1f4a700fd1286fe4ea942e1ffe46817d9"} Apr 20 16:31:33.376831 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.376786 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d58b87c9b-lbxmv"] Apr 20 16:31:33.380171 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.380155 2577 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.387075 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.387047 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 16:31:33.387220 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.387085 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 16:31:33.387863 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.387048 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 16:31:33.387863 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.387324 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 16:31:33.387863 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.387418 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 16:31:33.387863 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.387438 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-mqch4\"" Apr 20 16:31:33.387863 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.387551 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 16:31:33.388188 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.387907 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 16:31:33.397547 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.397471 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 16:31:33.401124 ip-10-0-135-200 
kubenswrapper[2577]: I0420 16:31:33.401106 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d58b87c9b-lbxmv"] Apr 20 16:31:33.449040 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.449009 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/959a25ec-33d1-43da-883f-75b832181add-console-oauth-config\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.449040 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.449039 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dcdk\" (UniqueName: \"kubernetes.io/projected/959a25ec-33d1-43da-883f-75b832181add-kube-api-access-9dcdk\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.449239 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.449072 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/959a25ec-33d1-43da-883f-75b832181add-console-config\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.449239 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.449162 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/959a25ec-33d1-43da-883f-75b832181add-trusted-ca-bundle\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.449239 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.449187 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/959a25ec-33d1-43da-883f-75b832181add-oauth-serving-cert\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.449335 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.449256 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/959a25ec-33d1-43da-883f-75b832181add-console-serving-cert\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.449335 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.449277 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/959a25ec-33d1-43da-883f-75b832181add-service-ca\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.550639 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.550609 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/959a25ec-33d1-43da-883f-75b832181add-oauth-serving-cert\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.550823 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.550671 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/959a25ec-33d1-43da-883f-75b832181add-console-serving-cert\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " 
pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.550823 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.550729 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/959a25ec-33d1-43da-883f-75b832181add-service-ca\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.550823 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.550769 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/959a25ec-33d1-43da-883f-75b832181add-console-oauth-config\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.550823 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.550794 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dcdk\" (UniqueName: \"kubernetes.io/projected/959a25ec-33d1-43da-883f-75b832181add-kube-api-access-9dcdk\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.551067 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.550832 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/959a25ec-33d1-43da-883f-75b832181add-console-config\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.551067 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.550883 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/959a25ec-33d1-43da-883f-75b832181add-trusted-ca-bundle\") pod 
\"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.551493 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.551465 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/959a25ec-33d1-43da-883f-75b832181add-oauth-serving-cert\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.551584 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.551505 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/959a25ec-33d1-43da-883f-75b832181add-service-ca\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.551584 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.551530 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/959a25ec-33d1-43da-883f-75b832181add-console-config\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.551660 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.551622 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/959a25ec-33d1-43da-883f-75b832181add-trusted-ca-bundle\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.553902 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.553880 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/959a25ec-33d1-43da-883f-75b832181add-console-serving-cert\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.553999 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.553881 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/959a25ec-33d1-43da-883f-75b832181add-console-oauth-config\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.560394 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.560372 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dcdk\" (UniqueName: \"kubernetes.io/projected/959a25ec-33d1-43da-883f-75b832181add-kube-api-access-9dcdk\") pod \"console-6d58b87c9b-lbxmv\" (UID: \"959a25ec-33d1-43da-883f-75b832181add\") " pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.690524 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.690448 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d58b87c9b-lbxmv" Apr 20 16:31:33.820855 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:33.820810 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d58b87c9b-lbxmv"] Apr 20 16:31:33.824983 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:31:33.824952 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod959a25ec_33d1_43da_883f_75b832181add.slice/crio-2a19ab14f555b36c64595d5bd39642e036162e651803f0124c2e168a2af03afc WatchSource:0}: Error finding container 2a19ab14f555b36c64595d5bd39642e036162e651803f0124c2e168a2af03afc: Status 404 returned error can't find the container with id 2a19ab14f555b36c64595d5bd39642e036162e651803f0124c2e168a2af03afc Apr 20 16:31:34.537297 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:34.537257 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d58b87c9b-lbxmv" event={"ID":"959a25ec-33d1-43da-883f-75b832181add","Type":"ContainerStarted","Data":"21eb40bc45e518229e865c2b6ec3f956b22445e007d0d035c61ac23b207a7051"} Apr 20 16:31:34.537297 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:34.537292 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d58b87c9b-lbxmv" event={"ID":"959a25ec-33d1-43da-883f-75b832181add","Type":"ContainerStarted","Data":"2a19ab14f555b36c64595d5bd39642e036162e651803f0124c2e168a2af03afc"} Apr 20 16:31:34.558321 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:34.558108 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d58b87c9b-lbxmv" podStartSLOduration=1.558095361 podStartE2EDuration="1.558095361s" podCreationTimestamp="2026-04-20 16:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:31:34.557758907 +0000 UTC 
m=+499.137328922" watchObservedRunningTime="2026-04-20 16:31:34.558095361 +0000 UTC m=+499.137665375" Apr 20 16:31:35.945315 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:35.945286 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp"] Apr 20 16:31:35.948959 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:35.948935 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp" Apr 20 16:31:35.953115 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:35.953095 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-9h8jq\"" Apr 20 16:31:35.977635 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:35.977610 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp"] Apr 20 16:31:36.074893 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:36.074849 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb5fg\" (UniqueName: \"kubernetes.io/projected/e02f5b18-cff1-4e86-88ee-b59fe3dbda2c-kube-api-access-pb5fg\") pod \"limitador-operator-controller-manager-85c4996f8c-s45qp\" (UID: \"e02f5b18-cff1-4e86-88ee-b59fe3dbda2c\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp" Apr 20 16:31:36.175996 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:36.175974 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb5fg\" (UniqueName: \"kubernetes.io/projected/e02f5b18-cff1-4e86-88ee-b59fe3dbda2c-kube-api-access-pb5fg\") pod \"limitador-operator-controller-manager-85c4996f8c-s45qp\" (UID: \"e02f5b18-cff1-4e86-88ee-b59fe3dbda2c\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp" Apr 20 
16:31:36.192612 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:36.192584 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb5fg\" (UniqueName: \"kubernetes.io/projected/e02f5b18-cff1-4e86-88ee-b59fe3dbda2c-kube-api-access-pb5fg\") pod \"limitador-operator-controller-manager-85c4996f8c-s45qp\" (UID: \"e02f5b18-cff1-4e86-88ee-b59fe3dbda2c\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp"
Apr 20 16:31:36.262054 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:36.262026 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp"
Apr 20 16:31:36.391947 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:36.391912 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp"]
Apr 20 16:31:36.395912 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:31:36.395879 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode02f5b18_cff1_4e86_88ee_b59fe3dbda2c.slice/crio-48f14f23285ecfd6c458e50f1df4ca1c233a566f67cf3069ea558e8bd0fd6bc3 WatchSource:0}: Error finding container 48f14f23285ecfd6c458e50f1df4ca1c233a566f67cf3069ea558e8bd0fd6bc3: Status 404 returned error can't find the container with id 48f14f23285ecfd6c458e50f1df4ca1c233a566f67cf3069ea558e8bd0fd6bc3
Apr 20 16:31:36.545910 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:36.545818 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp" event={"ID":"e02f5b18-cff1-4e86-88ee-b59fe3dbda2c","Type":"ContainerStarted","Data":"48f14f23285ecfd6c458e50f1df4ca1c233a566f67cf3069ea558e8bd0fd6bc3"}
Apr 20 16:31:36.547327 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:36.547299 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" event={"ID":"2aeaa71d-91d0-412c-acec-5048003d3c63","Type":"ContainerStarted","Data":"cf72e3c6b948f06c3b7a62db0ec5b9f86e81733849248c3098063ca1ae31f97e"}
Apr 20 16:31:36.547468 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:36.547448 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf"
Apr 20 16:31:36.566933 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:36.566889 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" podStartSLOduration=1.463990724 podStartE2EDuration="7.566877161s" podCreationTimestamp="2026-04-20 16:31:29 +0000 UTC" firstStartedPulling="2026-04-20 16:31:30.070084616 +0000 UTC m=+494.649654608" lastFinishedPulling="2026-04-20 16:31:36.172971045 +0000 UTC m=+500.752541045" observedRunningTime="2026-04-20 16:31:36.564801202 +0000 UTC m=+501.144371216" watchObservedRunningTime="2026-04-20 16:31:36.566877161 +0000 UTC m=+501.146447176"
Apr 20 16:31:38.511549 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:38.511521 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-4sjtv"
Apr 20 16:31:38.557965 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:38.557924 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp" event={"ID":"e02f5b18-cff1-4e86-88ee-b59fe3dbda2c","Type":"ContainerStarted","Data":"f1292d14b952883798a30b22f48b7b4e4d8b70f24ee7297fe9ade6e58cfb3958"}
Apr 20 16:31:38.558153 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:38.558049 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp"
Apr 20 16:31:38.575176 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:38.575089 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp" podStartSLOduration=1.6581136 podStartE2EDuration="3.575074976s" podCreationTimestamp="2026-04-20 16:31:35 +0000 UTC" firstStartedPulling="2026-04-20 16:31:36.397957496 +0000 UTC m=+500.977527509" lastFinishedPulling="2026-04-20 16:31:38.314918768 +0000 UTC m=+502.894488885" observedRunningTime="2026-04-20 16:31:38.573625065 +0000 UTC m=+503.153195080" watchObservedRunningTime="2026-04-20 16:31:38.575074976 +0000 UTC m=+503.154644991"
Apr 20 16:31:43.690769 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:43.690731 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6d58b87c9b-lbxmv"
Apr 20 16:31:43.691140 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:43.690816 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d58b87c9b-lbxmv"
Apr 20 16:31:43.695421 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:43.695403 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d58b87c9b-lbxmv"
Apr 20 16:31:44.582186 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:44.582157 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d58b87c9b-lbxmv"
Apr 20 16:31:47.555174 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:47.555140 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf"
Apr 20 16:31:49.275216 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.275183 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf"]
Apr 20 16:31:49.275599 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.275389 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" podUID="2aeaa71d-91d0-412c-acec-5048003d3c63" containerName="manager" containerID="cri-o://cf72e3c6b948f06c3b7a62db0ec5b9f86e81733849248c3098063ca1ae31f97e" gracePeriod=2
Apr 20 16:31:49.277740 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.277702 2577 status_manager.go:895] "Failed to get status for pod" podUID="2aeaa71d-91d0-412c-acec-5048003d3c63" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qltjf\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:49.279160 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.279136 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf"]
Apr 20 16:31:49.299343 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.299314 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7"]
Apr 20 16:31:49.299850 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.299831 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2aeaa71d-91d0-412c-acec-5048003d3c63" containerName="manager"
Apr 20 16:31:49.299919 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.299854 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aeaa71d-91d0-412c-acec-5048003d3c63" containerName="manager"
Apr 20 16:31:49.299997 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.299984 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2aeaa71d-91d0-412c-acec-5048003d3c63" containerName="manager"
Apr 20 16:31:49.303163 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.303145 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7"
Apr 20 16:31:49.314268 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.314245 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7"]
Apr 20 16:31:49.317039 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.317017 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp"]
Apr 20 16:31:49.317297 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.317262 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp" podUID="e02f5b18-cff1-4e86-88ee-b59fe3dbda2c" containerName="manager" containerID="cri-o://f1292d14b952883798a30b22f48b7b4e4d8b70f24ee7297fe9ade6e58cfb3958" gracePeriod=2
Apr 20 16:31:49.319288 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.319270 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp"
Apr 20 16:31:49.332566 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.332354 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp"]
Apr 20 16:31:49.332566 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.332470 2577 status_manager.go:895] "Failed to get status for pod" podUID="2aeaa71d-91d0-412c-acec-5048003d3c63" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qltjf\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:49.334714 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.334612 2577 status_manager.go:895] "Failed to get status for pod" podUID="2aeaa71d-91d0-412c-acec-5048003d3c63" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qltjf\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:49.336662 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.336638 2577 status_manager.go:895] "Failed to get status for pod" podUID="e02f5b18-cff1-4e86-88ee-b59fe3dbda2c" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp" err="pods \"limitador-operator-controller-manager-85c4996f8c-s45qp\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:49.339256 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.339237 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8hwh4"]
Apr 20 16:31:49.339608 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.339593 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e02f5b18-cff1-4e86-88ee-b59fe3dbda2c" containerName="manager"
Apr 20 16:31:49.339701 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.339612 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02f5b18-cff1-4e86-88ee-b59fe3dbda2c" containerName="manager"
Apr 20 16:31:49.339785 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.339772 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e02f5b18-cff1-4e86-88ee-b59fe3dbda2c" containerName="manager"
Apr 20 16:31:49.344111 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.344092 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8hwh4"
Apr 20 16:31:49.346458 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.346430 2577 status_manager.go:895] "Failed to get status for pod" podUID="2aeaa71d-91d0-412c-acec-5048003d3c63" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qltjf\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:49.348451 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.348432 2577 status_manager.go:895] "Failed to get status for pod" podUID="e02f5b18-cff1-4e86-88ee-b59fe3dbda2c" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp" err="pods \"limitador-operator-controller-manager-85c4996f8c-s45qp\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:49.353646 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.353627 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8hwh4"]
Apr 20 16:31:49.392929 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.392905 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3475ccff-df0c-40d0-ab56-95250651fe12-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-jpcl7\" (UID: \"3475ccff-df0c-40d0-ab56-95250651fe12\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7"
Apr 20 16:31:49.393033 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.392971 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7nv5\" (UniqueName: \"kubernetes.io/projected/3475ccff-df0c-40d0-ab56-95250651fe12-kube-api-access-q7nv5\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-jpcl7\" (UID: \"3475ccff-df0c-40d0-ab56-95250651fe12\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7"
Apr 20 16:31:49.494351 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.494323 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7nv5\" (UniqueName: \"kubernetes.io/projected/3475ccff-df0c-40d0-ab56-95250651fe12-kube-api-access-q7nv5\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-jpcl7\" (UID: \"3475ccff-df0c-40d0-ab56-95250651fe12\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7"
Apr 20 16:31:49.494481 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.494375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5z24\" (UniqueName: \"kubernetes.io/projected/96fdc2c6-3908-4bdd-99db-25e367a265ec-kube-api-access-q5z24\") pod \"limitador-operator-controller-manager-85c4996f8c-8hwh4\" (UID: \"96fdc2c6-3908-4bdd-99db-25e367a265ec\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8hwh4"
Apr 20 16:31:49.494481 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.494445 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3475ccff-df0c-40d0-ab56-95250651fe12-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-jpcl7\" (UID: \"3475ccff-df0c-40d0-ab56-95250651fe12\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7"
Apr 20 16:31:49.494907 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.494884 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3475ccff-df0c-40d0-ab56-95250651fe12-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-jpcl7\" (UID: \"3475ccff-df0c-40d0-ab56-95250651fe12\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7"
Apr 20 16:31:49.503904 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.503882 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7nv5\" (UniqueName: \"kubernetes.io/projected/3475ccff-df0c-40d0-ab56-95250651fe12-kube-api-access-q7nv5\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-jpcl7\" (UID: \"3475ccff-df0c-40d0-ab56-95250651fe12\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7"
Apr 20 16:31:49.541948 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.541930 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf"
Apr 20 16:31:49.544655 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.544628 2577 status_manager.go:895] "Failed to get status for pod" podUID="2aeaa71d-91d0-412c-acec-5048003d3c63" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qltjf\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:49.545150 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.545136 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp"
Apr 20 16:31:49.546861 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.546836 2577 status_manager.go:895] "Failed to get status for pod" podUID="e02f5b18-cff1-4e86-88ee-b59fe3dbda2c" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp" err="pods \"limitador-operator-controller-manager-85c4996f8c-s45qp\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:49.548922 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.548902 2577 status_manager.go:895] "Failed to get status for pod" podUID="2aeaa71d-91d0-412c-acec-5048003d3c63" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qltjf\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:49.550827 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.550807 2577 status_manager.go:895] "Failed to get status for pod" podUID="e02f5b18-cff1-4e86-88ee-b59fe3dbda2c" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp" err="pods \"limitador-operator-controller-manager-85c4996f8c-s45qp\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:49.594856 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.594835 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5z24\" (UniqueName: \"kubernetes.io/projected/96fdc2c6-3908-4bdd-99db-25e367a265ec-kube-api-access-q5z24\") pod \"limitador-operator-controller-manager-85c4996f8c-8hwh4\" (UID: \"96fdc2c6-3908-4bdd-99db-25e367a265ec\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8hwh4"
Apr 20 16:31:49.596314 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.596290 2577 generic.go:358] "Generic (PLEG): container finished" podID="e02f5b18-cff1-4e86-88ee-b59fe3dbda2c" containerID="f1292d14b952883798a30b22f48b7b4e4d8b70f24ee7297fe9ade6e58cfb3958" exitCode=0
Apr 20 16:31:49.596382 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.596331 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp"
Apr 20 16:31:49.596460 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.596390 2577 scope.go:117] "RemoveContainer" containerID="f1292d14b952883798a30b22f48b7b4e4d8b70f24ee7297fe9ade6e58cfb3958"
Apr 20 16:31:49.597452 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.597432 2577 generic.go:358] "Generic (PLEG): container finished" podID="2aeaa71d-91d0-412c-acec-5048003d3c63" containerID="cf72e3c6b948f06c3b7a62db0ec5b9f86e81733849248c3098063ca1ae31f97e" exitCode=0
Apr 20 16:31:49.597522 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.597475 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf"
Apr 20 16:31:49.598494 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.598467 2577 status_manager.go:895] "Failed to get status for pod" podUID="2aeaa71d-91d0-412c-acec-5048003d3c63" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qltjf\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:49.600592 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.600571 2577 status_manager.go:895] "Failed to get status for pod" podUID="e02f5b18-cff1-4e86-88ee-b59fe3dbda2c" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp" err="pods \"limitador-operator-controller-manager-85c4996f8c-s45qp\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:49.602504 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.602483 2577 status_manager.go:895] "Failed to get status for pod" podUID="2aeaa71d-91d0-412c-acec-5048003d3c63" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qltjf\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:49.603428 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.603408 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5z24\" (UniqueName: \"kubernetes.io/projected/96fdc2c6-3908-4bdd-99db-25e367a265ec-kube-api-access-q5z24\") pod \"limitador-operator-controller-manager-85c4996f8c-8hwh4\" (UID: \"96fdc2c6-3908-4bdd-99db-25e367a265ec\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8hwh4"
Apr 20 16:31:49.604414 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.604390 2577 status_manager.go:895] "Failed to get status for pod" podUID="e02f5b18-cff1-4e86-88ee-b59fe3dbda2c" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp" err="pods \"limitador-operator-controller-manager-85c4996f8c-s45qp\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:49.605211 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.605196 2577 scope.go:117] "RemoveContainer" containerID="f1292d14b952883798a30b22f48b7b4e4d8b70f24ee7297fe9ade6e58cfb3958"
Apr 20 16:31:49.605457 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:31:49.605441 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1292d14b952883798a30b22f48b7b4e4d8b70f24ee7297fe9ade6e58cfb3958\": container with ID starting with f1292d14b952883798a30b22f48b7b4e4d8b70f24ee7297fe9ade6e58cfb3958 not found: ID does not exist" containerID="f1292d14b952883798a30b22f48b7b4e4d8b70f24ee7297fe9ade6e58cfb3958"
Apr 20 16:31:49.605507 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.605466 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1292d14b952883798a30b22f48b7b4e4d8b70f24ee7297fe9ade6e58cfb3958"} err="failed to get container status \"f1292d14b952883798a30b22f48b7b4e4d8b70f24ee7297fe9ade6e58cfb3958\": rpc error: code = NotFound desc = could not find container \"f1292d14b952883798a30b22f48b7b4e4d8b70f24ee7297fe9ade6e58cfb3958\": container with ID starting with f1292d14b952883798a30b22f48b7b4e4d8b70f24ee7297fe9ade6e58cfb3958 not found: ID does not exist"
Apr 20 16:31:49.605507 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.605482 2577 scope.go:117] "RemoveContainer" containerID="cf72e3c6b948f06c3b7a62db0ec5b9f86e81733849248c3098063ca1ae31f97e"
Apr 20 16:31:49.612667 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.612649 2577 scope.go:117] "RemoveContainer" containerID="cf72e3c6b948f06c3b7a62db0ec5b9f86e81733849248c3098063ca1ae31f97e"
Apr 20 16:31:49.612939 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:31:49.612921 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf72e3c6b948f06c3b7a62db0ec5b9f86e81733849248c3098063ca1ae31f97e\": container with ID starting with cf72e3c6b948f06c3b7a62db0ec5b9f86e81733849248c3098063ca1ae31f97e not found: ID does not exist" containerID="cf72e3c6b948f06c3b7a62db0ec5b9f86e81733849248c3098063ca1ae31f97e"
Apr 20 16:31:49.612987 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.612949 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf72e3c6b948f06c3b7a62db0ec5b9f86e81733849248c3098063ca1ae31f97e"} err="failed to get container status \"cf72e3c6b948f06c3b7a62db0ec5b9f86e81733849248c3098063ca1ae31f97e\": rpc error: code = NotFound desc = could not find container \"cf72e3c6b948f06c3b7a62db0ec5b9f86e81733849248c3098063ca1ae31f97e\": container with ID starting with cf72e3c6b948f06c3b7a62db0ec5b9f86e81733849248c3098063ca1ae31f97e not found: ID does not exist"
Apr 20 16:31:49.692512 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.692486 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7"
Apr 20 16:31:49.695235 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.695217 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2aeaa71d-91d0-412c-acec-5048003d3c63-extensions-socket-volume\") pod \"2aeaa71d-91d0-412c-acec-5048003d3c63\" (UID: \"2aeaa71d-91d0-412c-acec-5048003d3c63\") "
Apr 20 16:31:49.695325 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.695308 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ljlw\" (UniqueName: \"kubernetes.io/projected/2aeaa71d-91d0-412c-acec-5048003d3c63-kube-api-access-6ljlw\") pod \"2aeaa71d-91d0-412c-acec-5048003d3c63\" (UID: \"2aeaa71d-91d0-412c-acec-5048003d3c63\") "
Apr 20 16:31:49.695453 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.695334 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb5fg\" (UniqueName: \"kubernetes.io/projected/e02f5b18-cff1-4e86-88ee-b59fe3dbda2c-kube-api-access-pb5fg\") pod \"e02f5b18-cff1-4e86-88ee-b59fe3dbda2c\" (UID: \"e02f5b18-cff1-4e86-88ee-b59fe3dbda2c\") "
Apr 20 16:31:49.695715 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.695695 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aeaa71d-91d0-412c-acec-5048003d3c63-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "2aeaa71d-91d0-412c-acec-5048003d3c63" (UID: "2aeaa71d-91d0-412c-acec-5048003d3c63"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 16:31:49.697390 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.697368 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aeaa71d-91d0-412c-acec-5048003d3c63-kube-api-access-6ljlw" (OuterVolumeSpecName: "kube-api-access-6ljlw") pod "2aeaa71d-91d0-412c-acec-5048003d3c63" (UID: "2aeaa71d-91d0-412c-acec-5048003d3c63"). InnerVolumeSpecName "kube-api-access-6ljlw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:31:49.697478 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.697416 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e02f5b18-cff1-4e86-88ee-b59fe3dbda2c-kube-api-access-pb5fg" (OuterVolumeSpecName: "kube-api-access-pb5fg") pod "e02f5b18-cff1-4e86-88ee-b59fe3dbda2c" (UID: "e02f5b18-cff1-4e86-88ee-b59fe3dbda2c"). InnerVolumeSpecName "kube-api-access-pb5fg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:31:49.698534 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.698506 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8hwh4"
Apr 20 16:31:49.796399 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.796361 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6ljlw\" (UniqueName: \"kubernetes.io/projected/2aeaa71d-91d0-412c-acec-5048003d3c63-kube-api-access-6ljlw\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:31:49.796399 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.796401 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pb5fg\" (UniqueName: \"kubernetes.io/projected/e02f5b18-cff1-4e86-88ee-b59fe3dbda2c-kube-api-access-pb5fg\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:31:49.796595 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.796417 2577 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2aeaa71d-91d0-412c-acec-5048003d3c63-extensions-socket-volume\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:31:49.831126 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.831075 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7"]
Apr 20 16:31:49.835473 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:31:49.835444 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3475ccff_df0c_40d0_ab56_95250651fe12.slice/crio-f184f32e49460d09636e4aeb19add00b7c45f32a00f91d3a554bea69b245e59f WatchSource:0}: Error finding container f184f32e49460d09636e4aeb19add00b7c45f32a00f91d3a554bea69b245e59f: Status 404 returned error can't find the container with id f184f32e49460d09636e4aeb19add00b7c45f32a00f91d3a554bea69b245e59f
Apr 20 16:31:49.853737 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.853718 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8hwh4"]
Apr 20 16:31:49.856866 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:31:49.856842 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96fdc2c6_3908_4bdd_99db_25e367a265ec.slice/crio-9b73ea5f9eddc5b390a53200659ca542b6dd54b01bde24e070d80bbf81a7aa88 WatchSource:0}: Error finding container 9b73ea5f9eddc5b390a53200659ca542b6dd54b01bde24e070d80bbf81a7aa88: Status 404 returned error can't find the container with id 9b73ea5f9eddc5b390a53200659ca542b6dd54b01bde24e070d80bbf81a7aa88
Apr 20 16:31:49.909895 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.909868 2577 status_manager.go:895] "Failed to get status for pod" podUID="2aeaa71d-91d0-412c-acec-5048003d3c63" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qltjf\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:49.911867 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.911841 2577 status_manager.go:895] "Failed to get status for pod" podUID="e02f5b18-cff1-4e86-88ee-b59fe3dbda2c" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp" err="pods \"limitador-operator-controller-manager-85c4996f8c-s45qp\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:49.913971 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.913950 2577 status_manager.go:895] "Failed to get status for pod" podUID="2aeaa71d-91d0-412c-acec-5048003d3c63" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qltjf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qltjf\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:49.915844 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:49.915825 2577 status_manager.go:895] "Failed to get status for pod" podUID="e02f5b18-cff1-4e86-88ee-b59fe3dbda2c" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s45qp" err="pods \"limitador-operator-controller-manager-85c4996f8c-s45qp\" is forbidden: User \"system:node:ip-10-0-135-200.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-200.ec2.internal' and this object"
Apr 20 16:31:50.008331 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:50.008297 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aeaa71d-91d0-412c-acec-5048003d3c63" path="/var/lib/kubelet/pods/2aeaa71d-91d0-412c-acec-5048003d3c63/volumes"
Apr 20 16:31:50.008698 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:50.008659 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e02f5b18-cff1-4e86-88ee-b59fe3dbda2c" path="/var/lib/kubelet/pods/e02f5b18-cff1-4e86-88ee-b59fe3dbda2c/volumes"
Apr 20 16:31:50.602998 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:50.602909 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7" event={"ID":"3475ccff-df0c-40d0-ab56-95250651fe12","Type":"ContainerStarted","Data":"6fb1bf56f12ae34cf0cb75b3bc8b32137a104dd814798815d8887a36fbfc8dfd"}
Apr 20 16:31:50.602998 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:50.602944 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7" event={"ID":"3475ccff-df0c-40d0-ab56-95250651fe12","Type":"ContainerStarted","Data":"f184f32e49460d09636e4aeb19add00b7c45f32a00f91d3a554bea69b245e59f"}
Apr 20 16:31:50.603486 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:50.603074 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7"
Apr 20 16:31:50.604312 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:50.604286 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8hwh4" event={"ID":"96fdc2c6-3908-4bdd-99db-25e367a265ec","Type":"ContainerStarted","Data":"78e5e9d84f9b380070a08091f0e0116ee26291d5ad2bf819f28a7f8333deb23f"}
Apr 20 16:31:50.604427 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:50.604316 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8hwh4" event={"ID":"96fdc2c6-3908-4bdd-99db-25e367a265ec","Type":"ContainerStarted","Data":"9b73ea5f9eddc5b390a53200659ca542b6dd54b01bde24e070d80bbf81a7aa88"}
Apr 20 16:31:50.604468 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:50.604421 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8hwh4"
Apr 20 16:31:50.624208 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:50.624167 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7" podStartSLOduration=1.624155408 podStartE2EDuration="1.624155408s" podCreationTimestamp="2026-04-20 16:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:31:50.621699139 +0000 UTC m=+515.201269148" watchObservedRunningTime="2026-04-20 16:31:50.624155408 +0000 UTC m=+515.203725423"
Apr 20 16:31:50.637425 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:31:50.637379 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8hwh4" podStartSLOduration=1.637366408 podStartE2EDuration="1.637366408s" podCreationTimestamp="2026-04-20 16:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:31:50.635556316 +0000 UTC m=+515.215126330" watchObservedRunningTime="2026-04-20 16:31:50.637366408 +0000 UTC m=+515.216936500"
Apr 20 16:32:01.609768 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:01.609737 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8hwh4"
Apr 20 16:32:01.610143 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:01.609794 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7"
Apr 20 16:32:05.950165 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:05.950084 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7"]
Apr 20 16:32:05.950662 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:05.950318 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7" podUID="3475ccff-df0c-40d0-ab56-95250651fe12" containerName="manager" containerID="cri-o://6fb1bf56f12ae34cf0cb75b3bc8b32137a104dd814798815d8887a36fbfc8dfd" gracePeriod=10
Apr 20 16:32:06.193277 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.193250 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7"
Apr 20 16:32:06.228981 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.228902 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7nv5\" (UniqueName: \"kubernetes.io/projected/3475ccff-df0c-40d0-ab56-95250651fe12-kube-api-access-q7nv5\") pod \"3475ccff-df0c-40d0-ab56-95250651fe12\" (UID: \"3475ccff-df0c-40d0-ab56-95250651fe12\") "
Apr 20 16:32:06.229109 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.229005 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3475ccff-df0c-40d0-ab56-95250651fe12-extensions-socket-volume\") pod \"3475ccff-df0c-40d0-ab56-95250651fe12\" (UID: \"3475ccff-df0c-40d0-ab56-95250651fe12\") "
Apr 20 16:32:06.229449 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.229421 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3475ccff-df0c-40d0-ab56-95250651fe12-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "3475ccff-df0c-40d0-ab56-95250651fe12" (UID: "3475ccff-df0c-40d0-ab56-95250651fe12"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 16:32:06.231267 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.231240 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3475ccff-df0c-40d0-ab56-95250651fe12-kube-api-access-q7nv5" (OuterVolumeSpecName: "kube-api-access-q7nv5") pod "3475ccff-df0c-40d0-ab56-95250651fe12" (UID: "3475ccff-df0c-40d0-ab56-95250651fe12"). InnerVolumeSpecName "kube-api-access-q7nv5".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:32:06.285425 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.285392 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gmj9k"] Apr 20 16:32:06.285803 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.285788 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3475ccff-df0c-40d0-ab56-95250651fe12" containerName="manager" Apr 20 16:32:06.285866 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.285805 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3475ccff-df0c-40d0-ab56-95250651fe12" containerName="manager" Apr 20 16:32:06.285866 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.285854 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="3475ccff-df0c-40d0-ab56-95250651fe12" containerName="manager" Apr 20 16:32:06.288927 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.288910 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gmj9k" Apr 20 16:32:06.317426 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.317393 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gmj9k"] Apr 20 16:32:06.329875 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.329854 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gghf7\" (UniqueName: \"kubernetes.io/projected/cea376d4-3d3c-47b7-bfd4-385ea615ab95-kube-api-access-gghf7\") pod \"kuadrant-operator-controller-manager-55c7f4c975-gmj9k\" (UID: \"cea376d4-3d3c-47b7-bfd4-385ea615ab95\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gmj9k" Apr 20 16:32:06.330002 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.329892 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cea376d4-3d3c-47b7-bfd4-385ea615ab95-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-gmj9k\" (UID: \"cea376d4-3d3c-47b7-bfd4-385ea615ab95\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gmj9k" Apr 20 16:32:06.330002 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.329963 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q7nv5\" (UniqueName: \"kubernetes.io/projected/3475ccff-df0c-40d0-ab56-95250651fe12-kube-api-access-q7nv5\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\"" Apr 20 16:32:06.330002 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.329974 2577 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3475ccff-df0c-40d0-ab56-95250651fe12-extensions-socket-volume\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\"" Apr 20 16:32:06.430593 
ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.430554 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gghf7\" (UniqueName: \"kubernetes.io/projected/cea376d4-3d3c-47b7-bfd4-385ea615ab95-kube-api-access-gghf7\") pod \"kuadrant-operator-controller-manager-55c7f4c975-gmj9k\" (UID: \"cea376d4-3d3c-47b7-bfd4-385ea615ab95\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gmj9k" Apr 20 16:32:06.430782 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.430619 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cea376d4-3d3c-47b7-bfd4-385ea615ab95-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-gmj9k\" (UID: \"cea376d4-3d3c-47b7-bfd4-385ea615ab95\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gmj9k" Apr 20 16:32:06.430996 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.430976 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cea376d4-3d3c-47b7-bfd4-385ea615ab95-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-gmj9k\" (UID: \"cea376d4-3d3c-47b7-bfd4-385ea615ab95\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gmj9k" Apr 20 16:32:06.442612 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.442582 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gghf7\" (UniqueName: \"kubernetes.io/projected/cea376d4-3d3c-47b7-bfd4-385ea615ab95-kube-api-access-gghf7\") pod \"kuadrant-operator-controller-manager-55c7f4c975-gmj9k\" (UID: \"cea376d4-3d3c-47b7-bfd4-385ea615ab95\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gmj9k" Apr 20 16:32:06.599275 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.599184 2577 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gmj9k" Apr 20 16:32:06.661429 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.661396 2577 generic.go:358] "Generic (PLEG): container finished" podID="3475ccff-df0c-40d0-ab56-95250651fe12" containerID="6fb1bf56f12ae34cf0cb75b3bc8b32137a104dd814798815d8887a36fbfc8dfd" exitCode=0 Apr 20 16:32:06.661585 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.661469 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7" Apr 20 16:32:06.661585 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.661473 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7" event={"ID":"3475ccff-df0c-40d0-ab56-95250651fe12","Type":"ContainerDied","Data":"6fb1bf56f12ae34cf0cb75b3bc8b32137a104dd814798815d8887a36fbfc8dfd"} Apr 20 16:32:06.661585 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.661569 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7" event={"ID":"3475ccff-df0c-40d0-ab56-95250651fe12","Type":"ContainerDied","Data":"f184f32e49460d09636e4aeb19add00b7c45f32a00f91d3a554bea69b245e59f"} Apr 20 16:32:06.661585 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.661589 2577 scope.go:117] "RemoveContainer" containerID="6fb1bf56f12ae34cf0cb75b3bc8b32137a104dd814798815d8887a36fbfc8dfd" Apr 20 16:32:06.671214 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.671192 2577 scope.go:117] "RemoveContainer" containerID="6fb1bf56f12ae34cf0cb75b3bc8b32137a104dd814798815d8887a36fbfc8dfd" Apr 20 16:32:06.671463 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:32:06.671445 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6fb1bf56f12ae34cf0cb75b3bc8b32137a104dd814798815d8887a36fbfc8dfd\": container with ID starting with 6fb1bf56f12ae34cf0cb75b3bc8b32137a104dd814798815d8887a36fbfc8dfd not found: ID does not exist" containerID="6fb1bf56f12ae34cf0cb75b3bc8b32137a104dd814798815d8887a36fbfc8dfd" Apr 20 16:32:06.671500 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.671473 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fb1bf56f12ae34cf0cb75b3bc8b32137a104dd814798815d8887a36fbfc8dfd"} err="failed to get container status \"6fb1bf56f12ae34cf0cb75b3bc8b32137a104dd814798815d8887a36fbfc8dfd\": rpc error: code = NotFound desc = could not find container \"6fb1bf56f12ae34cf0cb75b3bc8b32137a104dd814798815d8887a36fbfc8dfd\": container with ID starting with 6fb1bf56f12ae34cf0cb75b3bc8b32137a104dd814798815d8887a36fbfc8dfd not found: ID does not exist" Apr 20 16:32:06.685511 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.685483 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7"] Apr 20 16:32:06.690695 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.690649 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jpcl7"] Apr 20 16:32:06.732435 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:06.732407 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gmj9k"] Apr 20 16:32:06.735868 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:32:06.735843 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcea376d4_3d3c_47b7_bfd4_385ea615ab95.slice/crio-ec1336edd6d419a500f78554219207e19101f1fcc713ae4351334015ff024380 WatchSource:0}: Error finding container ec1336edd6d419a500f78554219207e19101f1fcc713ae4351334015ff024380: Status 404 returned error can't 
find the container with id ec1336edd6d419a500f78554219207e19101f1fcc713ae4351334015ff024380 Apr 20 16:32:07.666447 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:07.666408 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gmj9k" event={"ID":"cea376d4-3d3c-47b7-bfd4-385ea615ab95","Type":"ContainerStarted","Data":"06fc32464f0bed8109f587300990c836d76ed7925418053bfde5f99c6ce66b73"} Apr 20 16:32:07.666447 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:07.666449 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gmj9k" event={"ID":"cea376d4-3d3c-47b7-bfd4-385ea615ab95","Type":"ContainerStarted","Data":"ec1336edd6d419a500f78554219207e19101f1fcc713ae4351334015ff024380"} Apr 20 16:32:07.666980 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:07.666548 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gmj9k" Apr 20 16:32:07.689256 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:07.689200 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gmj9k" podStartSLOduration=1.6891848120000001 podStartE2EDuration="1.689184812s" podCreationTimestamp="2026-04-20 16:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:32:07.686623457 +0000 UTC m=+532.266193483" watchObservedRunningTime="2026-04-20 16:32:07.689184812 +0000 UTC m=+532.268754830" Apr 20 16:32:08.007748 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:08.007654 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3475ccff-df0c-40d0-ab56-95250651fe12" path="/var/lib/kubelet/pods/3475ccff-df0c-40d0-ab56-95250651fe12/volumes" Apr 20 16:32:18.673401 ip-10-0-135-200 
kubenswrapper[2577]: I0420 16:32:18.673365 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-gmj9k" Apr 20 16:32:37.495566 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:37.495536 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9sjqw"] Apr 20 16:32:37.503541 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:37.503515 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9sjqw"] Apr 20 16:32:37.503720 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:37.503630 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-9sjqw" Apr 20 16:32:37.506276 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:37.506253 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-cbfmp\"" Apr 20 16:32:37.586832 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:37.586797 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w246m\" (UniqueName: \"kubernetes.io/projected/3ab81317-7b51-4597-9ab5-919d25ee2a84-kube-api-access-w246m\") pod \"authorino-f99f4b5cd-9sjqw\" (UID: \"3ab81317-7b51-4597-9ab5-919d25ee2a84\") " pod="kuadrant-system/authorino-f99f4b5cd-9sjqw" Apr 20 16:32:37.606550 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:37.606526 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-rxjlx"] Apr 20 16:32:37.609897 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:37.609882 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-rxjlx" Apr 20 16:32:37.614847 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:37.614824 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-rxjlx"] Apr 20 16:32:37.688076 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:37.688050 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w246m\" (UniqueName: \"kubernetes.io/projected/3ab81317-7b51-4597-9ab5-919d25ee2a84-kube-api-access-w246m\") pod \"authorino-f99f4b5cd-9sjqw\" (UID: \"3ab81317-7b51-4597-9ab5-919d25ee2a84\") " pod="kuadrant-system/authorino-f99f4b5cd-9sjqw" Apr 20 16:32:37.688212 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:37.688114 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvb8r\" (UniqueName: \"kubernetes.io/projected/bcc4363e-9e23-4de4-ae6d-1a4c46128199-kube-api-access-qvb8r\") pod \"authorino-7498df8756-rxjlx\" (UID: \"bcc4363e-9e23-4de4-ae6d-1a4c46128199\") " pod="kuadrant-system/authorino-7498df8756-rxjlx" Apr 20 16:32:37.696055 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:37.696027 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w246m\" (UniqueName: \"kubernetes.io/projected/3ab81317-7b51-4597-9ab5-919d25ee2a84-kube-api-access-w246m\") pod \"authorino-f99f4b5cd-9sjqw\" (UID: \"3ab81317-7b51-4597-9ab5-919d25ee2a84\") " pod="kuadrant-system/authorino-f99f4b5cd-9sjqw" Apr 20 16:32:37.788882 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:37.788803 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvb8r\" (UniqueName: \"kubernetes.io/projected/bcc4363e-9e23-4de4-ae6d-1a4c46128199-kube-api-access-qvb8r\") pod \"authorino-7498df8756-rxjlx\" (UID: \"bcc4363e-9e23-4de4-ae6d-1a4c46128199\") " pod="kuadrant-system/authorino-7498df8756-rxjlx" Apr 20 16:32:37.797799 
ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:37.797766 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvb8r\" (UniqueName: \"kubernetes.io/projected/bcc4363e-9e23-4de4-ae6d-1a4c46128199-kube-api-access-qvb8r\") pod \"authorino-7498df8756-rxjlx\" (UID: \"bcc4363e-9e23-4de4-ae6d-1a4c46128199\") " pod="kuadrant-system/authorino-7498df8756-rxjlx" Apr 20 16:32:37.815440 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:37.815416 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-9sjqw" Apr 20 16:32:37.920029 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:37.919998 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-rxjlx" Apr 20 16:32:37.935941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:37.935913 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9sjqw"] Apr 20 16:32:37.938957 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:32:37.938929 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ab81317_7b51_4597_9ab5_919d25ee2a84.slice/crio-08e78f66e225265d0c8f30348117520d0e30a3f6a95205ca903cb9048addef28 WatchSource:0}: Error finding container 08e78f66e225265d0c8f30348117520d0e30a3f6a95205ca903cb9048addef28: Status 404 returned error can't find the container with id 08e78f66e225265d0c8f30348117520d0e30a3f6a95205ca903cb9048addef28 Apr 20 16:32:38.041563 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:38.041538 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-rxjlx"] Apr 20 16:32:38.043750 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:32:38.043721 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcc4363e_9e23_4de4_ae6d_1a4c46128199.slice/crio-e055152b8672202b1f92c3ea1f8cb85f5df0f65b877c37c0de1d4158a8df398e WatchSource:0}: Error finding container e055152b8672202b1f92c3ea1f8cb85f5df0f65b877c37c0de1d4158a8df398e: Status 404 returned error can't find the container with id e055152b8672202b1f92c3ea1f8cb85f5df0f65b877c37c0de1d4158a8df398e Apr 20 16:32:38.778885 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:38.778835 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-rxjlx" event={"ID":"bcc4363e-9e23-4de4-ae6d-1a4c46128199","Type":"ContainerStarted","Data":"e055152b8672202b1f92c3ea1f8cb85f5df0f65b877c37c0de1d4158a8df398e"} Apr 20 16:32:38.780520 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:38.780490 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-9sjqw" event={"ID":"3ab81317-7b51-4597-9ab5-919d25ee2a84","Type":"ContainerStarted","Data":"08e78f66e225265d0c8f30348117520d0e30a3f6a95205ca903cb9048addef28"} Apr 20 16:32:40.792143 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:40.792086 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-9sjqw" event={"ID":"3ab81317-7b51-4597-9ab5-919d25ee2a84","Type":"ContainerStarted","Data":"d9e5feb7fdf83f11fcabe72d97e83d50c9205c65afab27fcee8b1d7270cbaaaa"} Apr 20 16:32:40.793935 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:40.793899 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-rxjlx" event={"ID":"bcc4363e-9e23-4de4-ae6d-1a4c46128199","Type":"ContainerStarted","Data":"be9c6c450a878060743cabb018b97a83330eb84754f7ce63cadca7962dcf5078"} Apr 20 16:32:40.807885 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:40.807841 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-9sjqw" podStartSLOduration=1.123262172 
podStartE2EDuration="3.807828834s" podCreationTimestamp="2026-04-20 16:32:37 +0000 UTC" firstStartedPulling="2026-04-20 16:32:37.940459852 +0000 UTC m=+562.520029846" lastFinishedPulling="2026-04-20 16:32:40.625026514 +0000 UTC m=+565.204596508" observedRunningTime="2026-04-20 16:32:40.806403793 +0000 UTC m=+565.385973844" watchObservedRunningTime="2026-04-20 16:32:40.807828834 +0000 UTC m=+565.387398849" Apr 20 16:32:40.820350 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:40.820310 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-rxjlx" podStartSLOduration=1.22963243 podStartE2EDuration="3.820298845s" podCreationTimestamp="2026-04-20 16:32:37 +0000 UTC" firstStartedPulling="2026-04-20 16:32:38.045131888 +0000 UTC m=+562.624701881" lastFinishedPulling="2026-04-20 16:32:40.635798301 +0000 UTC m=+565.215368296" observedRunningTime="2026-04-20 16:32:40.819396177 +0000 UTC m=+565.398966204" watchObservedRunningTime="2026-04-20 16:32:40.820298845 +0000 UTC m=+565.399868860" Apr 20 16:32:40.840063 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:40.840035 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9sjqw"] Apr 20 16:32:42.800540 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:42.800503 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-9sjqw" podUID="3ab81317-7b51-4597-9ab5-919d25ee2a84" containerName="authorino" containerID="cri-o://d9e5feb7fdf83f11fcabe72d97e83d50c9205c65afab27fcee8b1d7270cbaaaa" gracePeriod=30 Apr 20 16:32:43.041791 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:43.041771 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-9sjqw" Apr 20 16:32:43.138643 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:43.138608 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w246m\" (UniqueName: \"kubernetes.io/projected/3ab81317-7b51-4597-9ab5-919d25ee2a84-kube-api-access-w246m\") pod \"3ab81317-7b51-4597-9ab5-919d25ee2a84\" (UID: \"3ab81317-7b51-4597-9ab5-919d25ee2a84\") " Apr 20 16:32:43.140777 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:43.140714 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab81317-7b51-4597-9ab5-919d25ee2a84-kube-api-access-w246m" (OuterVolumeSpecName: "kube-api-access-w246m") pod "3ab81317-7b51-4597-9ab5-919d25ee2a84" (UID: "3ab81317-7b51-4597-9ab5-919d25ee2a84"). InnerVolumeSpecName "kube-api-access-w246m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:32:43.240239 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:43.240211 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w246m\" (UniqueName: \"kubernetes.io/projected/3ab81317-7b51-4597-9ab5-919d25ee2a84-kube-api-access-w246m\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\"" Apr 20 16:32:43.804988 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:43.804956 2577 generic.go:358] "Generic (PLEG): container finished" podID="3ab81317-7b51-4597-9ab5-919d25ee2a84" containerID="d9e5feb7fdf83f11fcabe72d97e83d50c9205c65afab27fcee8b1d7270cbaaaa" exitCode=0 Apr 20 16:32:43.805374 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:43.805007 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-9sjqw" Apr 20 16:32:43.805374 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:43.805035 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-9sjqw" event={"ID":"3ab81317-7b51-4597-9ab5-919d25ee2a84","Type":"ContainerDied","Data":"d9e5feb7fdf83f11fcabe72d97e83d50c9205c65afab27fcee8b1d7270cbaaaa"} Apr 20 16:32:43.805374 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:43.805070 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-9sjqw" event={"ID":"3ab81317-7b51-4597-9ab5-919d25ee2a84","Type":"ContainerDied","Data":"08e78f66e225265d0c8f30348117520d0e30a3f6a95205ca903cb9048addef28"} Apr 20 16:32:43.805374 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:43.805084 2577 scope.go:117] "RemoveContainer" containerID="d9e5feb7fdf83f11fcabe72d97e83d50c9205c65afab27fcee8b1d7270cbaaaa" Apr 20 16:32:43.813872 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:43.813856 2577 scope.go:117] "RemoveContainer" containerID="d9e5feb7fdf83f11fcabe72d97e83d50c9205c65afab27fcee8b1d7270cbaaaa" Apr 20 16:32:43.814136 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:32:43.814117 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9e5feb7fdf83f11fcabe72d97e83d50c9205c65afab27fcee8b1d7270cbaaaa\": container with ID starting with d9e5feb7fdf83f11fcabe72d97e83d50c9205c65afab27fcee8b1d7270cbaaaa not found: ID does not exist" containerID="d9e5feb7fdf83f11fcabe72d97e83d50c9205c65afab27fcee8b1d7270cbaaaa" Apr 20 16:32:43.814210 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:43.814155 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e5feb7fdf83f11fcabe72d97e83d50c9205c65afab27fcee8b1d7270cbaaaa"} err="failed to get container status \"d9e5feb7fdf83f11fcabe72d97e83d50c9205c65afab27fcee8b1d7270cbaaaa\": rpc error: code = 
NotFound desc = could not find container \"d9e5feb7fdf83f11fcabe72d97e83d50c9205c65afab27fcee8b1d7270cbaaaa\": container with ID starting with d9e5feb7fdf83f11fcabe72d97e83d50c9205c65afab27fcee8b1d7270cbaaaa not found: ID does not exist" Apr 20 16:32:43.825741 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:43.825714 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9sjqw"] Apr 20 16:32:43.829322 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:43.829300 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-9sjqw"] Apr 20 16:32:44.008201 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:32:44.008168 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab81317-7b51-4597-9ab5-919d25ee2a84" path="/var/lib/kubelet/pods/3ab81317-7b51-4597-9ab5-919d25ee2a84/volumes" Apr 20 16:33:15.912798 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:15.912764 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xv6j6_1ddd4906-e010-4e9e-89d5-6017138ff6a9/console-operator/1.log" Apr 20 16:33:15.913302 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:15.912841 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xv6j6_1ddd4906-e010-4e9e-89d5-6017138ff6a9/console-operator/1.log" Apr 20 16:33:15.915168 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:15.915145 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/ovn-acl-logging/0.log" Apr 20 16:33:15.915294 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:15.915204 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/ovn-acl-logging/0.log" Apr 20 16:33:33.793947 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:33.793755 2577 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 20 16:33:33.795021 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:33.794987 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ab81317-7b51-4597-9ab5-919d25ee2a84" containerName="authorino"
Apr 20 16:33:33.795166 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:33.795154 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ab81317-7b51-4597-9ab5-919d25ee2a84" containerName="authorino"
Apr 20 16:33:33.795380 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:33.795353 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ab81317-7b51-4597-9ab5-919d25ee2a84" containerName="authorino"
Apr 20 16:33:33.802730 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:33.802702 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 20 16:33:33.803023 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:33.802997 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 20 16:33:33.806622 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:33.806598 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-jh8rh\""
Apr 20 16:33:33.806757 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:33.806598 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\""
Apr 20 16:33:33.806757 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:33.806700 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\""
Apr 20 16:33:33.806877 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:33.806708 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\""
Apr 20 16:33:33.869166 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:33.869127 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zt6s\" (UniqueName: \"kubernetes.io/projected/23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769-kube-api-access-2zt6s\") pod \"maas-keycloak-0\" (UID: \"23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769\") " pod="keycloak-system/maas-keycloak-0"
Apr 20 16:33:33.970477 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:33.970448 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zt6s\" (UniqueName: \"kubernetes.io/projected/23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769-kube-api-access-2zt6s\") pod \"maas-keycloak-0\" (UID: \"23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769\") " pod="keycloak-system/maas-keycloak-0"
Apr 20 16:33:33.979362 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:33.979335 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zt6s\" (UniqueName: \"kubernetes.io/projected/23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769-kube-api-access-2zt6s\") pod \"maas-keycloak-0\" (UID: \"23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769\") " pod="keycloak-system/maas-keycloak-0"
Apr 20 16:33:34.114786 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:34.114716 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 20 16:33:34.236966 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:34.236939 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 20 16:33:34.239415 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:33:34.239384 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23fb3cd6_e2b7_4ac1_8d8f_063eafd6c769.slice/crio-a3c652c6cb2af844d4c649f7fbcca4f8de8820b6157acd77d89434891b710ad0 WatchSource:0}: Error finding container a3c652c6cb2af844d4c649f7fbcca4f8de8820b6157acd77d89434891b710ad0: Status 404 returned error can't find the container with id a3c652c6cb2af844d4c649f7fbcca4f8de8820b6157acd77d89434891b710ad0
Apr 20 16:33:34.989105 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:34.989065 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769","Type":"ContainerStarted","Data":"a3c652c6cb2af844d4c649f7fbcca4f8de8820b6157acd77d89434891b710ad0"}
Apr 20 16:33:40.011464 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:40.011429 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769","Type":"ContainerStarted","Data":"8c2cbe7447f19eebf7ff0cca55e01e3699e39280e253b6a6ee3ad907c0709485"}
Apr 20 16:33:40.029502 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:40.029426 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=1.435282495 podStartE2EDuration="7.029402431s" podCreationTimestamp="2026-04-20 16:33:33 +0000 UTC" firstStartedPulling="2026-04-20 16:33:34.240952444 +0000 UTC m=+618.820522450" lastFinishedPulling="2026-04-20 16:33:39.835072376 +0000 UTC m=+624.414642386" observedRunningTime="2026-04-20 16:33:40.027797471 +0000 UTC m=+624.607367499" watchObservedRunningTime="2026-04-20 16:33:40.029402431 +0000 UTC m=+624.608972447"
Apr 20 16:33:40.115023 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:40.114907 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0"
Apr 20 16:33:40.116978 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:40.116917 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 16:33:41.115546 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:41.115502 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 16:33:42.116056 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:42.116009 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 16:33:43.115827 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:43.115783 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 16:33:44.115044 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:44.114990 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0"
Apr 20 16:33:44.115653 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:44.115408 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 16:33:45.115433 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:45.115391 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 16:33:46.115704 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:46.115634 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 16:33:47.115439 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:47.115379 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 16:33:48.116031 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:48.115985 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 16:33:49.115432 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:49.115383 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 16:33:50.115554 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:50.115507 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 16:33:51.116222 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:51.116164 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 16:33:52.115753 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:52.115710 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.37:9000/health/started\": dial tcp 10.133.0.37:9000: connect: connection refused"
Apr 20 16:33:53.227484 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:53.227451 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0"
Apr 20 16:33:53.248233 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:33:53.247964 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 16:34:03.234182 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:03.234153 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0"
Apr 20 16:34:04.180073 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.180042 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-46lqc"]
Apr 20 16:34:04.190839 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.190809 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-46lqc"
Apr 20 16:34:04.191740 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.191716 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-46lqc"]
Apr 20 16:34:04.251976 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.251940 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndvzs\" (UniqueName: \"kubernetes.io/projected/c18f9d64-ae0b-410c-818a-d14d887eaa5b-kube-api-access-ndvzs\") pod \"authorino-8b475cf9f-46lqc\" (UID: \"c18f9d64-ae0b-410c-818a-d14d887eaa5b\") " pod="kuadrant-system/authorino-8b475cf9f-46lqc"
Apr 20 16:34:04.352809 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.352774 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndvzs\" (UniqueName: \"kubernetes.io/projected/c18f9d64-ae0b-410c-818a-d14d887eaa5b-kube-api-access-ndvzs\") pod \"authorino-8b475cf9f-46lqc\" (UID: \"c18f9d64-ae0b-410c-818a-d14d887eaa5b\") " pod="kuadrant-system/authorino-8b475cf9f-46lqc"
Apr 20 16:34:04.360503 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.360480 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndvzs\" (UniqueName: \"kubernetes.io/projected/c18f9d64-ae0b-410c-818a-d14d887eaa5b-kube-api-access-ndvzs\") pod \"authorino-8b475cf9f-46lqc\" (UID: \"c18f9d64-ae0b-410c-818a-d14d887eaa5b\") " pod="kuadrant-system/authorino-8b475cf9f-46lqc"
Apr 20 16:34:04.424993 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.424953 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-46lqc"]
Apr 20 16:34:04.425229 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.425216 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-46lqc"
Apr 20 16:34:04.448943 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.448910 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-848d5f9444-g7b72"]
Apr 20 16:34:04.454031 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.454009 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-848d5f9444-g7b72"
Apr 20 16:34:04.458159 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.458132 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-848d5f9444-g7b72"]
Apr 20 16:34:04.537000 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.536971 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-848d5f9444-g7b72"]
Apr 20 16:34:04.537264 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:34:04.537244 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-mh2j4], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-848d5f9444-g7b72" podUID="0e2b8a74-6e9c-4dc4-9377-9426f81e0820"
Apr 20 16:34:04.554668 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.554556 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh2j4\" (UniqueName: \"kubernetes.io/projected/0e2b8a74-6e9c-4dc4-9377-9426f81e0820-kube-api-access-mh2j4\") pod \"authorino-848d5f9444-g7b72\" (UID: \"0e2b8a74-6e9c-4dc4-9377-9426f81e0820\") " pod="kuadrant-system/authorino-848d5f9444-g7b72"
Apr 20 16:34:04.556460 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.556439 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-46lqc"]
Apr 20 16:34:04.559213 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:34:04.559188 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc18f9d64_ae0b_410c_818a_d14d887eaa5b.slice/crio-4497825d26fcb722acf0749abadf2075125c22b9011f1d356c51414405d50cfd WatchSource:0}: Error finding container 4497825d26fcb722acf0749abadf2075125c22b9011f1d356c51414405d50cfd: Status 404 returned error can't find the container with id 4497825d26fcb722acf0749abadf2075125c22b9011f1d356c51414405d50cfd
Apr 20 16:34:04.563007 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.562981 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-54dd9fb6dd-6msg5"]
Apr 20 16:34:04.566666 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.566631 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-54dd9fb6dd-6msg5"
Apr 20 16:34:04.569258 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.569239 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 20 16:34:04.574836 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.574809 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-54dd9fb6dd-6msg5"]
Apr 20 16:34:04.655217 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.655182 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mh2j4\" (UniqueName: \"kubernetes.io/projected/0e2b8a74-6e9c-4dc4-9377-9426f81e0820-kube-api-access-mh2j4\") pod \"authorino-848d5f9444-g7b72\" (UID: \"0e2b8a74-6e9c-4dc4-9377-9426f81e0820\") " pod="kuadrant-system/authorino-848d5f9444-g7b72"
Apr 20 16:34:04.655413 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.655228 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/221ebcdf-13db-4c93-b77a-b55c7a5ccb8a-tls-cert\") pod \"authorino-54dd9fb6dd-6msg5\" (UID: \"221ebcdf-13db-4c93-b77a-b55c7a5ccb8a\") " pod="kuadrant-system/authorino-54dd9fb6dd-6msg5"
Apr 20 16:34:04.655413 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.655270 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr6fh\" (UniqueName: \"kubernetes.io/projected/221ebcdf-13db-4c93-b77a-b55c7a5ccb8a-kube-api-access-gr6fh\") pod \"authorino-54dd9fb6dd-6msg5\" (UID: \"221ebcdf-13db-4c93-b77a-b55c7a5ccb8a\") " pod="kuadrant-system/authorino-54dd9fb6dd-6msg5"
Apr 20 16:34:04.663897 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.663869 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh2j4\" (UniqueName: \"kubernetes.io/projected/0e2b8a74-6e9c-4dc4-9377-9426f81e0820-kube-api-access-mh2j4\") pod \"authorino-848d5f9444-g7b72\" (UID: \"0e2b8a74-6e9c-4dc4-9377-9426f81e0820\") " pod="kuadrant-system/authorino-848d5f9444-g7b72"
Apr 20 16:34:04.755775 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.755689 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/221ebcdf-13db-4c93-b77a-b55c7a5ccb8a-tls-cert\") pod \"authorino-54dd9fb6dd-6msg5\" (UID: \"221ebcdf-13db-4c93-b77a-b55c7a5ccb8a\") " pod="kuadrant-system/authorino-54dd9fb6dd-6msg5"
Apr 20 16:34:04.755775 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.755761 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gr6fh\" (UniqueName: \"kubernetes.io/projected/221ebcdf-13db-4c93-b77a-b55c7a5ccb8a-kube-api-access-gr6fh\") pod \"authorino-54dd9fb6dd-6msg5\" (UID: \"221ebcdf-13db-4c93-b77a-b55c7a5ccb8a\") " pod="kuadrant-system/authorino-54dd9fb6dd-6msg5"
Apr 20 16:34:04.758250 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.758225 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/221ebcdf-13db-4c93-b77a-b55c7a5ccb8a-tls-cert\") pod \"authorino-54dd9fb6dd-6msg5\" (UID: \"221ebcdf-13db-4c93-b77a-b55c7a5ccb8a\") " pod="kuadrant-system/authorino-54dd9fb6dd-6msg5"
Apr 20 16:34:04.763847 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.763820 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr6fh\" (UniqueName: \"kubernetes.io/projected/221ebcdf-13db-4c93-b77a-b55c7a5ccb8a-kube-api-access-gr6fh\") pod \"authorino-54dd9fb6dd-6msg5\" (UID: \"221ebcdf-13db-4c93-b77a-b55c7a5ccb8a\") " pod="kuadrant-system/authorino-54dd9fb6dd-6msg5"
Apr 20 16:34:04.877150 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:04.877110 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-54dd9fb6dd-6msg5"
Apr 20 16:34:05.002306 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:05.002277 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-54dd9fb6dd-6msg5"]
Apr 20 16:34:05.015883 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:34:05.015855 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod221ebcdf_13db_4c93_b77a_b55c7a5ccb8a.slice/crio-d4074b6cfdc9c4deba063a1f440a5a3f09d4245f959230fcf4896a7acbb84634 WatchSource:0}: Error finding container d4074b6cfdc9c4deba063a1f440a5a3f09d4245f959230fcf4896a7acbb84634: Status 404 returned error can't find the container with id d4074b6cfdc9c4deba063a1f440a5a3f09d4245f959230fcf4896a7acbb84634
Apr 20 16:34:05.138340 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:05.138298 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-46lqc" event={"ID":"c18f9d64-ae0b-410c-818a-d14d887eaa5b","Type":"ContainerStarted","Data":"7a11b4b72633cc859ed4c8b9ae78eaa29cdbbf9aebf9f00c2bca07631482ed7c"}
Apr 20 16:34:05.138340 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:05.138347 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-46lqc" event={"ID":"c18f9d64-ae0b-410c-818a-d14d887eaa5b","Type":"ContainerStarted","Data":"4497825d26fcb722acf0749abadf2075125c22b9011f1d356c51414405d50cfd"}
Apr 20 16:34:05.138611 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:05.138364 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-46lqc" podUID="c18f9d64-ae0b-410c-818a-d14d887eaa5b" containerName="authorino" containerID="cri-o://7a11b4b72633cc859ed4c8b9ae78eaa29cdbbf9aebf9f00c2bca07631482ed7c" gracePeriod=30
Apr 20 16:34:05.139636 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:05.139611 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-54dd9fb6dd-6msg5" event={"ID":"221ebcdf-13db-4c93-b77a-b55c7a5ccb8a","Type":"ContainerStarted","Data":"d4074b6cfdc9c4deba063a1f440a5a3f09d4245f959230fcf4896a7acbb84634"}
Apr 20 16:34:05.139765 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:05.139653 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-848d5f9444-g7b72"
Apr 20 16:34:05.145060 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:05.145042 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-848d5f9444-g7b72"
Apr 20 16:34:05.169228 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:05.169184 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-46lqc" podStartSLOduration=0.748123769 podStartE2EDuration="1.169169703s" podCreationTimestamp="2026-04-20 16:34:04 +0000 UTC" firstStartedPulling="2026-04-20 16:34:04.560658236 +0000 UTC m=+649.140228243" lastFinishedPulling="2026-04-20 16:34:04.981704173 +0000 UTC m=+649.561274177" observedRunningTime="2026-04-20 16:34:05.168132536 +0000 UTC m=+649.747702573" watchObservedRunningTime="2026-04-20 16:34:05.169169703 +0000 UTC m=+649.748739717"
Apr 20 16:34:05.260361 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:05.260334 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh2j4\" (UniqueName: \"kubernetes.io/projected/0e2b8a74-6e9c-4dc4-9377-9426f81e0820-kube-api-access-mh2j4\") pod \"0e2b8a74-6e9c-4dc4-9377-9426f81e0820\" (UID: \"0e2b8a74-6e9c-4dc4-9377-9426f81e0820\") "
Apr 20 16:34:05.262482 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:05.262456 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e2b8a74-6e9c-4dc4-9377-9426f81e0820-kube-api-access-mh2j4" (OuterVolumeSpecName: "kube-api-access-mh2j4") pod "0e2b8a74-6e9c-4dc4-9377-9426f81e0820" (UID: "0e2b8a74-6e9c-4dc4-9377-9426f81e0820"). InnerVolumeSpecName "kube-api-access-mh2j4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:34:05.361863 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:05.361832 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mh2j4\" (UniqueName: \"kubernetes.io/projected/0e2b8a74-6e9c-4dc4-9377-9426f81e0820-kube-api-access-mh2j4\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:34:05.453488 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:05.453463 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-46lqc"
Apr 20 16:34:05.563229 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:05.563146 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndvzs\" (UniqueName: \"kubernetes.io/projected/c18f9d64-ae0b-410c-818a-d14d887eaa5b-kube-api-access-ndvzs\") pod \"c18f9d64-ae0b-410c-818a-d14d887eaa5b\" (UID: \"c18f9d64-ae0b-410c-818a-d14d887eaa5b\") "
Apr 20 16:34:05.565317 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:05.565296 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c18f9d64-ae0b-410c-818a-d14d887eaa5b-kube-api-access-ndvzs" (OuterVolumeSpecName: "kube-api-access-ndvzs") pod "c18f9d64-ae0b-410c-818a-d14d887eaa5b" (UID: "c18f9d64-ae0b-410c-818a-d14d887eaa5b"). InnerVolumeSpecName "kube-api-access-ndvzs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:34:05.664001 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:05.663975 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ndvzs\" (UniqueName: \"kubernetes.io/projected/c18f9d64-ae0b-410c-818a-d14d887eaa5b-kube-api-access-ndvzs\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:34:06.144407 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.144370 2577 generic.go:358] "Generic (PLEG): container finished" podID="c18f9d64-ae0b-410c-818a-d14d887eaa5b" containerID="7a11b4b72633cc859ed4c8b9ae78eaa29cdbbf9aebf9f00c2bca07631482ed7c" exitCode=0
Apr 20 16:34:06.144595 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.144424 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-46lqc"
Apr 20 16:34:06.144595 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.144430 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-46lqc" event={"ID":"c18f9d64-ae0b-410c-818a-d14d887eaa5b","Type":"ContainerDied","Data":"7a11b4b72633cc859ed4c8b9ae78eaa29cdbbf9aebf9f00c2bca07631482ed7c"}
Apr 20 16:34:06.144595 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.144474 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-46lqc" event={"ID":"c18f9d64-ae0b-410c-818a-d14d887eaa5b","Type":"ContainerDied","Data":"4497825d26fcb722acf0749abadf2075125c22b9011f1d356c51414405d50cfd"}
Apr 20 16:34:06.144595 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.144496 2577 scope.go:117] "RemoveContainer" containerID="7a11b4b72633cc859ed4c8b9ae78eaa29cdbbf9aebf9f00c2bca07631482ed7c"
Apr 20 16:34:06.145897 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.145872 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-54dd9fb6dd-6msg5" event={"ID":"221ebcdf-13db-4c93-b77a-b55c7a5ccb8a","Type":"ContainerStarted","Data":"d46d50a7b48473cb04c679a5fea1137d8b58284fedafc94271fe6063b212934d"}
Apr 20 16:34:06.146011 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.145882 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-848d5f9444-g7b72"
Apr 20 16:34:06.153280 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.153256 2577 scope.go:117] "RemoveContainer" containerID="7a11b4b72633cc859ed4c8b9ae78eaa29cdbbf9aebf9f00c2bca07631482ed7c"
Apr 20 16:34:06.153525 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:34:06.153508 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a11b4b72633cc859ed4c8b9ae78eaa29cdbbf9aebf9f00c2bca07631482ed7c\": container with ID starting with 7a11b4b72633cc859ed4c8b9ae78eaa29cdbbf9aebf9f00c2bca07631482ed7c not found: ID does not exist" containerID="7a11b4b72633cc859ed4c8b9ae78eaa29cdbbf9aebf9f00c2bca07631482ed7c"
Apr 20 16:34:06.153589 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.153533 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a11b4b72633cc859ed4c8b9ae78eaa29cdbbf9aebf9f00c2bca07631482ed7c"} err="failed to get container status \"7a11b4b72633cc859ed4c8b9ae78eaa29cdbbf9aebf9f00c2bca07631482ed7c\": rpc error: code = NotFound desc = could not find container \"7a11b4b72633cc859ed4c8b9ae78eaa29cdbbf9aebf9f00c2bca07631482ed7c\": container with ID starting with 7a11b4b72633cc859ed4c8b9ae78eaa29cdbbf9aebf9f00c2bca07631482ed7c not found: ID does not exist"
Apr 20 16:34:06.163585 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.163545 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-54dd9fb6dd-6msg5" podStartSLOduration=1.816589555 podStartE2EDuration="2.163534629s" podCreationTimestamp="2026-04-20 16:34:04 +0000 UTC" firstStartedPulling="2026-04-20 16:34:05.017253216 +0000 UTC m=+649.596823208" lastFinishedPulling="2026-04-20 16:34:05.364198288 +0000 UTC m=+649.943768282" observedRunningTime="2026-04-20 16:34:06.162557215 +0000 UTC m=+650.742127230" watchObservedRunningTime="2026-04-20 16:34:06.163534629 +0000 UTC m=+650.743104644"
Apr 20 16:34:06.196994 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.196964 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-rxjlx"]
Apr 20 16:34:06.197164 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.197145 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-rxjlx" podUID="bcc4363e-9e23-4de4-ae6d-1a4c46128199" containerName="authorino" containerID="cri-o://be9c6c450a878060743cabb018b97a83330eb84754f7ce63cadca7962dcf5078" gracePeriod=30
Apr 20 16:34:06.202219 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.202190 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-848d5f9444-g7b72"]
Apr 20 16:34:06.209429 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.209402 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-848d5f9444-g7b72"]
Apr 20 16:34:06.220943 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.220920 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-46lqc"]
Apr 20 16:34:06.227837 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.227813 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-46lqc"]
Apr 20 16:34:06.444788 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.444769 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-rxjlx"
Apr 20 16:34:06.574067 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.573969 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvb8r\" (UniqueName: \"kubernetes.io/projected/bcc4363e-9e23-4de4-ae6d-1a4c46128199-kube-api-access-qvb8r\") pod \"bcc4363e-9e23-4de4-ae6d-1a4c46128199\" (UID: \"bcc4363e-9e23-4de4-ae6d-1a4c46128199\") "
Apr 20 16:34:06.576160 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.576131 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc4363e-9e23-4de4-ae6d-1a4c46128199-kube-api-access-qvb8r" (OuterVolumeSpecName: "kube-api-access-qvb8r") pod "bcc4363e-9e23-4de4-ae6d-1a4c46128199" (UID: "bcc4363e-9e23-4de4-ae6d-1a4c46128199"). InnerVolumeSpecName "kube-api-access-qvb8r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:34:06.675482 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:06.675427 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qvb8r\" (UniqueName: \"kubernetes.io/projected/bcc4363e-9e23-4de4-ae6d-1a4c46128199-kube-api-access-qvb8r\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:34:07.149957 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.149927 2577 generic.go:358] "Generic (PLEG): container finished" podID="bcc4363e-9e23-4de4-ae6d-1a4c46128199" containerID="be9c6c450a878060743cabb018b97a83330eb84754f7ce63cadca7962dcf5078" exitCode=0
Apr 20 16:34:07.150122 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.149976 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-rxjlx"
Apr 20 16:34:07.150122 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.149988 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-rxjlx" event={"ID":"bcc4363e-9e23-4de4-ae6d-1a4c46128199","Type":"ContainerDied","Data":"be9c6c450a878060743cabb018b97a83330eb84754f7ce63cadca7962dcf5078"}
Apr 20 16:34:07.150122 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.150011 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-rxjlx" event={"ID":"bcc4363e-9e23-4de4-ae6d-1a4c46128199","Type":"ContainerDied","Data":"e055152b8672202b1f92c3ea1f8cb85f5df0f65b877c37c0de1d4158a8df398e"}
Apr 20 16:34:07.150122 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.150028 2577 scope.go:117] "RemoveContainer" containerID="be9c6c450a878060743cabb018b97a83330eb84754f7ce63cadca7962dcf5078"
Apr 20 16:34:07.158455 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.158438 2577 scope.go:117] "RemoveContainer" containerID="be9c6c450a878060743cabb018b97a83330eb84754f7ce63cadca7962dcf5078"
Apr 20 16:34:07.158728 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:34:07.158701 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be9c6c450a878060743cabb018b97a83330eb84754f7ce63cadca7962dcf5078\": container with ID starting with be9c6c450a878060743cabb018b97a83330eb84754f7ce63cadca7962dcf5078 not found: ID does not exist" containerID="be9c6c450a878060743cabb018b97a83330eb84754f7ce63cadca7962dcf5078"
Apr 20 16:34:07.158793 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.158733 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be9c6c450a878060743cabb018b97a83330eb84754f7ce63cadca7962dcf5078"} err="failed to get container status \"be9c6c450a878060743cabb018b97a83330eb84754f7ce63cadca7962dcf5078\": rpc error: code = NotFound desc = could not find container \"be9c6c450a878060743cabb018b97a83330eb84754f7ce63cadca7962dcf5078\": container with ID starting with be9c6c450a878060743cabb018b97a83330eb84754f7ce63cadca7962dcf5078 not found: ID does not exist"
Apr 20 16:34:07.182549 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.182522 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-rxjlx"]
Apr 20 16:34:07.187764 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.187740 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-rxjlx"]
Apr 20 16:34:07.202799 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.202774 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-plwcc"]
Apr 20 16:34:07.203307 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.203277 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bcc4363e-9e23-4de4-ae6d-1a4c46128199" containerName="authorino"
Apr 20 16:34:07.203307 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.203304 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc4363e-9e23-4de4-ae6d-1a4c46128199" containerName="authorino"
Apr 20 16:34:07.203437 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.203346 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c18f9d64-ae0b-410c-818a-d14d887eaa5b" containerName="authorino"
Apr 20 16:34:07.203437 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.203354 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18f9d64-ae0b-410c-818a-d14d887eaa5b" containerName="authorino"
Apr 20 16:34:07.203437 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.203428 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c18f9d64-ae0b-410c-818a-d14d887eaa5b" containerName="authorino"
Apr 20 16:34:07.203542 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.203440 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="bcc4363e-9e23-4de4-ae6d-1a4c46128199" containerName="authorino"
Apr 20 16:34:07.207550 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.207536 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-plwcc"
Apr 20 16:34:07.210228 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.210211 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-w2frx\""
Apr 20 16:34:07.224543 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.221608 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-plwcc"]
Apr 20 16:34:07.281369 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.281331 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p8ss\" (UniqueName: \"kubernetes.io/projected/166a2ba7-c013-4eda-8c86-59df80f62107-kube-api-access-6p8ss\") pod \"maas-controller-6d4c8f55f9-plwcc\" (UID: \"166a2ba7-c013-4eda-8c86-59df80f62107\") " pod="opendatahub/maas-controller-6d4c8f55f9-plwcc"
Apr 20 16:34:07.354716 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.354671 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-fdd784b45-k28mq"]
Apr 20 16:34:07.358237 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.358218 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="opendatahub/maas-controller-fdd784b45-k28mq" Apr 20 16:34:07.367130 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.367108 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-fdd784b45-k28mq"] Apr 20 16:34:07.382286 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.382260 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6p8ss\" (UniqueName: \"kubernetes.io/projected/166a2ba7-c013-4eda-8c86-59df80f62107-kube-api-access-6p8ss\") pod \"maas-controller-6d4c8f55f9-plwcc\" (UID: \"166a2ba7-c013-4eda-8c86-59df80f62107\") " pod="opendatahub/maas-controller-6d4c8f55f9-plwcc" Apr 20 16:34:07.390555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.390528 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p8ss\" (UniqueName: \"kubernetes.io/projected/166a2ba7-c013-4eda-8c86-59df80f62107-kube-api-access-6p8ss\") pod \"maas-controller-6d4c8f55f9-plwcc\" (UID: \"166a2ba7-c013-4eda-8c86-59df80f62107\") " pod="opendatahub/maas-controller-6d4c8f55f9-plwcc" Apr 20 16:34:07.482816 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.482751 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvpzk\" (UniqueName: \"kubernetes.io/projected/a5276993-be94-403b-b288-d558b61c7cde-kube-api-access-qvpzk\") pod \"maas-controller-fdd784b45-k28mq\" (UID: \"a5276993-be94-403b-b288-d558b61c7cde\") " pod="opendatahub/maas-controller-fdd784b45-k28mq" Apr 20 16:34:07.486020 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.485999 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-plwcc"] Apr 20 16:34:07.486273 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.486259 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-plwcc" Apr 20 16:34:07.515060 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.515033 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-55c94d5bc8-9tfzn"] Apr 20 16:34:07.519611 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.519592 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-55c94d5bc8-9tfzn" Apr 20 16:34:07.526030 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.525974 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-55c94d5bc8-9tfzn"] Apr 20 16:34:07.583259 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.583213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvpzk\" (UniqueName: \"kubernetes.io/projected/a5276993-be94-403b-b288-d558b61c7cde-kube-api-access-qvpzk\") pod \"maas-controller-fdd784b45-k28mq\" (UID: \"a5276993-be94-403b-b288-d558b61c7cde\") " pod="opendatahub/maas-controller-fdd784b45-k28mq" Apr 20 16:34:07.583435 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.583296 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnz5g\" (UniqueName: \"kubernetes.io/projected/18794aa7-8eb3-42f5-a265-547536b9c363-kube-api-access-nnz5g\") pod \"maas-controller-55c94d5bc8-9tfzn\" (UID: \"18794aa7-8eb3-42f5-a265-547536b9c363\") " pod="opendatahub/maas-controller-55c94d5bc8-9tfzn" Apr 20 16:34:07.591526 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.591500 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvpzk\" (UniqueName: \"kubernetes.io/projected/a5276993-be94-403b-b288-d558b61c7cde-kube-api-access-qvpzk\") pod \"maas-controller-fdd784b45-k28mq\" (UID: \"a5276993-be94-403b-b288-d558b61c7cde\") " pod="opendatahub/maas-controller-fdd784b45-k28mq" Apr 20 16:34:07.618296 
ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.618271 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-plwcc"] Apr 20 16:34:07.621318 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:34:07.621294 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod166a2ba7_c013_4eda_8c86_59df80f62107.slice/crio-31ea9566c7eb880222f60ab3d4badc2e974f247fd1d698576178f846b409cd1e WatchSource:0}: Error finding container 31ea9566c7eb880222f60ab3d4badc2e974f247fd1d698576178f846b409cd1e: Status 404 returned error can't find the container with id 31ea9566c7eb880222f60ab3d4badc2e974f247fd1d698576178f846b409cd1e Apr 20 16:34:07.669510 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.669482 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-fdd784b45-k28mq" Apr 20 16:34:07.684480 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.684453 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnz5g\" (UniqueName: \"kubernetes.io/projected/18794aa7-8eb3-42f5-a265-547536b9c363-kube-api-access-nnz5g\") pod \"maas-controller-55c94d5bc8-9tfzn\" (UID: \"18794aa7-8eb3-42f5-a265-547536b9c363\") " pod="opendatahub/maas-controller-55c94d5bc8-9tfzn" Apr 20 16:34:07.692528 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.692500 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnz5g\" (UniqueName: \"kubernetes.io/projected/18794aa7-8eb3-42f5-a265-547536b9c363-kube-api-access-nnz5g\") pod \"maas-controller-55c94d5bc8-9tfzn\" (UID: \"18794aa7-8eb3-42f5-a265-547536b9c363\") " pod="opendatahub/maas-controller-55c94d5bc8-9tfzn" Apr 20 16:34:07.789562 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.789538 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-fdd784b45-k28mq"] Apr 20 16:34:07.792248 
ip-10-0-135-200 kubenswrapper[2577]: W0420 16:34:07.792219 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5276993_be94_403b_b288_d558b61c7cde.slice/crio-6cde3a531b5f6c3a686af611be398a87ed299f95f1a1686cb4cec5f91fc47377 WatchSource:0}: Error finding container 6cde3a531b5f6c3a686af611be398a87ed299f95f1a1686cb4cec5f91fc47377: Status 404 returned error can't find the container with id 6cde3a531b5f6c3a686af611be398a87ed299f95f1a1686cb4cec5f91fc47377 Apr 20 16:34:07.831567 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.831526 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-55c94d5bc8-9tfzn" Apr 20 16:34:07.956508 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:07.956484 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-55c94d5bc8-9tfzn"] Apr 20 16:34:07.958612 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:34:07.958580 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18794aa7_8eb3_42f5_a265_547536b9c363.slice/crio-8be79bad731861cb47b3e0aa13255b125e7579092e6f2a34738988bd914b5e2c WatchSource:0}: Error finding container 8be79bad731861cb47b3e0aa13255b125e7579092e6f2a34738988bd914b5e2c: Status 404 returned error can't find the container with id 8be79bad731861cb47b3e0aa13255b125e7579092e6f2a34738988bd914b5e2c Apr 20 16:34:08.009303 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:08.009227 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e2b8a74-6e9c-4dc4-9377-9426f81e0820" path="/var/lib/kubelet/pods/0e2b8a74-6e9c-4dc4-9377-9426f81e0820/volumes" Apr 20 16:34:08.009467 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:08.009455 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcc4363e-9e23-4de4-ae6d-1a4c46128199" 
path="/var/lib/kubelet/pods/bcc4363e-9e23-4de4-ae6d-1a4c46128199/volumes" Apr 20 16:34:08.009759 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:08.009747 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c18f9d64-ae0b-410c-818a-d14d887eaa5b" path="/var/lib/kubelet/pods/c18f9d64-ae0b-410c-818a-d14d887eaa5b/volumes" Apr 20 16:34:08.156050 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:08.156011 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-fdd784b45-k28mq" event={"ID":"a5276993-be94-403b-b288-d558b61c7cde","Type":"ContainerStarted","Data":"6cde3a531b5f6c3a686af611be398a87ed299f95f1a1686cb4cec5f91fc47377"} Apr 20 16:34:08.157248 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:08.157216 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-plwcc" event={"ID":"166a2ba7-c013-4eda-8c86-59df80f62107","Type":"ContainerStarted","Data":"31ea9566c7eb880222f60ab3d4badc2e974f247fd1d698576178f846b409cd1e"} Apr 20 16:34:08.159218 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:08.159193 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-55c94d5bc8-9tfzn" event={"ID":"18794aa7-8eb3-42f5-a265-547536b9c363","Type":"ContainerStarted","Data":"8be79bad731861cb47b3e0aa13255b125e7579092e6f2a34738988bd914b5e2c"} Apr 20 16:34:11.174189 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:11.174147 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-55c94d5bc8-9tfzn" event={"ID":"18794aa7-8eb3-42f5-a265-547536b9c363","Type":"ContainerStarted","Data":"d4315f466d298a81f6481d5f97521dad1730381b685cef80d897b49c43dc0e52"} Apr 20 16:34:11.174656 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:11.174232 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-55c94d5bc8-9tfzn" Apr 20 16:34:11.175932 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:11.175893 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-fdd784b45-k28mq" event={"ID":"a5276993-be94-403b-b288-d558b61c7cde","Type":"ContainerStarted","Data":"9cca2812c6c31f3426efe3081c20873e2e61b72afd06a2aedececcb292a0bad8"} Apr 20 16:34:11.176312 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:11.176175 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-fdd784b45-k28mq" Apr 20 16:34:11.177541 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:11.177517 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-plwcc" event={"ID":"166a2ba7-c013-4eda-8c86-59df80f62107","Type":"ContainerStarted","Data":"86b2be61ceaa9a297fed2f210ebff886c8400675af24ac2f782e76a91c9e9631"} Apr 20 16:34:11.178895 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:11.178869 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-plwcc" Apr 20 16:34:11.178895 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:11.177593 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-plwcc" podUID="166a2ba7-c013-4eda-8c86-59df80f62107" containerName="manager" containerID="cri-o://86b2be61ceaa9a297fed2f210ebff886c8400675af24ac2f782e76a91c9e9631" gracePeriod=10 Apr 20 16:34:11.194616 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:11.194570 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-55c94d5bc8-9tfzn" podStartSLOduration=1.089153326 podStartE2EDuration="4.194556612s" podCreationTimestamp="2026-04-20 16:34:07 +0000 UTC" firstStartedPulling="2026-04-20 16:34:07.959974783 +0000 UTC m=+652.539544777" lastFinishedPulling="2026-04-20 16:34:11.065378051 +0000 UTC m=+655.644948063" observedRunningTime="2026-04-20 16:34:11.192191725 +0000 UTC m=+655.771761741" watchObservedRunningTime="2026-04-20 
16:34:11.194556612 +0000 UTC m=+655.774126627" Apr 20 16:34:11.207733 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:11.207665 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-plwcc" podStartSLOduration=0.775610834 podStartE2EDuration="4.20765306s" podCreationTimestamp="2026-04-20 16:34:07 +0000 UTC" firstStartedPulling="2026-04-20 16:34:07.622647734 +0000 UTC m=+652.202217727" lastFinishedPulling="2026-04-20 16:34:11.054689947 +0000 UTC m=+655.634259953" observedRunningTime="2026-04-20 16:34:11.206140728 +0000 UTC m=+655.785710742" watchObservedRunningTime="2026-04-20 16:34:11.20765306 +0000 UTC m=+655.787223074" Apr 20 16:34:11.220418 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:11.220381 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-fdd784b45-k28mq" podStartSLOduration=0.958960637 podStartE2EDuration="4.220370503s" podCreationTimestamp="2026-04-20 16:34:07 +0000 UTC" firstStartedPulling="2026-04-20 16:34:07.793600117 +0000 UTC m=+652.373170111" lastFinishedPulling="2026-04-20 16:34:11.055009979 +0000 UTC m=+655.634579977" observedRunningTime="2026-04-20 16:34:11.219374422 +0000 UTC m=+655.798944474" watchObservedRunningTime="2026-04-20 16:34:11.220370503 +0000 UTC m=+655.799940518" Apr 20 16:34:11.431453 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:11.431427 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-plwcc" Apr 20 16:34:11.522227 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:11.522195 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p8ss\" (UniqueName: \"kubernetes.io/projected/166a2ba7-c013-4eda-8c86-59df80f62107-kube-api-access-6p8ss\") pod \"166a2ba7-c013-4eda-8c86-59df80f62107\" (UID: \"166a2ba7-c013-4eda-8c86-59df80f62107\") " Apr 20 16:34:11.524428 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:11.524397 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/166a2ba7-c013-4eda-8c86-59df80f62107-kube-api-access-6p8ss" (OuterVolumeSpecName: "kube-api-access-6p8ss") pod "166a2ba7-c013-4eda-8c86-59df80f62107" (UID: "166a2ba7-c013-4eda-8c86-59df80f62107"). InnerVolumeSpecName "kube-api-access-6p8ss". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:34:11.623322 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:11.623244 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6p8ss\" (UniqueName: \"kubernetes.io/projected/166a2ba7-c013-4eda-8c86-59df80f62107-kube-api-access-6p8ss\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\"" Apr 20 16:34:12.182876 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:12.182824 2577 generic.go:358] "Generic (PLEG): container finished" podID="166a2ba7-c013-4eda-8c86-59df80f62107" containerID="86b2be61ceaa9a297fed2f210ebff886c8400675af24ac2f782e76a91c9e9631" exitCode=2 Apr 20 16:34:12.182876 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:12.182880 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-plwcc" Apr 20 16:34:12.183402 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:12.182888 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-plwcc" event={"ID":"166a2ba7-c013-4eda-8c86-59df80f62107","Type":"ContainerDied","Data":"86b2be61ceaa9a297fed2f210ebff886c8400675af24ac2f782e76a91c9e9631"} Apr 20 16:34:12.183402 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:12.182925 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-plwcc" event={"ID":"166a2ba7-c013-4eda-8c86-59df80f62107","Type":"ContainerDied","Data":"31ea9566c7eb880222f60ab3d4badc2e974f247fd1d698576178f846b409cd1e"} Apr 20 16:34:12.183402 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:12.182942 2577 scope.go:117] "RemoveContainer" containerID="86b2be61ceaa9a297fed2f210ebff886c8400675af24ac2f782e76a91c9e9631" Apr 20 16:34:12.191670 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:12.191653 2577 scope.go:117] "RemoveContainer" containerID="86b2be61ceaa9a297fed2f210ebff886c8400675af24ac2f782e76a91c9e9631" Apr 20 16:34:12.191938 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:34:12.191920 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86b2be61ceaa9a297fed2f210ebff886c8400675af24ac2f782e76a91c9e9631\": container with ID starting with 86b2be61ceaa9a297fed2f210ebff886c8400675af24ac2f782e76a91c9e9631 not found: ID does not exist" containerID="86b2be61ceaa9a297fed2f210ebff886c8400675af24ac2f782e76a91c9e9631" Apr 20 16:34:12.191991 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:12.191946 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b2be61ceaa9a297fed2f210ebff886c8400675af24ac2f782e76a91c9e9631"} err="failed to get container status \"86b2be61ceaa9a297fed2f210ebff886c8400675af24ac2f782e76a91c9e9631\": rpc error: 
code = NotFound desc = could not find container \"86b2be61ceaa9a297fed2f210ebff886c8400675af24ac2f782e76a91c9e9631\": container with ID starting with 86b2be61ceaa9a297fed2f210ebff886c8400675af24ac2f782e76a91c9e9631 not found: ID does not exist" Apr 20 16:34:12.202959 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:12.202934 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-plwcc"] Apr 20 16:34:12.204868 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:12.204850 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-plwcc"] Apr 20 16:34:13.463560 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:13.463531 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-d849bbf6d-ftflw"] Apr 20 16:34:13.464007 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:13.463993 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="166a2ba7-c013-4eda-8c86-59df80f62107" containerName="manager" Apr 20 16:34:13.464047 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:13.464009 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="166a2ba7-c013-4eda-8c86-59df80f62107" containerName="manager" Apr 20 16:34:13.464100 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:13.464090 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="166a2ba7-c013-4eda-8c86-59df80f62107" containerName="manager" Apr 20 16:34:13.468631 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:13.468615 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-d849bbf6d-ftflw" Apr 20 16:34:13.471048 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:13.471023 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 16:34:13.471164 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:13.471023 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 16:34:13.471164 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:13.471036 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-sxs7n\"" Apr 20 16:34:13.475961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:13.475925 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-d849bbf6d-ftflw"] Apr 20 16:34:13.540626 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:13.540597 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrjcj\" (UniqueName: \"kubernetes.io/projected/f1e39e3f-1c76-472b-8edc-764c04e5d8e0-kube-api-access-zrjcj\") pod \"maas-api-d849bbf6d-ftflw\" (UID: \"f1e39e3f-1c76-472b-8edc-764c04e5d8e0\") " pod="opendatahub/maas-api-d849bbf6d-ftflw" Apr 20 16:34:13.540626 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:13.540632 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f1e39e3f-1c76-472b-8edc-764c04e5d8e0-maas-api-tls\") pod \"maas-api-d849bbf6d-ftflw\" (UID: \"f1e39e3f-1c76-472b-8edc-764c04e5d8e0\") " pod="opendatahub/maas-api-d849bbf6d-ftflw" Apr 20 16:34:13.642064 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:13.642034 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrjcj\" (UniqueName: \"kubernetes.io/projected/f1e39e3f-1c76-472b-8edc-764c04e5d8e0-kube-api-access-zrjcj\") 
pod \"maas-api-d849bbf6d-ftflw\" (UID: \"f1e39e3f-1c76-472b-8edc-764c04e5d8e0\") " pod="opendatahub/maas-api-d849bbf6d-ftflw" Apr 20 16:34:13.642239 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:13.642132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f1e39e3f-1c76-472b-8edc-764c04e5d8e0-maas-api-tls\") pod \"maas-api-d849bbf6d-ftflw\" (UID: \"f1e39e3f-1c76-472b-8edc-764c04e5d8e0\") " pod="opendatahub/maas-api-d849bbf6d-ftflw" Apr 20 16:34:13.642311 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:34:13.642289 2577 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found Apr 20 16:34:13.642378 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:34:13.642367 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1e39e3f-1c76-472b-8edc-764c04e5d8e0-maas-api-tls podName:f1e39e3f-1c76-472b-8edc-764c04e5d8e0 nodeName:}" failed. No retries permitted until 2026-04-20 16:34:14.142345266 +0000 UTC m=+658.721915267 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/f1e39e3f-1c76-472b-8edc-764c04e5d8e0-maas-api-tls") pod "maas-api-d849bbf6d-ftflw" (UID: "f1e39e3f-1c76-472b-8edc-764c04e5d8e0") : secret "maas-api-serving-cert" not found Apr 20 16:34:13.651133 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:13.651106 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrjcj\" (UniqueName: \"kubernetes.io/projected/f1e39e3f-1c76-472b-8edc-764c04e5d8e0-kube-api-access-zrjcj\") pod \"maas-api-d849bbf6d-ftflw\" (UID: \"f1e39e3f-1c76-472b-8edc-764c04e5d8e0\") " pod="opendatahub/maas-api-d849bbf6d-ftflw" Apr 20 16:34:14.008492 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:14.008465 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="166a2ba7-c013-4eda-8c86-59df80f62107" path="/var/lib/kubelet/pods/166a2ba7-c013-4eda-8c86-59df80f62107/volumes" Apr 20 16:34:14.147394 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:14.147350 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f1e39e3f-1c76-472b-8edc-764c04e5d8e0-maas-api-tls\") pod \"maas-api-d849bbf6d-ftflw\" (UID: \"f1e39e3f-1c76-472b-8edc-764c04e5d8e0\") " pod="opendatahub/maas-api-d849bbf6d-ftflw" Apr 20 16:34:14.149910 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:14.149887 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f1e39e3f-1c76-472b-8edc-764c04e5d8e0-maas-api-tls\") pod \"maas-api-d849bbf6d-ftflw\" (UID: \"f1e39e3f-1c76-472b-8edc-764c04e5d8e0\") " pod="opendatahub/maas-api-d849bbf6d-ftflw" Apr 20 16:34:14.380909 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:14.380874 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-d849bbf6d-ftflw" Apr 20 16:34:14.515076 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:14.515052 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-d849bbf6d-ftflw"] Apr 20 16:34:14.517294 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:34:14.517263 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1e39e3f_1c76_472b_8edc_764c04e5d8e0.slice/crio-554cad9fe0a7e055d5d601f829c2365b1453242a8ed8aa272d2e8269461e839e WatchSource:0}: Error finding container 554cad9fe0a7e055d5d601f829c2365b1453242a8ed8aa272d2e8269461e839e: Status 404 returned error can't find the container with id 554cad9fe0a7e055d5d601f829c2365b1453242a8ed8aa272d2e8269461e839e Apr 20 16:34:15.196406 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:15.196369 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-d849bbf6d-ftflw" event={"ID":"f1e39e3f-1c76-472b-8edc-764c04e5d8e0","Type":"ContainerStarted","Data":"554cad9fe0a7e055d5d601f829c2365b1453242a8ed8aa272d2e8269461e839e"} Apr 20 16:34:16.386941 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:16.386917 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 16:34:17.205021 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:17.204986 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-d849bbf6d-ftflw" event={"ID":"f1e39e3f-1c76-472b-8edc-764c04e5d8e0","Type":"ContainerStarted","Data":"ee06e8fc8292ff3fc6d6df7d190a49893fe59dfe4d2154f811104f73c7d70ca6"} Apr 20 16:34:17.205169 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:17.205052 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-d849bbf6d-ftflw" Apr 20 16:34:17.222727 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:17.222672 2577 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="opendatahub/maas-api-d849bbf6d-ftflw" podStartSLOduration=2.357114394 podStartE2EDuration="4.222658634s" podCreationTimestamp="2026-04-20 16:34:13 +0000 UTC" firstStartedPulling="2026-04-20 16:34:14.518812116 +0000 UTC m=+659.098382125" lastFinishedPulling="2026-04-20 16:34:16.384356368 +0000 UTC m=+660.963926365" observedRunningTime="2026-04-20 16:34:17.220618569 +0000 UTC m=+661.800188611" watchObservedRunningTime="2026-04-20 16:34:17.222658634 +0000 UTC m=+661.802228649" Apr 20 16:34:22.189063 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.189035 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-55c94d5bc8-9tfzn" Apr 20 16:34:22.189454 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.189081 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-fdd784b45-k28mq" Apr 20 16:34:22.248453 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.248414 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-fdd784b45-k28mq"] Apr 20 16:34:22.248713 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.248656 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-fdd784b45-k28mq" podUID="a5276993-be94-403b-b288-d558b61c7cde" containerName="manager" containerID="cri-o://9cca2812c6c31f3426efe3081c20873e2e61b72afd06a2aedececcb292a0bad8" gracePeriod=10 Apr 20 16:34:22.494200 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.494174 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-fdd784b45-k28mq" Apr 20 16:34:22.545707 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.545652 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-856df6f54f-zjt6b"] Apr 20 16:34:22.546125 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.546107 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5276993-be94-403b-b288-d558b61c7cde" containerName="manager" Apr 20 16:34:22.546219 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.546129 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5276993-be94-403b-b288-d558b61c7cde" containerName="manager" Apr 20 16:34:22.546275 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.546251 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a5276993-be94-403b-b288-d558b61c7cde" containerName="manager" Apr 20 16:34:22.549473 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.549450 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-856df6f54f-zjt6b" Apr 20 16:34:22.555982 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.555958 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-856df6f54f-zjt6b"] Apr 20 16:34:22.623960 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.623934 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvpzk\" (UniqueName: \"kubernetes.io/projected/a5276993-be94-403b-b288-d558b61c7cde-kube-api-access-qvpzk\") pod \"a5276993-be94-403b-b288-d558b61c7cde\" (UID: \"a5276993-be94-403b-b288-d558b61c7cde\") " Apr 20 16:34:22.624126 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.624109 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6ht6\" (UniqueName: \"kubernetes.io/projected/bb074fbc-b45f-40f9-884a-d3a0131b2ac4-kube-api-access-b6ht6\") pod \"maas-controller-856df6f54f-zjt6b\" (UID: \"bb074fbc-b45f-40f9-884a-d3a0131b2ac4\") " pod="opendatahub/maas-controller-856df6f54f-zjt6b" Apr 20 16:34:22.626166 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.626143 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5276993-be94-403b-b288-d558b61c7cde-kube-api-access-qvpzk" (OuterVolumeSpecName: "kube-api-access-qvpzk") pod "a5276993-be94-403b-b288-d558b61c7cde" (UID: "a5276993-be94-403b-b288-d558b61c7cde"). InnerVolumeSpecName "kube-api-access-qvpzk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:34:22.725245 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.725215 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6ht6\" (UniqueName: \"kubernetes.io/projected/bb074fbc-b45f-40f9-884a-d3a0131b2ac4-kube-api-access-b6ht6\") pod \"maas-controller-856df6f54f-zjt6b\" (UID: \"bb074fbc-b45f-40f9-884a-d3a0131b2ac4\") " pod="opendatahub/maas-controller-856df6f54f-zjt6b" Apr 20 16:34:22.725367 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.725312 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qvpzk\" (UniqueName: \"kubernetes.io/projected/a5276993-be94-403b-b288-d558b61c7cde-kube-api-access-qvpzk\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\"" Apr 20 16:34:22.733557 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.733500 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6ht6\" (UniqueName: \"kubernetes.io/projected/bb074fbc-b45f-40f9-884a-d3a0131b2ac4-kube-api-access-b6ht6\") pod \"maas-controller-856df6f54f-zjt6b\" (UID: \"bb074fbc-b45f-40f9-884a-d3a0131b2ac4\") " pod="opendatahub/maas-controller-856df6f54f-zjt6b" Apr 20 16:34:22.861351 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.861324 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-856df6f54f-zjt6b" Apr 20 16:34:22.983299 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.983274 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-856df6f54f-zjt6b"] Apr 20 16:34:22.985626 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:34:22.985558 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb074fbc_b45f_40f9_884a_d3a0131b2ac4.slice/crio-0697ba5aeeff580f6f0a183c6c975a9be9cf883f8e235d2a773ce8cd81e538a6 WatchSource:0}: Error finding container 0697ba5aeeff580f6f0a183c6c975a9be9cf883f8e235d2a773ce8cd81e538a6: Status 404 returned error can't find the container with id 0697ba5aeeff580f6f0a183c6c975a9be9cf883f8e235d2a773ce8cd81e538a6 Apr 20 16:34:22.986907 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:22.986890 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 16:34:23.214555 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:23.214530 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-d849bbf6d-ftflw" Apr 20 16:34:23.226943 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:23.226905 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-856df6f54f-zjt6b" event={"ID":"bb074fbc-b45f-40f9-884a-d3a0131b2ac4","Type":"ContainerStarted","Data":"0697ba5aeeff580f6f0a183c6c975a9be9cf883f8e235d2a773ce8cd81e538a6"} Apr 20 16:34:23.227998 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:23.227974 2577 generic.go:358] "Generic (PLEG): container finished" podID="a5276993-be94-403b-b288-d558b61c7cde" containerID="9cca2812c6c31f3426efe3081c20873e2e61b72afd06a2aedececcb292a0bad8" exitCode=0 Apr 20 16:34:23.228118 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:23.228029 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-fdd784b45-k28mq" Apr 20 16:34:23.228118 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:23.228065 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-fdd784b45-k28mq" event={"ID":"a5276993-be94-403b-b288-d558b61c7cde","Type":"ContainerDied","Data":"9cca2812c6c31f3426efe3081c20873e2e61b72afd06a2aedececcb292a0bad8"} Apr 20 16:34:23.228118 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:23.228102 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-fdd784b45-k28mq" event={"ID":"a5276993-be94-403b-b288-d558b61c7cde","Type":"ContainerDied","Data":"6cde3a531b5f6c3a686af611be398a87ed299f95f1a1686cb4cec5f91fc47377"} Apr 20 16:34:23.228118 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:23.228117 2577 scope.go:117] "RemoveContainer" containerID="9cca2812c6c31f3426efe3081c20873e2e61b72afd06a2aedececcb292a0bad8" Apr 20 16:34:23.237921 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:23.237864 2577 scope.go:117] "RemoveContainer" containerID="9cca2812c6c31f3426efe3081c20873e2e61b72afd06a2aedececcb292a0bad8" Apr 20 16:34:23.238203 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:34:23.238185 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cca2812c6c31f3426efe3081c20873e2e61b72afd06a2aedececcb292a0bad8\": container with ID starting with 9cca2812c6c31f3426efe3081c20873e2e61b72afd06a2aedececcb292a0bad8 not found: ID does not exist" containerID="9cca2812c6c31f3426efe3081c20873e2e61b72afd06a2aedececcb292a0bad8" Apr 20 16:34:23.238254 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:23.238211 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cca2812c6c31f3426efe3081c20873e2e61b72afd06a2aedececcb292a0bad8"} err="failed to get container status \"9cca2812c6c31f3426efe3081c20873e2e61b72afd06a2aedececcb292a0bad8\": rpc error: code 
= NotFound desc = could not find container \"9cca2812c6c31f3426efe3081c20873e2e61b72afd06a2aedececcb292a0bad8\": container with ID starting with 9cca2812c6c31f3426efe3081c20873e2e61b72afd06a2aedececcb292a0bad8 not found: ID does not exist" Apr 20 16:34:23.254726 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:23.254667 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-fdd784b45-k28mq"] Apr 20 16:34:23.258363 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:23.258342 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-fdd784b45-k28mq"] Apr 20 16:34:24.009309 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:24.009281 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5276993-be94-403b-b288-d558b61c7cde" path="/var/lib/kubelet/pods/a5276993-be94-403b-b288-d558b61c7cde/volumes" Apr 20 16:34:24.232907 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:24.232864 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-856df6f54f-zjt6b" event={"ID":"bb074fbc-b45f-40f9-884a-d3a0131b2ac4","Type":"ContainerStarted","Data":"a18371414f7d7f213a9941fd5f8ca8d64a14060e5901ea500019cfb2e7135eb7"} Apr 20 16:34:24.233322 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:24.232963 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-856df6f54f-zjt6b" Apr 20 16:34:24.249128 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:24.249087 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-856df6f54f-zjt6b" podStartSLOduration=1.902563016 podStartE2EDuration="2.249074032s" podCreationTimestamp="2026-04-20 16:34:22 +0000 UTC" firstStartedPulling="2026-04-20 16:34:22.987018381 +0000 UTC m=+667.566588374" lastFinishedPulling="2026-04-20 16:34:23.333529384 +0000 UTC m=+667.913099390" observedRunningTime="2026-04-20 16:34:24.247869251 +0000 UTC m=+668.827439265" 
watchObservedRunningTime="2026-04-20 16:34:24.249074032 +0000 UTC m=+668.828644047" Apr 20 16:34:35.242101 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:35.242075 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-856df6f54f-zjt6b" Apr 20 16:34:35.280586 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:35.280542 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-55c94d5bc8-9tfzn"] Apr 20 16:34:35.280833 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:35.280808 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-55c94d5bc8-9tfzn" podUID="18794aa7-8eb3-42f5-a265-547536b9c363" containerName="manager" containerID="cri-o://d4315f466d298a81f6481d5f97521dad1730381b685cef80d897b49c43dc0e52" gracePeriod=10 Apr 20 16:34:35.524458 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:35.524436 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-55c94d5bc8-9tfzn" Apr 20 16:34:35.531291 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:35.531271 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnz5g\" (UniqueName: \"kubernetes.io/projected/18794aa7-8eb3-42f5-a265-547536b9c363-kube-api-access-nnz5g\") pod \"18794aa7-8eb3-42f5-a265-547536b9c363\" (UID: \"18794aa7-8eb3-42f5-a265-547536b9c363\") " Apr 20 16:34:35.533451 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:35.533427 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18794aa7-8eb3-42f5-a265-547536b9c363-kube-api-access-nnz5g" (OuterVolumeSpecName: "kube-api-access-nnz5g") pod "18794aa7-8eb3-42f5-a265-547536b9c363" (UID: "18794aa7-8eb3-42f5-a265-547536b9c363"). InnerVolumeSpecName "kube-api-access-nnz5g". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:34:35.632133 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:35.632102 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nnz5g\" (UniqueName: \"kubernetes.io/projected/18794aa7-8eb3-42f5-a265-547536b9c363-kube-api-access-nnz5g\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\"" Apr 20 16:34:36.275356 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:36.275270 2577 generic.go:358] "Generic (PLEG): container finished" podID="18794aa7-8eb3-42f5-a265-547536b9c363" containerID="d4315f466d298a81f6481d5f97521dad1730381b685cef80d897b49c43dc0e52" exitCode=0 Apr 20 16:34:36.275356 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:36.275320 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-55c94d5bc8-9tfzn" event={"ID":"18794aa7-8eb3-42f5-a265-547536b9c363","Type":"ContainerDied","Data":"d4315f466d298a81f6481d5f97521dad1730381b685cef80d897b49c43dc0e52"} Apr 20 16:34:36.275825 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:36.275362 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-55c94d5bc8-9tfzn" event={"ID":"18794aa7-8eb3-42f5-a265-547536b9c363","Type":"ContainerDied","Data":"8be79bad731861cb47b3e0aa13255b125e7579092e6f2a34738988bd914b5e2c"} Apr 20 16:34:36.275825 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:36.275379 2577 scope.go:117] "RemoveContainer" containerID="d4315f466d298a81f6481d5f97521dad1730381b685cef80d897b49c43dc0e52" Apr 20 16:34:36.275825 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:36.275339 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-55c94d5bc8-9tfzn" Apr 20 16:34:36.284034 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:36.283874 2577 scope.go:117] "RemoveContainer" containerID="d4315f466d298a81f6481d5f97521dad1730381b685cef80d897b49c43dc0e52" Apr 20 16:34:36.284142 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:34:36.284125 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4315f466d298a81f6481d5f97521dad1730381b685cef80d897b49c43dc0e52\": container with ID starting with d4315f466d298a81f6481d5f97521dad1730381b685cef80d897b49c43dc0e52 not found: ID does not exist" containerID="d4315f466d298a81f6481d5f97521dad1730381b685cef80d897b49c43dc0e52" Apr 20 16:34:36.284183 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:36.284150 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4315f466d298a81f6481d5f97521dad1730381b685cef80d897b49c43dc0e52"} err="failed to get container status \"d4315f466d298a81f6481d5f97521dad1730381b685cef80d897b49c43dc0e52\": rpc error: code = NotFound desc = could not find container \"d4315f466d298a81f6481d5f97521dad1730381b685cef80d897b49c43dc0e52\": container with ID starting with d4315f466d298a81f6481d5f97521dad1730381b685cef80d897b49c43dc0e52 not found: ID does not exist" Apr 20 16:34:36.293774 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:36.293747 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-55c94d5bc8-9tfzn"] Apr 20 16:34:36.296870 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:36.296850 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-55c94d5bc8-9tfzn"] Apr 20 16:34:36.583690 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:36.583599 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 20 16:34:36.583863 ip-10-0-135-200 kubenswrapper[2577]: I0420 
16:34:36.583832 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="keycloak-system/maas-keycloak-0" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" containerID="cri-o://8c2cbe7447f19eebf7ff0cca55e01e3699e39280e253b6a6ee3ad907c0709485" gracePeriod=30 Apr 20 16:34:38.008766 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.008735 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18794aa7-8eb3-42f5-a265-547536b9c363" path="/var/lib/kubelet/pods/18794aa7-8eb3-42f5-a265-547536b9c363/volumes" Apr 20 16:34:38.234024 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.234004 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 20 16:34:38.254770 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.254188 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zt6s\" (UniqueName: \"kubernetes.io/projected/23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769-kube-api-access-2zt6s\") pod \"23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769\" (UID: \"23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769\") " Apr 20 16:34:38.261624 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.261590 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769-kube-api-access-2zt6s" (OuterVolumeSpecName: "kube-api-access-2zt6s") pod "23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" (UID: "23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769"). InnerVolumeSpecName "kube-api-access-2zt6s". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:34:38.287200 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.287163 2577 generic.go:358] "Generic (PLEG): container finished" podID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerID="8c2cbe7447f19eebf7ff0cca55e01e3699e39280e253b6a6ee3ad907c0709485" exitCode=143 Apr 20 16:34:38.287362 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.287319 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769","Type":"ContainerDied","Data":"8c2cbe7447f19eebf7ff0cca55e01e3699e39280e253b6a6ee3ad907c0709485"} Apr 20 16:34:38.287420 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.287362 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769","Type":"ContainerDied","Data":"a3c652c6cb2af844d4c649f7fbcca4f8de8820b6157acd77d89434891b710ad0"} Apr 20 16:34:38.287420 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.287380 2577 scope.go:117] "RemoveContainer" containerID="8c2cbe7447f19eebf7ff0cca55e01e3699e39280e253b6a6ee3ad907c0709485" Apr 20 16:34:38.287420 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.287333 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 20 16:34:38.298293 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.298272 2577 scope.go:117] "RemoveContainer" containerID="8c2cbe7447f19eebf7ff0cca55e01e3699e39280e253b6a6ee3ad907c0709485" Apr 20 16:34:38.298549 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:34:38.298530 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c2cbe7447f19eebf7ff0cca55e01e3699e39280e253b6a6ee3ad907c0709485\": container with ID starting with 8c2cbe7447f19eebf7ff0cca55e01e3699e39280e253b6a6ee3ad907c0709485 not found: ID does not exist" containerID="8c2cbe7447f19eebf7ff0cca55e01e3699e39280e253b6a6ee3ad907c0709485" Apr 20 16:34:38.298612 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.298560 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c2cbe7447f19eebf7ff0cca55e01e3699e39280e253b6a6ee3ad907c0709485"} err="failed to get container status \"8c2cbe7447f19eebf7ff0cca55e01e3699e39280e253b6a6ee3ad907c0709485\": rpc error: code = NotFound desc = could not find container \"8c2cbe7447f19eebf7ff0cca55e01e3699e39280e253b6a6ee3ad907c0709485\": container with ID starting with 8c2cbe7447f19eebf7ff0cca55e01e3699e39280e253b6a6ee3ad907c0709485 not found: ID does not exist" Apr 20 16:34:38.310044 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.310024 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 20 16:34:38.313831 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.313812 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 20 16:34:38.334181 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.334157 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 20 16:34:38.334640 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.334622 2577 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" Apr 20 16:34:38.334713 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.334644 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" Apr 20 16:34:38.334713 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.334699 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18794aa7-8eb3-42f5-a265-547536b9c363" containerName="manager" Apr 20 16:34:38.334713 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.334708 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="18794aa7-8eb3-42f5-a265-547536b9c363" containerName="manager" Apr 20 16:34:38.334824 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.334796 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" containerName="keycloak" Apr 20 16:34:38.334824 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.334814 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="18794aa7-8eb3-42f5-a265-547536b9c363" containerName="manager" Apr 20 16:34:38.361336 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.358350 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2zt6s\" (UniqueName: \"kubernetes.io/projected/23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769-kube-api-access-2zt6s\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\"" Apr 20 16:34:38.361479 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.359016 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 20 16:34:38.361479 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.359146 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 20 16:34:38.364464 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.364441 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 20 16:34:38.365172 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.364723 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 20 16:34:38.365172 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.364929 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\"" Apr 20 16:34:38.365172 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.364954 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-jh8rh\"" Apr 20 16:34:38.365172 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.365126 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"keycloak-test-realms\"" Apr 20 16:34:38.461982 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.461953 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/2a5cdc57-229e-4991-a0e3-60e772dd9062-test-realms\") pod \"maas-keycloak-0\" (UID: \"2a5cdc57-229e-4991-a0e3-60e772dd9062\") " pod="keycloak-system/maas-keycloak-0" Apr 20 16:34:38.462130 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.461990 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5vj7\" (UniqueName: \"kubernetes.io/projected/2a5cdc57-229e-4991-a0e3-60e772dd9062-kube-api-access-t5vj7\") pod \"maas-keycloak-0\" (UID: \"2a5cdc57-229e-4991-a0e3-60e772dd9062\") " pod="keycloak-system/maas-keycloak-0" Apr 20 16:34:38.562928 ip-10-0-135-200 kubenswrapper[2577]: I0420 
16:34:38.562853 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/2a5cdc57-229e-4991-a0e3-60e772dd9062-test-realms\") pod \"maas-keycloak-0\" (UID: \"2a5cdc57-229e-4991-a0e3-60e772dd9062\") " pod="keycloak-system/maas-keycloak-0" Apr 20 16:34:38.562928 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.562888 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5vj7\" (UniqueName: \"kubernetes.io/projected/2a5cdc57-229e-4991-a0e3-60e772dd9062-kube-api-access-t5vj7\") pod \"maas-keycloak-0\" (UID: \"2a5cdc57-229e-4991-a0e3-60e772dd9062\") " pod="keycloak-system/maas-keycloak-0" Apr 20 16:34:38.563476 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.563454 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/2a5cdc57-229e-4991-a0e3-60e772dd9062-test-realms\") pod \"maas-keycloak-0\" (UID: \"2a5cdc57-229e-4991-a0e3-60e772dd9062\") " pod="keycloak-system/maas-keycloak-0" Apr 20 16:34:38.570893 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.570874 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5vj7\" (UniqueName: \"kubernetes.io/projected/2a5cdc57-229e-4991-a0e3-60e772dd9062-kube-api-access-t5vj7\") pod \"maas-keycloak-0\" (UID: \"2a5cdc57-229e-4991-a0e3-60e772dd9062\") " pod="keycloak-system/maas-keycloak-0" Apr 20 16:34:38.674934 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.674905 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 20 16:34:38.797438 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:38.797411 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 20 16:34:38.799924 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:34:38.799888 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a5cdc57_229e_4991_a0e3_60e772dd9062.slice/crio-bab41ae9ea7b0d034af31e05bd7c42bd670448c63dd3b251ba1c29572ff0a26d WatchSource:0}: Error finding container bab41ae9ea7b0d034af31e05bd7c42bd670448c63dd3b251ba1c29572ff0a26d: Status 404 returned error can't find the container with id bab41ae9ea7b0d034af31e05bd7c42bd670448c63dd3b251ba1c29572ff0a26d Apr 20 16:34:39.292282 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:39.292232 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"2a5cdc57-229e-4991-a0e3-60e772dd9062","Type":"ContainerStarted","Data":"bab41ae9ea7b0d034af31e05bd7c42bd670448c63dd3b251ba1c29572ff0a26d"} Apr 20 16:34:40.008653 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:40.008613 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769" path="/var/lib/kubelet/pods/23fb3cd6-e2b7-4ac1-8d8f-063eafd6c769/volumes" Apr 20 16:34:40.299739 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:40.299622 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"2a5cdc57-229e-4991-a0e3-60e772dd9062","Type":"ContainerStarted","Data":"65e550204fdffdbd850c75038a207dda5530eb46858e10fa8ba0de3e9fa49265"} Apr 20 16:34:40.675519 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:40.675477 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 20 16:34:40.677250 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:40.677210 
2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2a5cdc57-229e-4991-a0e3-60e772dd9062" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.46:9000/health/started\": dial tcp 10.133.0.46:9000: connect: connection refused" Apr 20 16:34:41.675590 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:41.675537 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2a5cdc57-229e-4991-a0e3-60e772dd9062" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.46:9000/health/started\": dial tcp 10.133.0.46:9000: connect: connection refused" Apr 20 16:34:42.675788 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:42.675739 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2a5cdc57-229e-4991-a0e3-60e772dd9062" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.46:9000/health/started\": dial tcp 10.133.0.46:9000: connect: connection refused" Apr 20 16:34:43.675431 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:43.675383 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2a5cdc57-229e-4991-a0e3-60e772dd9062" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.46:9000/health/started\": dial tcp 10.133.0.46:9000: connect: connection refused" Apr 20 16:34:44.676453 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:44.676394 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2a5cdc57-229e-4991-a0e3-60e772dd9062" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.46:9000/health/started\": dial tcp 10.133.0.46:9000: connect: connection refused" Apr 20 16:34:45.676173 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:45.676125 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" 
podUID="2a5cdc57-229e-4991-a0e3-60e772dd9062" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.46:9000/health/started\": dial tcp 10.133.0.46:9000: connect: connection refused" Apr 20 16:34:46.676208 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:46.676151 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2a5cdc57-229e-4991-a0e3-60e772dd9062" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.46:9000/health/started\": dial tcp 10.133.0.46:9000: connect: connection refused" Apr 20 16:34:47.676175 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:47.676132 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2a5cdc57-229e-4991-a0e3-60e772dd9062" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.46:9000/health/started\": dial tcp 10.133.0.46:9000: connect: connection refused" Apr 20 16:34:48.676265 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:48.675597 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" Apr 20 16:34:48.676265 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:48.675891 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2a5cdc57-229e-4991-a0e3-60e772dd9062" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.46:9000/health/started\": dial tcp 10.133.0.46:9000: connect: connection refused" Apr 20 16:34:49.675749 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:49.675697 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2a5cdc57-229e-4991-a0e3-60e772dd9062" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.46:9000/health/started\": dial tcp 10.133.0.46:9000: connect: connection refused" Apr 20 16:34:50.676146 ip-10-0-135-200 kubenswrapper[2577]: I0420 
16:34:50.676097 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2a5cdc57-229e-4991-a0e3-60e772dd9062" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.46:9000/health/started\": dial tcp 10.133.0.46:9000: connect: connection refused"
Apr 20 16:34:51.675814 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:51.675762 2577 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="2a5cdc57-229e-4991-a0e3-60e772dd9062" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.46:9000/health/started\": dial tcp 10.133.0.46:9000: connect: connection refused"
Apr 20 16:34:52.106499 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:52.106385 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=13.596763863 podStartE2EDuration="14.106323115s" podCreationTimestamp="2026-04-20 16:34:38 +0000 UTC" firstStartedPulling="2026-04-20 16:34:38.801261664 +0000 UTC m=+683.380831658" lastFinishedPulling="2026-04-20 16:34:39.310820913 +0000 UTC m=+683.890390910" observedRunningTime="2026-04-20 16:34:40.319120808 +0000 UTC m=+684.898690824" watchObservedRunningTime="2026-04-20 16:34:52.106323115 +0000 UTC m=+696.685893131"
Apr 20 16:34:52.109088 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:52.109060 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-d849bbf6d-ftflw"]
Apr 20 16:34:52.109542 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:52.109508 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-d849bbf6d-ftflw" podUID="f1e39e3f-1c76-472b-8edc-764c04e5d8e0" containerName="maas-api" containerID="cri-o://ee06e8fc8292ff3fc6d6df7d190a49893fe59dfe4d2154f811104f73c7d70ca6" gracePeriod=30
Apr 20 16:34:52.365920 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:52.365892 2577 generic.go:358] "Generic (PLEG): container finished" podID="f1e39e3f-1c76-472b-8edc-764c04e5d8e0" containerID="ee06e8fc8292ff3fc6d6df7d190a49893fe59dfe4d2154f811104f73c7d70ca6" exitCode=0
Apr 20 16:34:52.366050 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:52.365976 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-d849bbf6d-ftflw" event={"ID":"f1e39e3f-1c76-472b-8edc-764c04e5d8e0","Type":"ContainerDied","Data":"ee06e8fc8292ff3fc6d6df7d190a49893fe59dfe4d2154f811104f73c7d70ca6"}
Apr 20 16:34:52.390272 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:52.390245 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-d849bbf6d-ftflw"
Apr 20 16:34:52.505915 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:52.505868 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f1e39e3f-1c76-472b-8edc-764c04e5d8e0-maas-api-tls\") pod \"f1e39e3f-1c76-472b-8edc-764c04e5d8e0\" (UID: \"f1e39e3f-1c76-472b-8edc-764c04e5d8e0\") "
Apr 20 16:34:52.506081 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:52.505981 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrjcj\" (UniqueName: \"kubernetes.io/projected/f1e39e3f-1c76-472b-8edc-764c04e5d8e0-kube-api-access-zrjcj\") pod \"f1e39e3f-1c76-472b-8edc-764c04e5d8e0\" (UID: \"f1e39e3f-1c76-472b-8edc-764c04e5d8e0\") "
Apr 20 16:34:52.508165 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:52.508131 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1e39e3f-1c76-472b-8edc-764c04e5d8e0-kube-api-access-zrjcj" (OuterVolumeSpecName: "kube-api-access-zrjcj") pod "f1e39e3f-1c76-472b-8edc-764c04e5d8e0" (UID: "f1e39e3f-1c76-472b-8edc-764c04e5d8e0"). InnerVolumeSpecName "kube-api-access-zrjcj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:34:52.508165 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:52.508152 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1e39e3f-1c76-472b-8edc-764c04e5d8e0-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "f1e39e3f-1c76-472b-8edc-764c04e5d8e0" (UID: "f1e39e3f-1c76-472b-8edc-764c04e5d8e0"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 16:34:52.607206 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:52.607155 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zrjcj\" (UniqueName: \"kubernetes.io/projected/f1e39e3f-1c76-472b-8edc-764c04e5d8e0-kube-api-access-zrjcj\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:34:52.607206 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:52.607202 2577 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f1e39e3f-1c76-472b-8edc-764c04e5d8e0-maas-api-tls\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:34:52.769295 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:52.769258 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0"
Apr 20 16:34:52.785834 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:52.785785 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="2a5cdc57-229e-4991-a0e3-60e772dd9062" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 16:34:53.370664 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:53.370622 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-d849bbf6d-ftflw" event={"ID":"f1e39e3f-1c76-472b-8edc-764c04e5d8e0","Type":"ContainerDied","Data":"554cad9fe0a7e055d5d601f829c2365b1453242a8ed8aa272d2e8269461e839e"}
Apr 20 16:34:53.370664 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:53.370665 2577 scope.go:117] "RemoveContainer" containerID="ee06e8fc8292ff3fc6d6df7d190a49893fe59dfe4d2154f811104f73c7d70ca6"
Apr 20 16:34:53.371248 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:53.370803 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-d849bbf6d-ftflw"
Apr 20 16:34:53.395193 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:53.395153 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-d849bbf6d-ftflw"]
Apr 20 16:34:53.398010 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:53.397983 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-d849bbf6d-ftflw"]
Apr 20 16:34:54.009065 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:34:54.009032 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1e39e3f-1c76-472b-8edc-764c04e5d8e0" path="/var/lib/kubelet/pods/f1e39e3f-1c76-472b-8edc-764c04e5d8e0/volumes"
Apr 20 16:35:02.776263 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:02.776227 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0"
Apr 20 16:35:13.466363 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.466292 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-dbbb77764-zcs62"]
Apr 20 16:35:13.466810 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.466790 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1e39e3f-1c76-472b-8edc-764c04e5d8e0" containerName="maas-api"
Apr 20 16:35:13.466894 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.466812 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1e39e3f-1c76-472b-8edc-764c04e5d8e0" containerName="maas-api"
Apr 20 16:35:13.466935 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.466899 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1e39e3f-1c76-472b-8edc-764c04e5d8e0" containerName="maas-api"
Apr 20 16:35:13.470028 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.470014 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-dbbb77764-zcs62"
Apr 20 16:35:13.472438 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.472417 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"authorino-oidc-ca\""
Apr 20 16:35:13.475899 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.475879 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-dbbb77764-zcs62"]
Apr 20 16:35:13.606736 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.606669 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/4e6493f5-fe1e-4aab-8b49-866aea075e8c-tls-cert\") pod \"authorino-dbbb77764-zcs62\" (UID: \"4e6493f5-fe1e-4aab-8b49-866aea075e8c\") " pod="kuadrant-system/authorino-dbbb77764-zcs62"
Apr 20 16:35:13.606736 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.606734 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrch5\" (UniqueName: \"kubernetes.io/projected/4e6493f5-fe1e-4aab-8b49-866aea075e8c-kube-api-access-hrch5\") pod \"authorino-dbbb77764-zcs62\" (UID: \"4e6493f5-fe1e-4aab-8b49-866aea075e8c\") " pod="kuadrant-system/authorino-dbbb77764-zcs62"
Apr 20 16:35:13.606942 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.606804 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/4e6493f5-fe1e-4aab-8b49-866aea075e8c-oidc-ca\") pod \"authorino-dbbb77764-zcs62\" (UID: \"4e6493f5-fe1e-4aab-8b49-866aea075e8c\") " pod="kuadrant-system/authorino-dbbb77764-zcs62"
Apr 20 16:35:13.707823 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.707787 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/4e6493f5-fe1e-4aab-8b49-866aea075e8c-oidc-ca\") pod \"authorino-dbbb77764-zcs62\" (UID: \"4e6493f5-fe1e-4aab-8b49-866aea075e8c\") " pod="kuadrant-system/authorino-dbbb77764-zcs62"
Apr 20 16:35:13.707996 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.707911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/4e6493f5-fe1e-4aab-8b49-866aea075e8c-tls-cert\") pod \"authorino-dbbb77764-zcs62\" (UID: \"4e6493f5-fe1e-4aab-8b49-866aea075e8c\") " pod="kuadrant-system/authorino-dbbb77764-zcs62"
Apr 20 16:35:13.707996 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.707943 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrch5\" (UniqueName: \"kubernetes.io/projected/4e6493f5-fe1e-4aab-8b49-866aea075e8c-kube-api-access-hrch5\") pod \"authorino-dbbb77764-zcs62\" (UID: \"4e6493f5-fe1e-4aab-8b49-866aea075e8c\") " pod="kuadrant-system/authorino-dbbb77764-zcs62"
Apr 20 16:35:13.708368 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.708349 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/4e6493f5-fe1e-4aab-8b49-866aea075e8c-oidc-ca\") pod \"authorino-dbbb77764-zcs62\" (UID: \"4e6493f5-fe1e-4aab-8b49-866aea075e8c\") " pod="kuadrant-system/authorino-dbbb77764-zcs62"
Apr 20 16:35:13.710390 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.710373 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/4e6493f5-fe1e-4aab-8b49-866aea075e8c-tls-cert\") pod \"authorino-dbbb77764-zcs62\" (UID: \"4e6493f5-fe1e-4aab-8b49-866aea075e8c\") " pod="kuadrant-system/authorino-dbbb77764-zcs62"
Apr 20 16:35:13.718896 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.718837 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrch5\" (UniqueName: \"kubernetes.io/projected/4e6493f5-fe1e-4aab-8b49-866aea075e8c-kube-api-access-hrch5\") pod \"authorino-dbbb77764-zcs62\" (UID: \"4e6493f5-fe1e-4aab-8b49-866aea075e8c\") " pod="kuadrant-system/authorino-dbbb77764-zcs62"
Apr 20 16:35:13.779988 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.779966 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-dbbb77764-zcs62"
Apr 20 16:35:13.909249 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:13.909218 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-dbbb77764-zcs62"]
Apr 20 16:35:13.914988 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:35:13.914943 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e6493f5_fe1e_4aab_8b49_866aea075e8c.slice/crio-ab24619b3f5147b8e92d682795552cff5fcf39c610c69b44607d695c4f8c20c0 WatchSource:0}: Error finding container ab24619b3f5147b8e92d682795552cff5fcf39c610c69b44607d695c4f8c20c0: Status 404 returned error can't find the container with id ab24619b3f5147b8e92d682795552cff5fcf39c610c69b44607d695c4f8c20c0
Apr 20 16:35:14.452865 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:14.452825 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-dbbb77764-zcs62" event={"ID":"4e6493f5-fe1e-4aab-8b49-866aea075e8c","Type":"ContainerStarted","Data":"87d1ac81429bb92686c059f39db36be894cf4f96e31832a61c92331d9440db6f"}
Apr 20 16:35:14.453022 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:14.452875 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-dbbb77764-zcs62" event={"ID":"4e6493f5-fe1e-4aab-8b49-866aea075e8c","Type":"ContainerStarted","Data":"ab24619b3f5147b8e92d682795552cff5fcf39c610c69b44607d695c4f8c20c0"}
Apr 20 16:35:14.470847 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:14.470775 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-dbbb77764-zcs62" podStartSLOduration=1.052845553 podStartE2EDuration="1.470752688s" podCreationTimestamp="2026-04-20 16:35:13 +0000 UTC" firstStartedPulling="2026-04-20 16:35:13.916654633 +0000 UTC m=+718.496224627" lastFinishedPulling="2026-04-20 16:35:14.334561767 +0000 UTC m=+718.914131762" observedRunningTime="2026-04-20 16:35:14.467699213 +0000 UTC m=+719.047269225" watchObservedRunningTime="2026-04-20 16:35:14.470752688 +0000 UTC m=+719.050322704"
Apr 20 16:35:14.495364 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:14.495330 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-54dd9fb6dd-6msg5"]
Apr 20 16:35:14.495545 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:14.495524 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-54dd9fb6dd-6msg5" podUID="221ebcdf-13db-4c93-b77a-b55c7a5ccb8a" containerName="authorino" containerID="cri-o://d46d50a7b48473cb04c679a5fea1137d8b58284fedafc94271fe6063b212934d" gracePeriod=30
Apr 20 16:35:14.736061 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:14.736040 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-54dd9fb6dd-6msg5"
Apr 20 16:35:14.919098 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:14.919066 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr6fh\" (UniqueName: \"kubernetes.io/projected/221ebcdf-13db-4c93-b77a-b55c7a5ccb8a-kube-api-access-gr6fh\") pod \"221ebcdf-13db-4c93-b77a-b55c7a5ccb8a\" (UID: \"221ebcdf-13db-4c93-b77a-b55c7a5ccb8a\") "
Apr 20 16:35:14.919282 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:14.919150 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/221ebcdf-13db-4c93-b77a-b55c7a5ccb8a-tls-cert\") pod \"221ebcdf-13db-4c93-b77a-b55c7a5ccb8a\" (UID: \"221ebcdf-13db-4c93-b77a-b55c7a5ccb8a\") "
Apr 20 16:35:14.921628 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:14.921586 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/221ebcdf-13db-4c93-b77a-b55c7a5ccb8a-kube-api-access-gr6fh" (OuterVolumeSpecName: "kube-api-access-gr6fh") pod "221ebcdf-13db-4c93-b77a-b55c7a5ccb8a" (UID: "221ebcdf-13db-4c93-b77a-b55c7a5ccb8a"). InnerVolumeSpecName "kube-api-access-gr6fh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:35:14.930243 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:14.930217 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/221ebcdf-13db-4c93-b77a-b55c7a5ccb8a-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "221ebcdf-13db-4c93-b77a-b55c7a5ccb8a" (UID: "221ebcdf-13db-4c93-b77a-b55c7a5ccb8a"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 16:35:15.020589 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:15.020540 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gr6fh\" (UniqueName: \"kubernetes.io/projected/221ebcdf-13db-4c93-b77a-b55c7a5ccb8a-kube-api-access-gr6fh\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:35:15.020589 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:15.020583 2577 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/221ebcdf-13db-4c93-b77a-b55c7a5ccb8a-tls-cert\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:35:15.457140 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:15.457109 2577 generic.go:358] "Generic (PLEG): container finished" podID="221ebcdf-13db-4c93-b77a-b55c7a5ccb8a" containerID="d46d50a7b48473cb04c679a5fea1137d8b58284fedafc94271fe6063b212934d" exitCode=0
Apr 20 16:35:15.457310 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:15.457158 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-54dd9fb6dd-6msg5"
Apr 20 16:35:15.457310 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:15.457175 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-54dd9fb6dd-6msg5" event={"ID":"221ebcdf-13db-4c93-b77a-b55c7a5ccb8a","Type":"ContainerDied","Data":"d46d50a7b48473cb04c679a5fea1137d8b58284fedafc94271fe6063b212934d"}
Apr 20 16:35:15.457310 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:15.457209 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-54dd9fb6dd-6msg5" event={"ID":"221ebcdf-13db-4c93-b77a-b55c7a5ccb8a","Type":"ContainerDied","Data":"d4074b6cfdc9c4deba063a1f440a5a3f09d4245f959230fcf4896a7acbb84634"}
Apr 20 16:35:15.457310 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:15.457228 2577 scope.go:117] "RemoveContainer" containerID="d46d50a7b48473cb04c679a5fea1137d8b58284fedafc94271fe6063b212934d"
Apr 20 16:35:15.466255 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:15.466237 2577 scope.go:117] "RemoveContainer" containerID="d46d50a7b48473cb04c679a5fea1137d8b58284fedafc94271fe6063b212934d"
Apr 20 16:35:15.466496 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:35:15.466477 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d46d50a7b48473cb04c679a5fea1137d8b58284fedafc94271fe6063b212934d\": container with ID starting with d46d50a7b48473cb04c679a5fea1137d8b58284fedafc94271fe6063b212934d not found: ID does not exist" containerID="d46d50a7b48473cb04c679a5fea1137d8b58284fedafc94271fe6063b212934d"
Apr 20 16:35:15.466548 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:15.466503 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d46d50a7b48473cb04c679a5fea1137d8b58284fedafc94271fe6063b212934d"} err="failed to get container status \"d46d50a7b48473cb04c679a5fea1137d8b58284fedafc94271fe6063b212934d\": rpc error: code = NotFound desc = could not find container \"d46d50a7b48473cb04c679a5fea1137d8b58284fedafc94271fe6063b212934d\": container with ID starting with d46d50a7b48473cb04c679a5fea1137d8b58284fedafc94271fe6063b212934d not found: ID does not exist"
Apr 20 16:35:15.478300 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:15.478280 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-54dd9fb6dd-6msg5"]
Apr 20 16:35:15.483833 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:15.483814 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-54dd9fb6dd-6msg5"]
Apr 20 16:35:16.009227 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:16.009198 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="221ebcdf-13db-4c93-b77a-b55c7a5ccb8a" path="/var/lib/kubelet/pods/221ebcdf-13db-4c93-b77a-b55c7a5ccb8a/volumes"
Apr 20 16:35:28.480797 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.480765 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"]
Apr 20 16:35:28.481305 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.481251 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="221ebcdf-13db-4c93-b77a-b55c7a5ccb8a" containerName="authorino"
Apr 20 16:35:28.481305 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.481266 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="221ebcdf-13db-4c93-b77a-b55c7a5ccb8a" containerName="authorino"
Apr 20 16:35:28.481405 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.481345 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="221ebcdf-13db-4c93-b77a-b55c7a5ccb8a" containerName="authorino"
Apr 20 16:35:28.486692 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.486662 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.491729 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.491481 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-gr55f\""
Apr 20 16:35:28.491729 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.491481 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 20 16:35:28.491729 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.491536 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\""
Apr 20 16:35:28.491729 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.491484 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 20 16:35:28.494743 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.494711 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"]
Apr 20 16:35:28.520164 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.520131 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f0082a39-7641-463d-a7fc-199411456274-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.520309 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.520186 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0082a39-7641-463d-a7fc-199411456274-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.520309 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.520216 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f0082a39-7641-463d-a7fc-199411456274-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.520309 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.520237 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dshqj\" (UniqueName: \"kubernetes.io/projected/f0082a39-7641-463d-a7fc-199411456274-kube-api-access-dshqj\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.520309 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.520295 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f0082a39-7641-463d-a7fc-199411456274-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.520462 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.520346 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f0082a39-7641-463d-a7fc-199411456274-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.621123 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.621087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f0082a39-7641-463d-a7fc-199411456274-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.621280 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.621141 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f0082a39-7641-463d-a7fc-199411456274-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.621280 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.621182 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f0082a39-7641-463d-a7fc-199411456274-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.621280 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.621228 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0082a39-7641-463d-a7fc-199411456274-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.621413 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.621288 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f0082a39-7641-463d-a7fc-199411456274-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.621413 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.621340 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dshqj\" (UniqueName: \"kubernetes.io/projected/f0082a39-7641-463d-a7fc-199411456274-kube-api-access-dshqj\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.621719 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.621667 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f0082a39-7641-463d-a7fc-199411456274-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.621824 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.621722 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f0082a39-7641-463d-a7fc-199411456274-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.621879 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.621788 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f0082a39-7641-463d-a7fc-199411456274-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.623608 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.623583 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f0082a39-7641-463d-a7fc-199411456274-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.623846 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.623826 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f0082a39-7641-463d-a7fc-199411456274-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.628946 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.628923 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dshqj\" (UniqueName: \"kubernetes.io/projected/f0082a39-7641-463d-a7fc-199411456274-kube-api-access-dshqj\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn\" (UID: \"f0082a39-7641-463d-a7fc-199411456274\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.797656 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.797582 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"
Apr 20 16:35:28.923905 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:28.923863 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn"]
Apr 20 16:35:28.930606 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:35:28.930573 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0082a39_7641_463d_a7fc_199411456274.slice/crio-c186e9709a9f6dd2c62673b21d61d0cbea84deca63678f69b47aa577ea441380 WatchSource:0}: Error finding container c186e9709a9f6dd2c62673b21d61d0cbea84deca63678f69b47aa577ea441380: Status 404 returned error can't find the container with id c186e9709a9f6dd2c62673b21d61d0cbea84deca63678f69b47aa577ea441380
Apr 20 16:35:29.510600 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:29.510563 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn" event={"ID":"f0082a39-7641-463d-a7fc-199411456274","Type":"ContainerStarted","Data":"c186e9709a9f6dd2c62673b21d61d0cbea84deca63678f69b47aa577ea441380"}
Apr 20 16:35:36.536533 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:36.536496 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn" event={"ID":"f0082a39-7641-463d-a7fc-199411456274","Type":"ContainerStarted","Data":"ccb75ef7c23197d67944dc4b58078495996ac7a9f648fd6af984c265ab11d131"}
Apr 20 16:35:39.678696 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.678649 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9"]
Apr 20 16:35:39.682481 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.682462 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9"
Apr 20 16:35:39.685121 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.685102 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\""
Apr 20 16:35:39.692540 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.692511 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9"]
Apr 20 16:35:39.704511 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.704472 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/969797f2-5913-46e7-af16-654367d780b7-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9"
Apr 20 16:35:39.704646 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.704534 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/969797f2-5913-46e7-af16-654367d780b7-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9"
Apr 20 16:35:39.704646 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.704587 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/969797f2-5913-46e7-af16-654367d780b7-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9"
Apr 20 16:35:39.704794 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.704664 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvvpt\" (UniqueName: \"kubernetes.io/projected/969797f2-5913-46e7-af16-654367d780b7-kube-api-access-nvvpt\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9"
Apr 20 16:35:39.704794 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.704739 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/969797f2-5913-46e7-af16-654367d780b7-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9"
Apr 20 16:35:39.704794 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.704776 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/969797f2-5913-46e7-af16-654367d780b7-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9"
Apr 20 16:35:39.805448 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.805406 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvvpt\" (UniqueName: \"kubernetes.io/projected/969797f2-5913-46e7-af16-654367d780b7-kube-api-access-nvvpt\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9"
Apr 20 16:35:39.805643 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.805457 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/969797f2-5913-46e7-af16-654367d780b7-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9"
Apr 20 16:35:39.805643 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.805481 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/969797f2-5913-46e7-af16-654367d780b7-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9"
Apr 20 16:35:39.805643 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.805544 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/969797f2-5913-46e7-af16-654367d780b7-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9"
Apr 20 16:35:39.805643 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.805598 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/969797f2-5913-46e7-af16-654367d780b7-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9"
Apr 20 16:35:39.805643 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.805629 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/969797f2-5913-46e7-af16-654367d780b7-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9"
Apr 20 16:35:39.805992 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.805964 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/969797f2-5913-46e7-af16-654367d780b7-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9" Apr 20 16:35:39.806072 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.806047 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/969797f2-5913-46e7-af16-654367d780b7-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9" Apr 20 16:35:39.806153 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.806131 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/969797f2-5913-46e7-af16-654367d780b7-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9" Apr 20 16:35:39.808398 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.808370 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/969797f2-5913-46e7-af16-654367d780b7-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9" Apr 20 16:35:39.817160 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.817135 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvvpt\" (UniqueName: \"kubernetes.io/projected/969797f2-5913-46e7-af16-654367d780b7-kube-api-access-nvvpt\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9" Apr 20 
16:35:39.817411 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.817385 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/969797f2-5913-46e7-af16-654367d780b7-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9\" (UID: \"969797f2-5913-46e7-af16-654367d780b7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9" Apr 20 16:35:39.994426 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:39.994341 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9" Apr 20 16:35:40.137553 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:40.137523 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9"] Apr 20 16:35:40.140480 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:35:40.140436 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod969797f2_5913_46e7_af16_654367d780b7.slice/crio-c80d80e62b93659b82a8ee92068991a10732ff74ff60ef715a09e771abf82828 WatchSource:0}: Error finding container c80d80e62b93659b82a8ee92068991a10732ff74ff60ef715a09e771abf82828: Status 404 returned error can't find the container with id c80d80e62b93659b82a8ee92068991a10732ff74ff60ef715a09e771abf82828 Apr 20 16:35:40.553243 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:40.553201 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9" event={"ID":"969797f2-5913-46e7-af16-654367d780b7","Type":"ContainerStarted","Data":"06a5d71a6a37028426d5bc324d9b464f9c697c37d82857d53ec872cba1f6dc2b"} Apr 20 16:35:40.553243 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:40.553239 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9" 
event={"ID":"969797f2-5913-46e7-af16-654367d780b7","Type":"ContainerStarted","Data":"c80d80e62b93659b82a8ee92068991a10732ff74ff60ef715a09e771abf82828"} Apr 20 16:35:44.572202 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:44.572166 2577 generic.go:358] "Generic (PLEG): container finished" podID="f0082a39-7641-463d-a7fc-199411456274" containerID="ccb75ef7c23197d67944dc4b58078495996ac7a9f648fd6af984c265ab11d131" exitCode=0 Apr 20 16:35:44.572664 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:44.572229 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn" event={"ID":"f0082a39-7641-463d-a7fc-199411456274","Type":"ContainerDied","Data":"ccb75ef7c23197d67944dc4b58078495996ac7a9f648fd6af984c265ab11d131"} Apr 20 16:35:46.580947 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:46.580908 2577 generic.go:358] "Generic (PLEG): container finished" podID="969797f2-5913-46e7-af16-654367d780b7" containerID="06a5d71a6a37028426d5bc324d9b464f9c697c37d82857d53ec872cba1f6dc2b" exitCode=0 Apr 20 16:35:46.581360 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:46.580985 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9" event={"ID":"969797f2-5913-46e7-af16-654367d780b7","Type":"ContainerDied","Data":"06a5d71a6a37028426d5bc324d9b464f9c697c37d82857d53ec872cba1f6dc2b"} Apr 20 16:35:46.582985 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:46.582954 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn" event={"ID":"f0082a39-7641-463d-a7fc-199411456274","Type":"ContainerStarted","Data":"767111a90782c175694079669e20811607c4cc53b4e3dfec5915d46dbfc8cedb"} Apr 20 16:35:46.583249 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:46.583195 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn" Apr 20 
16:35:46.613346 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:46.613301 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn" podStartSLOduration=1.8511070090000001 podStartE2EDuration="18.613288057s" podCreationTimestamp="2026-04-20 16:35:28 +0000 UTC" firstStartedPulling="2026-04-20 16:35:28.932801628 +0000 UTC m=+733.512371622" lastFinishedPulling="2026-04-20 16:35:45.694982675 +0000 UTC m=+750.274552670" observedRunningTime="2026-04-20 16:35:46.61250959 +0000 UTC m=+751.192079608" watchObservedRunningTime="2026-04-20 16:35:46.613288057 +0000 UTC m=+751.192858051" Apr 20 16:35:47.589211 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:47.589178 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9" event={"ID":"969797f2-5913-46e7-af16-654367d780b7","Type":"ContainerStarted","Data":"7b1fbbef97f33e4112f2f094c03d97119538d7bc6ca4cf5a6bf0ba9538bae2e5"} Apr 20 16:35:47.589605 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:47.589459 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9" Apr 20 16:35:47.608649 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:47.608608 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9" podStartSLOduration=8.355809469 podStartE2EDuration="8.608596328s" podCreationTimestamp="2026-04-20 16:35:39 +0000 UTC" firstStartedPulling="2026-04-20 16:35:46.581778573 +0000 UTC m=+751.161348566" lastFinishedPulling="2026-04-20 16:35:46.834565428 +0000 UTC m=+751.414135425" observedRunningTime="2026-04-20 16:35:47.606193542 +0000 UTC m=+752.185763560" watchObservedRunningTime="2026-04-20 16:35:47.608596328 +0000 UTC m=+752.188166343" Apr 20 16:35:53.478182 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.478149 2577 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc"] Apr 20 16:35:53.525250 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.525222 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc"] Apr 20 16:35:53.525405 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.525342 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.527937 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.527914 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 20 16:35:53.639515 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.639482 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: \"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.639673 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.639523 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: \"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.639673 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.639548 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: 
\"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.639673 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.639633 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: \"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.639673 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.639664 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn9mh\" (UniqueName: \"kubernetes.io/projected/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-kube-api-access-pn9mh\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: \"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.639867 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.639754 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: \"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.741005 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.740917 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: \"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 
16:35:53.741169 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.741004 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: \"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.741169 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.741030 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: \"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.741169 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.741045 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: \"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.741169 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.741099 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: \"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.741169 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.741123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pn9mh\" (UniqueName: 
\"kubernetes.io/projected/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-kube-api-access-pn9mh\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: \"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.741424 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.741379 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: \"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.741482 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.741421 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: \"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.741482 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.741438 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: \"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.743489 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.743470 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: \"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 
16:35:53.743773 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.743755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: \"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.748806 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.748774 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn9mh\" (UniqueName: \"kubernetes.io/projected/a813eb2f-e6ce-40fe-8df8-f0ebf785afbb-kube-api-access-pn9mh\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc\" (UID: \"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.836175 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.836152 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:35:53.961497 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:53.961468 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc"] Apr 20 16:35:53.963792 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:35:53.963760 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda813eb2f_e6ce_40fe_8df8_f0ebf785afbb.slice/crio-82368a42ac1cfdb437c3e14f66e375af543ff2003afc4be767eba43665ad3365 WatchSource:0}: Error finding container 82368a42ac1cfdb437c3e14f66e375af543ff2003afc4be767eba43665ad3365: Status 404 returned error can't find the container with id 82368a42ac1cfdb437c3e14f66e375af543ff2003afc4be767eba43665ad3365 Apr 20 16:35:54.613553 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:54.613520 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" event={"ID":"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb","Type":"ContainerStarted","Data":"f692cdec14b4995d3a616fda0f39f9d48e4f5008dac57c5dcd864fe9d903515b"} Apr 20 16:35:54.613553 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:54.613560 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" event={"ID":"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb","Type":"ContainerStarted","Data":"82368a42ac1cfdb437c3e14f66e375af543ff2003afc4be767eba43665ad3365"} Apr 20 16:35:57.273410 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.273326 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x"] Apr 20 16:35:57.351995 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.351958 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x"] Apr 20 16:35:57.352149 ip-10-0-135-200 kubenswrapper[2577]: I0420 
16:35:57.352091 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.354650 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.354629 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 20 16:35:57.479185 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.479150 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vphh4\" (UniqueName: \"kubernetes.io/projected/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-kube-api-access-vphh4\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.479364 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.479227 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.479364 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.479259 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.479364 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.479278 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.479474 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.479363 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.479474 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.479383 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.580867 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.580790 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vphh4\" (UniqueName: \"kubernetes.io/projected/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-kube-api-access-vphh4\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.580867 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.580842 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.581082 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.580877 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.581082 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.580895 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.581082 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.580953 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.581082 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.580979 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.581420 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.581377 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.581420 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.581387 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.581603 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.581446 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.583425 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.583401 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.583670 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.583655 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.588967 ip-10-0-135-200 kubenswrapper[2577]: 
I0420 16:35:57.588939 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vphh4\" (UniqueName: \"kubernetes.io/projected/2a3bd77b-8e17-4737-8b3e-e63ef9b44d69-kube-api-access-vphh4\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-pqr2x\" (UID: \"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.603110 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.603087 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn" Apr 20 16:35:57.664826 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.664796 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:35:57.812650 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:57.812624 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x"] Apr 20 16:35:57.815762 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:35:57.815726 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a3bd77b_8e17_4737_8b3e_e63ef9b44d69.slice/crio-7fa5f5a106930d8f15bcc1fc17b6b6b481c106253931e38efa33f43951949d06 WatchSource:0}: Error finding container 7fa5f5a106930d8f15bcc1fc17b6b6b481c106253931e38efa33f43951949d06: Status 404 returned error can't find the container with id 7fa5f5a106930d8f15bcc1fc17b6b6b481c106253931e38efa33f43951949d06 Apr 20 16:35:58.606295 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:58.606265 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9" Apr 20 16:35:58.632946 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:58.632905 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" 
event={"ID":"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69","Type":"ContainerStarted","Data":"5cd72c2c2ef4c2f55c953e69c9128a8d23d3db1cf578ade804712b00b5dc7189"} Apr 20 16:35:58.632946 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:58.632953 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" event={"ID":"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69","Type":"ContainerStarted","Data":"7fa5f5a106930d8f15bcc1fc17b6b6b481c106253931e38efa33f43951949d06"} Apr 20 16:35:59.776276 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:59.776240 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r"] Apr 20 16:35:59.801024 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:59.800979 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r"] Apr 20 16:35:59.801198 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:59.801135 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:35:59.803887 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:59.803863 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 20 16:35:59.917811 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:59.917765 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7a93f366-72f5-4596-9633-9bc8ce8c6c54-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:35:59.917811 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:59.917813 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89xt6\" (UniqueName: \"kubernetes.io/projected/7a93f366-72f5-4596-9633-9bc8ce8c6c54-kube-api-access-89xt6\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:35:59.918040 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:59.917880 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7a93f366-72f5-4596-9633-9bc8ce8c6c54-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:35:59.918040 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:59.917955 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7a93f366-72f5-4596-9633-9bc8ce8c6c54-home\") pod 
\"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:35:59.918040 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:59.918004 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7a93f366-72f5-4596-9633-9bc8ce8c6c54-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:35:59.918040 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:35:59.918034 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a93f366-72f5-4596-9633-9bc8ce8c6c54-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:36:00.019345 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.019307 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7a93f366-72f5-4596-9633-9bc8ce8c6c54-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:36:00.019345 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.019344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7a93f366-72f5-4596-9633-9bc8ce8c6c54-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:36:00.019589 ip-10-0-135-200 
kubenswrapper[2577]: I0420 16:36:00.019364 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a93f366-72f5-4596-9633-9bc8ce8c6c54-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:36:00.019589 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.019428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7a93f366-72f5-4596-9633-9bc8ce8c6c54-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:36:00.019589 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.019449 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89xt6\" (UniqueName: \"kubernetes.io/projected/7a93f366-72f5-4596-9633-9bc8ce8c6c54-kube-api-access-89xt6\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:36:00.019781 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.019615 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7a93f366-72f5-4596-9633-9bc8ce8c6c54-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:36:00.019882 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.019779 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7a93f366-72f5-4596-9633-9bc8ce8c6c54-home\") pod 
\"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:36:00.019882 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.019811 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7a93f366-72f5-4596-9633-9bc8ce8c6c54-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:36:00.019952 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.019895 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a93f366-72f5-4596-9633-9bc8ce8c6c54-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:36:00.021958 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.021933 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7a93f366-72f5-4596-9633-9bc8ce8c6c54-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:36:00.022404 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.022383 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7a93f366-72f5-4596-9633-9bc8ce8c6c54-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:36:00.027912 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.027886 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-89xt6\" (UniqueName: \"kubernetes.io/projected/7a93f366-72f5-4596-9633-9bc8ce8c6c54-kube-api-access-89xt6\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r\" (UID: \"7a93f366-72f5-4596-9633-9bc8ce8c6c54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:36:00.112690 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.112635 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:36:00.182245 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.180459 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm"] Apr 20 16:36:00.209499 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.209408 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.209957 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.209247 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm"] Apr 20 16:36:00.212602 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.212580 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 20 16:36:00.261377 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.261346 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r"] Apr 20 16:36:00.264599 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:36:00.264567 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a93f366_72f5_4596_9633_9bc8ce8c6c54.slice/crio-fcfadcf874293912f84c4864ff631506be02121ce5689b0454e6af0e075f9a2e WatchSource:0}: Error finding 
container fcfadcf874293912f84c4864ff631506be02121ce5689b0454e6af0e075f9a2e: Status 404 returned error can't find the container with id fcfadcf874293912f84c4864ff631506be02121ce5689b0454e6af0e075f9a2e Apr 20 16:36:00.323572 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.323545 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqfs8\" (UniqueName: \"kubernetes.io/projected/6b5d78ba-c775-4925-b774-200eb77c8093-kube-api-access-sqfs8\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.323702 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.323616 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b5d78ba-c775-4925-b774-200eb77c8093-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.323702 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.323650 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5d78ba-c775-4925-b774-200eb77c8093-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.323702 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.323669 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b5d78ba-c775-4925-b774-200eb77c8093-dshm\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.323812 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.323711 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b5d78ba-c775-4925-b774-200eb77c8093-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.323812 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.323756 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b5d78ba-c775-4925-b774-200eb77c8093-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.424402 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.424321 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqfs8\" (UniqueName: \"kubernetes.io/projected/6b5d78ba-c775-4925-b774-200eb77c8093-kube-api-access-sqfs8\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.424552 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.424444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b5d78ba-c775-4925-b774-200eb77c8093-home\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.424552 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.424479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5d78ba-c775-4925-b774-200eb77c8093-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.424552 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.424512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b5d78ba-c775-4925-b774-200eb77c8093-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.424552 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.424541 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b5d78ba-c775-4925-b774-200eb77c8093-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.424791 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.424573 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b5d78ba-c775-4925-b774-200eb77c8093-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.424985 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.424959 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b5d78ba-c775-4925-b774-200eb77c8093-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.425079 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.424988 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b5d78ba-c775-4925-b774-200eb77c8093-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.425143 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.425076 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b5d78ba-c775-4925-b774-200eb77c8093-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.426899 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.426877 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b5d78ba-c775-4925-b774-200eb77c8093-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.427151 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.427136 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b5d78ba-c775-4925-b774-200eb77c8093-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.432915 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.432887 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqfs8\" (UniqueName: \"kubernetes.io/projected/6b5d78ba-c775-4925-b774-200eb77c8093-kube-api-access-sqfs8\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm\" (UID: \"6b5d78ba-c775-4925-b774-200eb77c8093\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.522821 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.522788 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:00.642607 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.642569 2577 generic.go:358] "Generic (PLEG): container finished" podID="a813eb2f-e6ce-40fe-8df8-f0ebf785afbb" containerID="f692cdec14b4995d3a616fda0f39f9d48e4f5008dac57c5dcd864fe9d903515b" exitCode=0 Apr 20 16:36:00.642811 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.642640 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" event={"ID":"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb","Type":"ContainerDied","Data":"f692cdec14b4995d3a616fda0f39f9d48e4f5008dac57c5dcd864fe9d903515b"} Apr 20 16:36:00.645109 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.645062 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" 
event={"ID":"7a93f366-72f5-4596-9633-9bc8ce8c6c54","Type":"ContainerStarted","Data":"e8359627743639f3a6510561951ce3f503d40ce55006d1c7bc03e745f180e5bd"} Apr 20 16:36:00.645109 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.645094 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" event={"ID":"7a93f366-72f5-4596-9633-9bc8ce8c6c54","Type":"ContainerStarted","Data":"fcfadcf874293912f84c4864ff631506be02121ce5689b0454e6af0e075f9a2e"} Apr 20 16:36:00.672060 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:00.672003 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm"] Apr 20 16:36:00.676726 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:36:00.676652 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b5d78ba_c775_4925_b774_200eb77c8093.slice/crio-ebd65f50cbbf6f0978c595f8949378b842e6fbbe47d5e9e62238bdf51e2a0585 WatchSource:0}: Error finding container ebd65f50cbbf6f0978c595f8949378b842e6fbbe47d5e9e62238bdf51e2a0585: Status 404 returned error can't find the container with id ebd65f50cbbf6f0978c595f8949378b842e6fbbe47d5e9e62238bdf51e2a0585 Apr 20 16:36:01.655238 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:01.654358 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" event={"ID":"6b5d78ba-c775-4925-b774-200eb77c8093","Type":"ContainerStarted","Data":"30a4148c953af9ed514e25eaca63e5d0d7bb1c0d1ce56904bfe01b27bfd74e18"} Apr 20 16:36:01.655238 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:01.654401 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" event={"ID":"6b5d78ba-c775-4925-b774-200eb77c8093","Type":"ContainerStarted","Data":"ebd65f50cbbf6f0978c595f8949378b842e6fbbe47d5e9e62238bdf51e2a0585"} Apr 
20 16:36:01.658337 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:01.658308 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" event={"ID":"a813eb2f-e6ce-40fe-8df8-f0ebf785afbb","Type":"ContainerStarted","Data":"60982baa16b79f59240808524cbbfa5df403a46c25879ddec854f57e4710f440"} Apr 20 16:36:01.662874 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:01.662847 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:36:01.693810 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:01.693740 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" podStartSLOduration=8.242892551 podStartE2EDuration="8.693718104s" podCreationTimestamp="2026-04-20 16:35:53 +0000 UTC" firstStartedPulling="2026-04-20 16:36:00.643479634 +0000 UTC m=+765.223049626" lastFinishedPulling="2026-04-20 16:36:01.094305165 +0000 UTC m=+765.673875179" observedRunningTime="2026-04-20 16:36:01.693074206 +0000 UTC m=+766.272644222" watchObservedRunningTime="2026-04-20 16:36:01.693718104 +0000 UTC m=+766.273288120" Apr 20 16:36:04.675560 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:04.675526 2577 generic.go:358] "Generic (PLEG): container finished" podID="2a3bd77b-8e17-4737-8b3e-e63ef9b44d69" containerID="5cd72c2c2ef4c2f55c953e69c9128a8d23d3db1cf578ade804712b00b5dc7189" exitCode=0 Apr 20 16:36:04.675961 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:04.675619 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" event={"ID":"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69","Type":"ContainerDied","Data":"5cd72c2c2ef4c2f55c953e69c9128a8d23d3db1cf578ade804712b00b5dc7189"} Apr 20 16:36:05.681112 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:05.681077 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" event={"ID":"2a3bd77b-8e17-4737-8b3e-e63ef9b44d69","Type":"ContainerStarted","Data":"f003629873076ddbc43db906c6c66e22a085fd04cb8f325f52fa049e325e3b6c"} Apr 20 16:36:05.681500 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:05.681285 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:36:05.699648 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:05.699606 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" podStartSLOduration=8.335619753 podStartE2EDuration="8.699592349s" podCreationTimestamp="2026-04-20 16:35:57 +0000 UTC" firstStartedPulling="2026-04-20 16:36:04.676534843 +0000 UTC m=+769.256104839" lastFinishedPulling="2026-04-20 16:36:05.040507436 +0000 UTC m=+769.620077435" observedRunningTime="2026-04-20 16:36:05.696865128 +0000 UTC m=+770.276435143" watchObservedRunningTime="2026-04-20 16:36:05.699592349 +0000 UTC m=+770.279162363" Apr 20 16:36:06.686248 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:06.686203 2577 generic.go:358] "Generic (PLEG): container finished" podID="7a93f366-72f5-4596-9633-9bc8ce8c6c54" containerID="e8359627743639f3a6510561951ce3f503d40ce55006d1c7bc03e745f180e5bd" exitCode=0 Apr 20 16:36:06.686653 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:06.686283 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" event={"ID":"7a93f366-72f5-4596-9633-9bc8ce8c6c54","Type":"ContainerDied","Data":"e8359627743639f3a6510561951ce3f503d40ce55006d1c7bc03e745f180e5bd"} Apr 20 16:36:06.687746 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:06.687725 2577 generic.go:358] "Generic (PLEG): container finished" podID="6b5d78ba-c775-4925-b774-200eb77c8093" containerID="30a4148c953af9ed514e25eaca63e5d0d7bb1c0d1ce56904bfe01b27bfd74e18" exitCode=0 Apr 20 16:36:06.687829 
ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:06.687804 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" event={"ID":"6b5d78ba-c775-4925-b774-200eb77c8093","Type":"ContainerDied","Data":"30a4148c953af9ed514e25eaca63e5d0d7bb1c0d1ce56904bfe01b27bfd74e18"} Apr 20 16:36:07.696020 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:07.695985 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" event={"ID":"6b5d78ba-c775-4925-b774-200eb77c8093","Type":"ContainerStarted","Data":"b94b255d040df2a70efdeb2620fb3cfc24a346aa06e8c2637c174b231aa54e2b"} Apr 20 16:36:07.696395 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:07.696208 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:07.697571 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:07.697551 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" event={"ID":"7a93f366-72f5-4596-9633-9bc8ce8c6c54","Type":"ContainerStarted","Data":"17976036623f21999fefa59eeb432be0338ceaa736c1bdd6140e3b99bd8d5bbb"} Apr 20 16:36:07.697751 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:07.697737 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:36:07.714583 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:07.714540 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" podStartSLOduration=7.498650156 podStartE2EDuration="7.714529177s" podCreationTimestamp="2026-04-20 16:36:00 +0000 UTC" firstStartedPulling="2026-04-20 16:36:06.688444119 +0000 UTC m=+771.268014114" lastFinishedPulling="2026-04-20 16:36:06.904323142 +0000 UTC 
m=+771.483893135" observedRunningTime="2026-04-20 16:36:07.712396351 +0000 UTC m=+772.291966365" watchObservedRunningTime="2026-04-20 16:36:07.714529177 +0000 UTC m=+772.294099192" Apr 20 16:36:07.729979 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:07.729944 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" podStartSLOduration=8.501086103 podStartE2EDuration="8.729932315s" podCreationTimestamp="2026-04-20 16:35:59 +0000 UTC" firstStartedPulling="2026-04-20 16:36:06.687077514 +0000 UTC m=+771.266647509" lastFinishedPulling="2026-04-20 16:36:06.915923715 +0000 UTC m=+771.495493721" observedRunningTime="2026-04-20 16:36:07.727883647 +0000 UTC m=+772.307453673" watchObservedRunningTime="2026-04-20 16:36:07.729932315 +0000 UTC m=+772.309502331" Apr 20 16:36:12.675728 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:12.675666 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc" Apr 20 16:36:16.701024 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:16.700995 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-pqr2x" Apr 20 16:36:18.714549 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:18.714520 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm" Apr 20 16:36:18.715362 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:18.715342 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r" Apr 20 16:36:55.612192 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:55.612150 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-dbbb77764-zcs62"] Apr 20 16:36:55.612752 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:55.612368 2577 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-dbbb77764-zcs62" podUID="4e6493f5-fe1e-4aab-8b49-866aea075e8c" containerName="authorino" containerID="cri-o://87d1ac81429bb92686c059f39db36be894cf4f96e31832a61c92331d9440db6f" gracePeriod=30 Apr 20 16:36:55.870337 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:55.870283 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-dbbb77764-zcs62" Apr 20 16:36:55.885163 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:55.885138 2577 generic.go:358] "Generic (PLEG): container finished" podID="4e6493f5-fe1e-4aab-8b49-866aea075e8c" containerID="87d1ac81429bb92686c059f39db36be894cf4f96e31832a61c92331d9440db6f" exitCode=0 Apr 20 16:36:55.885285 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:55.885192 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-dbbb77764-zcs62" Apr 20 16:36:55.885285 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:55.885227 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-dbbb77764-zcs62" event={"ID":"4e6493f5-fe1e-4aab-8b49-866aea075e8c","Type":"ContainerDied","Data":"87d1ac81429bb92686c059f39db36be894cf4f96e31832a61c92331d9440db6f"} Apr 20 16:36:55.885285 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:55.885262 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-dbbb77764-zcs62" event={"ID":"4e6493f5-fe1e-4aab-8b49-866aea075e8c","Type":"ContainerDied","Data":"ab24619b3f5147b8e92d682795552cff5fcf39c610c69b44607d695c4f8c20c0"} Apr 20 16:36:55.885285 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:55.885283 2577 scope.go:117] "RemoveContainer" containerID="87d1ac81429bb92686c059f39db36be894cf4f96e31832a61c92331d9440db6f" Apr 20 16:36:55.895478 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:55.895455 2577 scope.go:117] "RemoveContainer" 
containerID="87d1ac81429bb92686c059f39db36be894cf4f96e31832a61c92331d9440db6f"
Apr 20 16:36:55.895790 ip-10-0-135-200 kubenswrapper[2577]: E0420 16:36:55.895765 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87d1ac81429bb92686c059f39db36be894cf4f96e31832a61c92331d9440db6f\": container with ID starting with 87d1ac81429bb92686c059f39db36be894cf4f96e31832a61c92331d9440db6f not found: ID does not exist" containerID="87d1ac81429bb92686c059f39db36be894cf4f96e31832a61c92331d9440db6f"
Apr 20 16:36:55.895885 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:55.895802 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87d1ac81429bb92686c059f39db36be894cf4f96e31832a61c92331d9440db6f"} err="failed to get container status \"87d1ac81429bb92686c059f39db36be894cf4f96e31832a61c92331d9440db6f\": rpc error: code = NotFound desc = could not find container \"87d1ac81429bb92686c059f39db36be894cf4f96e31832a61c92331d9440db6f\": container with ID starting with 87d1ac81429bb92686c059f39db36be894cf4f96e31832a61c92331d9440db6f not found: ID does not exist"
Apr 20 16:36:55.970715 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:55.970673 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/4e6493f5-fe1e-4aab-8b49-866aea075e8c-oidc-ca\") pod \"4e6493f5-fe1e-4aab-8b49-866aea075e8c\" (UID: \"4e6493f5-fe1e-4aab-8b49-866aea075e8c\") "
Apr 20 16:36:55.970866 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:55.970741 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrch5\" (UniqueName: \"kubernetes.io/projected/4e6493f5-fe1e-4aab-8b49-866aea075e8c-kube-api-access-hrch5\") pod \"4e6493f5-fe1e-4aab-8b49-866aea075e8c\" (UID: \"4e6493f5-fe1e-4aab-8b49-866aea075e8c\") "
Apr 20 16:36:55.970866 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:55.970780 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/4e6493f5-fe1e-4aab-8b49-866aea075e8c-tls-cert\") pod \"4e6493f5-fe1e-4aab-8b49-866aea075e8c\" (UID: \"4e6493f5-fe1e-4aab-8b49-866aea075e8c\") "
Apr 20 16:36:55.972853 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:55.972828 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e6493f5-fe1e-4aab-8b49-866aea075e8c-kube-api-access-hrch5" (OuterVolumeSpecName: "kube-api-access-hrch5") pod "4e6493f5-fe1e-4aab-8b49-866aea075e8c" (UID: "4e6493f5-fe1e-4aab-8b49-866aea075e8c"). InnerVolumeSpecName "kube-api-access-hrch5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:36:55.975889 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:55.975866 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e6493f5-fe1e-4aab-8b49-866aea075e8c-oidc-ca" (OuterVolumeSpecName: "oidc-ca") pod "4e6493f5-fe1e-4aab-8b49-866aea075e8c" (UID: "4e6493f5-fe1e-4aab-8b49-866aea075e8c"). InnerVolumeSpecName "oidc-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 16:36:55.983088 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:55.983068 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e6493f5-fe1e-4aab-8b49-866aea075e8c-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "4e6493f5-fe1e-4aab-8b49-866aea075e8c" (UID: "4e6493f5-fe1e-4aab-8b49-866aea075e8c"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 16:36:56.071471 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:56.071440 2577 reconciler_common.go:299] "Volume detached for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/4e6493f5-fe1e-4aab-8b49-866aea075e8c-oidc-ca\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:36:56.071471 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:56.071467 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hrch5\" (UniqueName: \"kubernetes.io/projected/4e6493f5-fe1e-4aab-8b49-866aea075e8c-kube-api-access-hrch5\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:36:56.071623 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:56.071477 2577 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/4e6493f5-fe1e-4aab-8b49-866aea075e8c-tls-cert\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:36:56.203092 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:56.203063 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-dbbb77764-zcs62"]
Apr 20 16:36:56.208071 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:56.208050 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-dbbb77764-zcs62"]
Apr 20 16:36:58.008851 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:36:58.008815 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e6493f5-fe1e-4aab-8b49-866aea075e8c" path="/var/lib/kubelet/pods/4e6493f5-fe1e-4aab-8b49-866aea075e8c/volumes"
Apr 20 16:37:23.748921 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:23.748883 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["cert-manager/cert-manager-759f64656b-nxx69"]
Apr 20 16:37:23.749368 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:23.749109 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="cert-manager/cert-manager-759f64656b-nxx69" podUID="927af189-fcc0-4edf-8771-e6af9efdee29" containerName="cert-manager-controller" containerID="cri-o://49e9cc2bb285ce9a7d8aaeb9fcad5f0590d85118bf71d7f9d128d45d5df6033e" gracePeriod=30
Apr 20 16:37:23.988874 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:23.988841 2577 generic.go:358] "Generic (PLEG): container finished" podID="927af189-fcc0-4edf-8771-e6af9efdee29" containerID="49e9cc2bb285ce9a7d8aaeb9fcad5f0590d85118bf71d7f9d128d45d5df6033e" exitCode=0
Apr 20 16:37:23.988989 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:23.988893 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-nxx69" event={"ID":"927af189-fcc0-4edf-8771-e6af9efdee29","Type":"ContainerDied","Data":"49e9cc2bb285ce9a7d8aaeb9fcad5f0590d85118bf71d7f9d128d45d5df6033e"}
Apr 20 16:37:23.988989 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:23.988915 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-nxx69" event={"ID":"927af189-fcc0-4edf-8771-e6af9efdee29","Type":"ContainerDied","Data":"68cb5f7c9e51e4903a0209a790ddb5e21b6cc4cb1adff6a05734555c9e4e8676"}
Apr 20 16:37:23.988989 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:23.988925 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68cb5f7c9e51e4903a0209a790ddb5e21b6cc4cb1adff6a05734555c9e4e8676"
Apr 20 16:37:23.998378 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:23.998358 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-nxx69"
Apr 20 16:37:24.142715 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:24.142653 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/927af189-fcc0-4edf-8771-e6af9efdee29-bound-sa-token\") pod \"927af189-fcc0-4edf-8771-e6af9efdee29\" (UID: \"927af189-fcc0-4edf-8771-e6af9efdee29\") "
Apr 20 16:37:24.142881 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:24.142759 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8swjt\" (UniqueName: \"kubernetes.io/projected/927af189-fcc0-4edf-8771-e6af9efdee29-kube-api-access-8swjt\") pod \"927af189-fcc0-4edf-8771-e6af9efdee29\" (UID: \"927af189-fcc0-4edf-8771-e6af9efdee29\") "
Apr 20 16:37:24.145112 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:24.145081 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/927af189-fcc0-4edf-8771-e6af9efdee29-kube-api-access-8swjt" (OuterVolumeSpecName: "kube-api-access-8swjt") pod "927af189-fcc0-4edf-8771-e6af9efdee29" (UID: "927af189-fcc0-4edf-8771-e6af9efdee29"). InnerVolumeSpecName "kube-api-access-8swjt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:37:24.145229 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:24.145125 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/927af189-fcc0-4edf-8771-e6af9efdee29-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "927af189-fcc0-4edf-8771-e6af9efdee29" (UID: "927af189-fcc0-4edf-8771-e6af9efdee29"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:37:24.243574 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:24.243544 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/927af189-fcc0-4edf-8771-e6af9efdee29-bound-sa-token\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:37:24.243574 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:24.243568 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8swjt\" (UniqueName: \"kubernetes.io/projected/927af189-fcc0-4edf-8771-e6af9efdee29-kube-api-access-8swjt\") on node \"ip-10-0-135-200.ec2.internal\" DevicePath \"\""
Apr 20 16:37:24.992688 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:24.992658 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-nxx69"
Apr 20 16:37:25.013869 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:25.013838 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["cert-manager/cert-manager-759f64656b-nxx69"]
Apr 20 16:37:25.017364 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:25.017338 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["cert-manager/cert-manager-759f64656b-nxx69"]
Apr 20 16:37:26.015625 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:26.015592 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="927af189-fcc0-4edf-8771-e6af9efdee29" path="/var/lib/kubelet/pods/927af189-fcc0-4edf-8771-e6af9efdee29/volumes"
Apr 20 16:37:39.009964 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:39.009936 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-856df6f54f-zjt6b_bb074fbc-b45f-40f9-884a-d3a0131b2ac4/manager/0.log"
Apr 20 16:37:39.488467 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:39.488431 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-59c64b9875-zpqll_4602079f-9989-439c-b82f-627276b14013/manager/0.log"
Apr 20 16:37:41.142405 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:41.142375 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-4sjtv_f0a39ad3-9c1c-45dc-a8f1-23acbd821727/manager/0.log"
Apr 20 16:37:41.383023 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:41.382986 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-tfs6s_518a24e2-456f-4d66-af29-2e963b6537d8/registry-server/0.log"
Apr 20 16:37:41.521055 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:41.520977 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-gmj9k_cea376d4-3d3c-47b7-bfd4-385ea615ab95/manager/0.log"
Apr 20 16:37:41.755602 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:41.755570 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-8hwh4_96fdc2c6-3908-4bdd-99db-25e367a265ec/manager/0.log"
Apr 20 16:37:42.107795 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:42.107766 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg_b09bbc22-938c-4dab-af76-b53bae73362a/istio-proxy/0.log"
Apr 20 16:37:43.056288 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:43.056232 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r_7a93f366-72f5-4596-9633-9bc8ce8c6c54/storage-initializer/0.log"
Apr 20 16:37:43.064798 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:43.064775 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-sf45r_7a93f366-72f5-4596-9633-9bc8ce8c6c54/main/0.log"
Apr 20 16:37:43.177254 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:43.177222 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9_969797f2-5913-46e7-af16-654367d780b7/storage-initializer/0.log"
Apr 20 16:37:43.185092 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:43.185066 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-bkvq9_969797f2-5913-46e7-af16-654367d780b7/main/0.log"
Apr 20 16:37:43.301877 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:43.301849 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-pqr2x_2a3bd77b-8e17-4737-8b3e-e63ef9b44d69/storage-initializer/0.log"
Apr 20 16:37:43.313258 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:43.313186 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-pqr2x_2a3bd77b-8e17-4737-8b3e-e63ef9b44d69/main/0.log"
Apr 20 16:37:43.430497 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:43.430473 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm_6b5d78ba-c775-4925-b774-200eb77c8093/main/0.log"
Apr 20 16:37:43.439010 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:43.438975 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcchxzrm_6b5d78ba-c775-4925-b774-200eb77c8093/storage-initializer/0.log"
Apr 20 16:37:43.555531 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:43.555486 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc_a813eb2f-e6ce-40fe-8df8-f0ebf785afbb/storage-initializer/0.log"
Apr 20 16:37:43.565636 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:43.565567 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-k4cfc_a813eb2f-e6ce-40fe-8df8-f0ebf785afbb/main/0.log"
Apr 20 16:37:43.683724 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:43.683695 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn_f0082a39-7641-463d-a7fc-199411456274/storage-initializer/0.log"
Apr 20 16:37:43.691413 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:43.691390 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-c5mrn_f0082a39-7641-463d-a7fc-199411456274/main/0.log"
Apr 20 16:37:50.393916 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:50.393890 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9cnb7_99c0543c-05e3-470f-a780-aa8b7b3fca39/global-pull-secret-syncer/0.log"
Apr 20 16:37:50.483610 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:50.483577 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-62c6k_fc3d9e83-ce35-494d-b14c-0fe862e9fe53/konnectivity-agent/0.log"
Apr 20 16:37:50.583147 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:50.583107 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-200.ec2.internal_dd0fda5d43e36cf7513644c2d57d50e3/haproxy/0.log"
Apr 20 16:37:55.039225 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:55.039187 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-4sjtv_f0a39ad3-9c1c-45dc-a8f1-23acbd821727/manager/0.log"
Apr 20 16:37:55.103794 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:55.103363 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-tfs6s_518a24e2-456f-4d66-af29-2e963b6537d8/registry-server/0.log"
Apr 20 16:37:55.150475 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:55.150447 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-gmj9k_cea376d4-3d3c-47b7-bfd4-385ea615ab95/manager/0.log"
Apr 20 16:37:55.204983 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:55.204945 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-8hwh4_96fdc2c6-3908-4bdd-99db-25e367a265ec/manager/0.log"
Apr 20 16:37:56.949248 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:56.949217 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-sn85c_20bbc6cc-406c-47b9-b3dc-02361db2b16e/monitoring-plugin/0.log"
Apr 20 16:37:57.151191 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.151164 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nbjkl_7a5945ad-40e0-4bf0-9712-79d17a1c8d00/node-exporter/0.log"
Apr 20 16:37:57.173056 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.173028 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nbjkl_7a5945ad-40e0-4bf0-9712-79d17a1c8d00/kube-rbac-proxy/0.log"
Apr 20 16:37:57.195252 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.195230 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nbjkl_7a5945ad-40e0-4bf0-9712-79d17a1c8d00/init-textfile/0.log"
Apr 20 16:37:57.235447 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.235418 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-g4x2w_1fdf2c2e-403d-4950-aec2-06e63346304a/kube-rbac-proxy-main/0.log"
Apr 20 16:37:57.259825 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.259806 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-g4x2w_1fdf2c2e-403d-4950-aec2-06e63346304a/kube-rbac-proxy-self/0.log"
Apr 20 16:37:57.281505 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.281472 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-g4x2w_1fdf2c2e-403d-4950-aec2-06e63346304a/openshift-state-metrics/0.log"
Apr 20 16:37:57.318886 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.318862 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f7b4c044-a14a-40b8-ba06-13128cd7878a/prometheus/0.log"
Apr 20 16:37:57.342361 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.342342 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f7b4c044-a14a-40b8-ba06-13128cd7878a/config-reloader/0.log"
Apr 20 16:37:57.370062 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.370043 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f7b4c044-a14a-40b8-ba06-13128cd7878a/thanos-sidecar/0.log"
Apr 20 16:37:57.397103 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.397074 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f7b4c044-a14a-40b8-ba06-13128cd7878a/kube-rbac-proxy-web/0.log"
Apr 20 16:37:57.419212 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.419155 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f7b4c044-a14a-40b8-ba06-13128cd7878a/kube-rbac-proxy/0.log"
Apr 20 16:37:57.443199 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.443181 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f7b4c044-a14a-40b8-ba06-13128cd7878a/kube-rbac-proxy-thanos/0.log"
Apr 20 16:37:57.464906 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.464886 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f7b4c044-a14a-40b8-ba06-13128cd7878a/init-config-reloader/0.log"
Apr 20 16:37:57.652134 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.652105 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7568dc77f9-wmsz7_a6118926-0076-4fd9-aa1f-aa052f9810d1/thanos-query/0.log"
Apr 20 16:37:57.673240 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.673165 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7568dc77f9-wmsz7_a6118926-0076-4fd9-aa1f-aa052f9810d1/kube-rbac-proxy-web/0.log"
Apr 20 16:37:57.695411 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.695391 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7568dc77f9-wmsz7_a6118926-0076-4fd9-aa1f-aa052f9810d1/kube-rbac-proxy/0.log"
Apr 20 16:37:57.717099 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.717082 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7568dc77f9-wmsz7_a6118926-0076-4fd9-aa1f-aa052f9810d1/prom-label-proxy/0.log"
Apr 20 16:37:57.741625 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.741603 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7568dc77f9-wmsz7_a6118926-0076-4fd9-aa1f-aa052f9810d1/kube-rbac-proxy-rules/0.log"
Apr 20 16:37:57.766604 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:57.766583 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7568dc77f9-wmsz7_a6118926-0076-4fd9-aa1f-aa052f9810d1/kube-rbac-proxy-metrics/0.log"
Apr 20 16:37:58.741425 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:58.741395 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-m7xrq_bc489140-89cc-484b-853e-17ce8319d94f/networking-console-plugin/0.log"
Apr 20 16:37:59.107435 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.107355 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"]
Apr 20 16:37:59.107796 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.107783 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="927af189-fcc0-4edf-8771-e6af9efdee29" containerName="cert-manager-controller"
Apr 20 16:37:59.107847 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.107799 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="927af189-fcc0-4edf-8771-e6af9efdee29" containerName="cert-manager-controller"
Apr 20 16:37:59.107847 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.107816 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e6493f5-fe1e-4aab-8b49-866aea075e8c" containerName="authorino"
Apr 20 16:37:59.107847 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.107822 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6493f5-fe1e-4aab-8b49-866aea075e8c" containerName="authorino"
Apr 20 16:37:59.107938 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.107898 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e6493f5-fe1e-4aab-8b49-866aea075e8c" containerName="authorino"
Apr 20 16:37:59.107938 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.107907 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="927af189-fcc0-4edf-8771-e6af9efdee29" containerName="cert-manager-controller"
Apr 20 16:37:59.111271 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.111255 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:37:59.113532 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.113502 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wg92x\"/\"openshift-service-ca.crt\""
Apr 20 16:37:59.113801 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.113787 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wg92x\"/\"kube-root-ca.crt\""
Apr 20 16:37:59.114840 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.114824 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wg92x\"/\"default-dockercfg-9rm4f\""
Apr 20 16:37:59.119840 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.119816 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"]
Apr 20 16:37:59.262993 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.262961 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5842a551-c34f-4be2-8413-7ffb834da459-proc\") pod \"perf-node-gather-daemonset-hgmcj\" (UID: \"5842a551-c34f-4be2-8413-7ffb834da459\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:37:59.263139 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.263009 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5842a551-c34f-4be2-8413-7ffb834da459-lib-modules\") pod \"perf-node-gather-daemonset-hgmcj\" (UID: \"5842a551-c34f-4be2-8413-7ffb834da459\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:37:59.263139 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.263055 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfq9l\" (UniqueName: \"kubernetes.io/projected/5842a551-c34f-4be2-8413-7ffb834da459-kube-api-access-pfq9l\") pod \"perf-node-gather-daemonset-hgmcj\" (UID: \"5842a551-c34f-4be2-8413-7ffb834da459\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:37:59.263139 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.263118 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5842a551-c34f-4be2-8413-7ffb834da459-podres\") pod \"perf-node-gather-daemonset-hgmcj\" (UID: \"5842a551-c34f-4be2-8413-7ffb834da459\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:37:59.263247 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.263175 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5842a551-c34f-4be2-8413-7ffb834da459-sys\") pod \"perf-node-gather-daemonset-hgmcj\" (UID: \"5842a551-c34f-4be2-8413-7ffb834da459\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:37:59.290131 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.290099 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xv6j6_1ddd4906-e010-4e9e-89d5-6017138ff6a9/console-operator/1.log"
Apr 20 16:37:59.299747 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.299724 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xv6j6_1ddd4906-e010-4e9e-89d5-6017138ff6a9/console-operator/2.log"
Apr 20 16:37:59.364415 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.364343 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5842a551-c34f-4be2-8413-7ffb834da459-proc\") pod \"perf-node-gather-daemonset-hgmcj\" (UID: \"5842a551-c34f-4be2-8413-7ffb834da459\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:37:59.364415 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.364390 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5842a551-c34f-4be2-8413-7ffb834da459-lib-modules\") pod \"perf-node-gather-daemonset-hgmcj\" (UID: \"5842a551-c34f-4be2-8413-7ffb834da459\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:37:59.364625 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.364422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pfq9l\" (UniqueName: \"kubernetes.io/projected/5842a551-c34f-4be2-8413-7ffb834da459-kube-api-access-pfq9l\") pod \"perf-node-gather-daemonset-hgmcj\" (UID: \"5842a551-c34f-4be2-8413-7ffb834da459\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:37:59.364625 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.364440 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5842a551-c34f-4be2-8413-7ffb834da459-podres\") pod \"perf-node-gather-daemonset-hgmcj\" (UID: \"5842a551-c34f-4be2-8413-7ffb834da459\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:37:59.364625 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.364464 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5842a551-c34f-4be2-8413-7ffb834da459-sys\") pod \"perf-node-gather-daemonset-hgmcj\" (UID: \"5842a551-c34f-4be2-8413-7ffb834da459\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:37:59.364625 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.364472 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5842a551-c34f-4be2-8413-7ffb834da459-proc\") pod \"perf-node-gather-daemonset-hgmcj\" (UID: \"5842a551-c34f-4be2-8413-7ffb834da459\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:37:59.364625 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.364595 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5842a551-c34f-4be2-8413-7ffb834da459-lib-modules\") pod \"perf-node-gather-daemonset-hgmcj\" (UID: \"5842a551-c34f-4be2-8413-7ffb834da459\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:37:59.364625 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.364610 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5842a551-c34f-4be2-8413-7ffb834da459-podres\") pod \"perf-node-gather-daemonset-hgmcj\" (UID: \"5842a551-c34f-4be2-8413-7ffb834da459\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:37:59.364866 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.364597 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5842a551-c34f-4be2-8413-7ffb834da459-sys\") pod \"perf-node-gather-daemonset-hgmcj\" (UID: \"5842a551-c34f-4be2-8413-7ffb834da459\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:37:59.372945 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.372923 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfq9l\" (UniqueName: \"kubernetes.io/projected/5842a551-c34f-4be2-8413-7ffb834da459-kube-api-access-pfq9l\") pod \"perf-node-gather-daemonset-hgmcj\" (UID: \"5842a551-c34f-4be2-8413-7ffb834da459\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:37:59.441061 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.441040 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:37:59.592522 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.592495 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"]
Apr 20 16:37:59.595153 ip-10-0-135-200 kubenswrapper[2577]: W0420 16:37:59.595125 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5842a551_c34f_4be2_8413_7ffb834da459.slice/crio-0cfc0e995dd984c3eb2c3fbeaa972d07c998b315685cc81bb8ab829c090e2c0d WatchSource:0}: Error finding container 0cfc0e995dd984c3eb2c3fbeaa972d07c998b315685cc81bb8ab829c090e2c0d: Status 404 returned error can't find the container with id 0cfc0e995dd984c3eb2c3fbeaa972d07c998b315685cc81bb8ab829c090e2c0d
Apr 20 16:37:59.820884 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:37:59.820856 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d58b87c9b-lbxmv_959a25ec-33d1-43da-883f-75b832181add/console/0.log"
Apr 20 16:38:00.120140 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:00.120046 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj" event={"ID":"5842a551-c34f-4be2-8413-7ffb834da459","Type":"ContainerStarted","Data":"76f0e9a539f572d37d6fb54e6e2dc9d552d1f6c92e893a2cdbf5a0be7f31a5b0"}
Apr 20 16:38:00.120140 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:00.120086 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj" event={"ID":"5842a551-c34f-4be2-8413-7ffb834da459","Type":"ContainerStarted","Data":"0cfc0e995dd984c3eb2c3fbeaa972d07c998b315685cc81bb8ab829c090e2c0d"}
Apr 20 16:38:00.120328 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:00.120185 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:38:00.138543 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:00.138497 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj" podStartSLOduration=1.13848324 podStartE2EDuration="1.13848324s" podCreationTimestamp="2026-04-20 16:37:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:38:00.136973059 +0000 UTC m=+884.716543096" watchObservedRunningTime="2026-04-20 16:38:00.13848324 +0000 UTC m=+884.718053254"
Apr 20 16:38:01.182900 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:01.182864 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7j9p9_f87ef63b-de21-49e4-89ee-c732444e83a3/dns/0.log"
Apr 20 16:38:01.203988 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:01.203960 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7j9p9_f87ef63b-de21-49e4-89ee-c732444e83a3/kube-rbac-proxy/0.log"
Apr 20 16:38:01.348107 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:01.348081 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pzfs2_e8733069-fadf-4af4-a36d-4e7f085cc317/dns-node-resolver/0.log"
Apr 20 16:38:01.825758 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:01.825726 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5cbcb584c5-w4vng_13d1688c-f90d-4062-b1af-16dc32e62dba/registry/0.log"
Apr 20 16:38:01.871170 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:01.871141 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jw965_a8c37028-00dc-4ae4-9e33-7af134c543da/node-ca/0.log"
Apr 20 16:38:02.701103 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:02.701070 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfvb5cg_b09bbc22-938c-4dab-af76-b53bae73362a/istio-proxy/0.log"
Apr 20 16:38:03.458341 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:03.458308 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-xqtvp_66ddeaa0-b37e-4d1b-8043-b74a8bb883a8/serve-healthcheck-canary/0.log"
Apr 20 16:38:03.918615 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:03.918588 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-k72wm_1380f1d3-60b6-4093-a429-bd909b4729e8/insights-operator/1.log"
Apr 20 16:38:03.919109 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:03.919093 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-k72wm_1380f1d3-60b6-4093-a429-bd909b4729e8/insights-operator/0.log"
Apr 20 16:38:03.955338 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:03.955308 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6b6w7_cd824f8e-bc79-41e3-afab-13765cfa09ae/kube-rbac-proxy/0.log"
Apr 20 16:38:03.996155 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:03.996121 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6b6w7_cd824f8e-bc79-41e3-afab-13765cfa09ae/exporter/0.log"
Apr 20 16:38:04.038224 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:04.038200 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6b6w7_cd824f8e-bc79-41e3-afab-13765cfa09ae/extractor/0.log"
Apr 20 16:38:06.133878 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:06.133853 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-hgmcj"
Apr 20 16:38:06.194178 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:06.194148 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-856df6f54f-zjt6b_bb074fbc-b45f-40f9-884a-d3a0131b2ac4/manager/0.log"
Apr 20 16:38:06.339615 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:06.339577 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-59c64b9875-zpqll_4602079f-9989-439c-b82f-627276b14013/manager/0.log"
Apr 20 16:38:07.678205 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:07.678166 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-d6fdb785c-g6pw5_0ed8847c-5faf-4b2b-bb88-57e8bff71d38/manager/0.log"
Apr 20 16:38:12.196289 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:12.196260 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-9r2xm_88fe29fc-0334-4909-aee2-527d4cd4b89e/migrator/0.log"
Apr 20 16:38:12.222205 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:12.222182 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-9r2xm_88fe29fc-0334-4909-aee2-527d4cd4b89e/graceful-termination/0.log"
Apr 20 16:38:13.570399 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:13.570361 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7rgb7_65d069e3-fe1b-4b3c-a5c6-f5a1c7d1a50c/kube-multus/0.log"
Apr 20 16:38:13.649722 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:13.649672 2577 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fjxjp_7be8427f-1eea-4919-adcf-00cd843532e2/kube-multus-additional-cni-plugins/0.log" Apr 20 16:38:13.670996 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:13.670973 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fjxjp_7be8427f-1eea-4919-adcf-00cd843532e2/egress-router-binary-copy/0.log" Apr 20 16:38:13.698548 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:13.698527 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fjxjp_7be8427f-1eea-4919-adcf-00cd843532e2/cni-plugins/0.log" Apr 20 16:38:13.721528 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:13.721505 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fjxjp_7be8427f-1eea-4919-adcf-00cd843532e2/bond-cni-plugin/0.log" Apr 20 16:38:13.744343 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:13.744322 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fjxjp_7be8427f-1eea-4919-adcf-00cd843532e2/routeoverride-cni/0.log" Apr 20 16:38:13.767073 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:13.767053 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fjxjp_7be8427f-1eea-4919-adcf-00cd843532e2/whereabouts-cni-bincopy/0.log" Apr 20 16:38:13.788772 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:13.788754 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fjxjp_7be8427f-1eea-4919-adcf-00cd843532e2/whereabouts-cni/0.log" Apr 20 16:38:14.200605 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:14.200580 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mpvsq_ec42a0e4-ff1e-48d5-8b45-fab851d223a4/network-metrics-daemon/0.log" Apr 20 
16:38:14.219486 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:14.219462 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mpvsq_ec42a0e4-ff1e-48d5-8b45-fab851d223a4/kube-rbac-proxy/0.log" Apr 20 16:38:15.444395 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:15.444351 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/ovn-controller/0.log" Apr 20 16:38:15.462361 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:15.462333 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/ovn-acl-logging/0.log" Apr 20 16:38:15.470715 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:15.470657 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/ovn-acl-logging/1.log" Apr 20 16:38:15.494717 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:15.494696 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/kube-rbac-proxy-node/0.log" Apr 20 16:38:15.518377 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:15.518353 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 16:38:15.536394 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:15.536367 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/northd/0.log" Apr 20 16:38:15.558116 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:15.558096 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/nbdb/0.log" Apr 20 
16:38:15.579067 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:15.579046 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/sbdb/0.log" Apr 20 16:38:15.780971 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:15.780896 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/ovnkube-controller/0.log" Apr 20 16:38:15.944465 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:15.944439 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xv6j6_1ddd4906-e010-4e9e-89d5-6017138ff6a9/console-operator/1.log" Apr 20 16:38:15.945772 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:15.945751 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xv6j6_1ddd4906-e010-4e9e-89d5-6017138ff6a9/console-operator/1.log" Apr 20 16:38:15.947104 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:15.947082 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/ovn-acl-logging/0.log" Apr 20 16:38:15.948347 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:15.948330 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8s5n_039d415e-4ed7-4e94-8a34-f5f605b30b1d/ovn-acl-logging/0.log" Apr 20 16:38:15.990990 ip-10-0-135-200 kubenswrapper[2577]: I0420 16:38:15.990967 2577 scope.go:117] "RemoveContainer" containerID="49e9cc2bb285ce9a7d8aaeb9fcad5f0590d85118bf71d7f9d128d45d5df6033e"