Apr 17 16:49:55.957595 ip-10-0-132-199 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 16:49:55.957606 ip-10-0-132-199 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 16:49:55.957613 ip-10-0-132-199 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 16:49:55.957817 ip-10-0-132-199 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 16:50:06.049985 ip-10-0-132-199 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 16:50:06.049999 ip-10-0-132-199 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 4fe118384ecd425d94c1eae5da4e8c2a --
Apr 17 16:52:28.988820 ip-10-0-132-199 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 16:52:29.356651 ip-10-0-132-199 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:52:29.356651 ip-10-0-132-199 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 16:52:29.356651 ip-10-0-132-199 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:52:29.356651 ip-10-0-132-199 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 16:52:29.356651 ip-10-0-132-199 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:52:29.357984 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.357903 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 16:52:29.360825 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360811 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:52:29.360825 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360825 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360829 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360832 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360836 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360839 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360842 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360845 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360848 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360851 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360854 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360857 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360860 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360862 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360865 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360868 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360871 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360873 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360876 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360878 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360881 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:52:29.360892 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360884 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360887 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360889 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360892 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360894 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360897 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360900 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360903 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360906 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360909 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360912 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360914 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360917 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360919 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360922 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360926 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360928 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360931 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360933 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360936 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:52:29.361361 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360938 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360940 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360943 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360945 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360948 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360950 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360952 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360955 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360958 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360960 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360962 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360965 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360967 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360970 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360973 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360978 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360981 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360985 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360987 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:52:29.361863 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360990 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360994 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.360997 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361000 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361003 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361008 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361011 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361014 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361017 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361019 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361022 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361024 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361027 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361029 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361032 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361035 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361037 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361040 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361042 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:52:29.362319 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361045 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361047 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361050 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361052 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361055 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361057 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361060 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361418 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361424 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361427 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361430 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361432 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361435 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361438 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361440 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361443 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361446 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361449 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361452 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361454 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:52:29.362791 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361457 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361459 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361462 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361464 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361467 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361469 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361472 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361474 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361476 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361479 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361482 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361484 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361487 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361490 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361492 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361495 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361497 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361500 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361503 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361505 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:52:29.363293 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361508 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361511 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361514 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361517 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361519 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361522 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361524 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361527 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361529 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361532 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361535 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361537 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361539 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361542 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361544 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361547 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361549 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361551 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361554 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361556 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:52:29.363801 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361559 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361562 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361564 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361566 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361570 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361572 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361575 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361577 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361581 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361584 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361588 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361608 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361611 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361615 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361619 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361622 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361625 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361627 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361630 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:52:29.364290 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361633 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361635 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361639 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361642 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361645 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361648 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361651 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361653 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361656 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361658 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361661 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361663 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361666 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.361668 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362825 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362837 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362844 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362849 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362853 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362857 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362862 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 16:52:29.364770 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362866 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362869 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362872 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362876 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362879 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362882 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362885 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362888 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362891 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362894 2575 flags.go:64] FLAG: --cloud-config=""
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362896 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362899 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362910 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362914 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362917 2575 flags.go:64] FLAG: --config-dir=""
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362920 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362924 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362931 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362935 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362938 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362941 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362944 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362947 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362949 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362953 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 16:52:29.365281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362955 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362960 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362963 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362965 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362969 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362972 2575 flags.go:64] FLAG: --enable-server="true"
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362975 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362979 2575 flags.go:64] FLAG: --event-burst="100"
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362982 2575 flags.go:64] FLAG: --event-qps="50"
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362986 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362989 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362992 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362996 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.362999 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363002 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363005 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363007 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363010 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]:
I0417 16:52:29.363013 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363016 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363019 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363022 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363025 2575 flags.go:64] FLAG: --feature-gates="" Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363028 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363032 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 16:52:29.365892 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363035 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363038 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363041 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363044 2575 flags.go:64] FLAG: --help="false" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363047 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-132-199.ec2.internal" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363050 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363053 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363055 2575 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363059 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363062 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363065 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363067 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363070 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363073 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363076 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363079 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363081 2575 flags.go:64] FLAG: --kube-reserved="" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363084 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363088 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363091 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363094 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 16:52:29.366493 ip-10-0-132-199 
kubenswrapper[2575]: I0417 16:52:29.363097 2575 flags.go:64] FLAG: --lock-file="" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363100 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363103 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 16:52:29.366493 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363106 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363112 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363115 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363117 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363120 2575 flags.go:64] FLAG: --logging-format="text" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363123 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363126 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363129 2575 flags.go:64] FLAG: --manifest-url="" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363132 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363137 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363140 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363144 2575 flags.go:64] FLAG: --max-pods="110" Apr 17 
16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363147 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363150 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363154 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363157 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363160 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363163 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363166 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363173 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363176 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363179 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363182 2575 flags.go:64] FLAG: --pod-cidr="" Apr 17 16:52:29.367092 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363185 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363191 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 
16:52:29.363194 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363197 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363200 2575 flags.go:64] FLAG: --port="10250" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363203 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363206 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-08040fce9507ee340" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363209 2575 flags.go:64] FLAG: --qos-reserved="" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363212 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363215 2575 flags.go:64] FLAG: --register-node="true" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363219 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363222 2575 flags.go:64] FLAG: --register-with-taints="" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363225 2575 flags.go:64] FLAG: --registry-burst="10" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363228 2575 flags.go:64] FLAG: --registry-qps="5" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363231 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363234 2575 flags.go:64] FLAG: --reserved-memory="" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363238 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363241 2575 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363244 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363247 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363250 2575 flags.go:64] FLAG: --runonce="false" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363253 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363256 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363259 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363262 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363264 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 16:52:29.367666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363267 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363271 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363274 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363276 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363279 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363282 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 
16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363285 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363288 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363291 2575 flags.go:64] FLAG: --system-cgroups="" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363293 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363299 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363302 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363305 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363310 2575 flags.go:64] FLAG: --tls-min-version="" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363313 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363316 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363319 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363322 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363325 2575 flags.go:64] FLAG: --v="2" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363329 2575 flags.go:64] FLAG: --version="false" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363333 2575 flags.go:64] FLAG: --vmodule="" 
Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363337 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.363341 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363429 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:52:29.368334 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363433 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363436 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363440 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363442 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363445 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363447 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363450 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363452 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363455 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363457 2575 feature_gate.go:328] unrecognized 
feature gate: NetworkDiagnosticsConfig Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363460 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363464 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363467 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363470 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363473 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363476 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363479 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363482 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363489 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:52:29.369002 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363492 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363495 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363498 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:52:29.369500 
ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363501 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363504 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363507 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363509 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363512 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363515 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363517 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363520 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363522 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363525 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363528 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363530 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363533 2575 
feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363536 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363539 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363541 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363544 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:52:29.369500 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363546 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363549 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363551 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363554 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363556 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363558 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363561 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363564 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 
16:52:29.363566 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363568 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363571 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363576 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363580 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363582 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363585 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363587 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363602 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363606 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363609 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:52:29.370012 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363611 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363614 2575 feature_gate.go:328] 
unrecognized feature gate: KMSEncryptionProvider Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363617 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363619 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363622 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363625 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363628 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363630 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363633 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363637 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363640 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363643 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363645 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363648 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 
16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363651 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363653 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363656 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363658 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363661 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:52:29.370475 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363663 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:52:29.370943 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363666 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:52:29.370943 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363668 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:52:29.370943 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363671 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:52:29.370943 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363673 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:52:29.370943 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363677 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:52:29.370943 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363680 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:52:29.370943 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.363682 2575 feature_gate.go:328] unrecognized 
feature gate: AzureWorkloadIdentity
Apr 17 16:52:29.370943 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.364276 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:52:29.370943 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.370767 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 16:52:29.370943 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.370785 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 16:52:29.370943 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370834 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:52:29.370943 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370840 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:52:29.370943 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370844 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:52:29.370943 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370847 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:52:29.370943 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370852 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:52:29.370943 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370857 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370861 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370866 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370868 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370871 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370874 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370878 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370880 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370883 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370885 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370888 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370891 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370893 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370896 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370898 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370901 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370904 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370907 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370909 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:52:29.371377 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370913 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370917 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370921 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370925 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370928 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370933 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370938 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370944 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370948 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370951 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370954 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370957 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370959 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370963 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370965 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370968 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370970 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370973 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370975 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:52:29.371854 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370978 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370982 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370984 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370987 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370989 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370992 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370994 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370997 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.370999 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371002 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371005 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371007 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371011 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371015 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371019 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371023 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371027 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371030 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371032 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371035 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:52:29.372327 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371039 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371041 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371044 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371046 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371049 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371051 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371054 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371056 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371059 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371061 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371063 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371066 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371069 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371071 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371078 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371081 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371083 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371086 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371088 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371092 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371096 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:52:29.372821 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371100 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:52:29.373325 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371104 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:52:29.373325 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.371111 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:52:29.373325 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371210 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:52:29.373325 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371215 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:52:29.373325 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371218 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:52:29.373325 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371220 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:52:29.373325 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371223 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:52:29.373325 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371226 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:52:29.373325 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371228 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:52:29.373325 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371231 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:52:29.373325 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371234 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:52:29.373325 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371237 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:52:29.373325 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371240 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:52:29.373325 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371243 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:52:29.373325 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371245 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371247 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371250 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371254 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371258 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371262 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371266 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371270 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371272 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371275 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371278 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371281 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371283 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371287 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371291 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371293 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371296 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371298 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371301 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371303 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:52:29.373710 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371306 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371308 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371311 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371314 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371316 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371319 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371321 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371323 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371326 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371329 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371332 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371336 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371340 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371344 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371349 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371353 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371355 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371357 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371360 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:52:29.374191 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371362 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371365 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371367 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371370 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371372 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371375 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371379 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371382 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371384 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371387 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371389 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371392 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371395 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371397 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371399 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371402 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371404 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371407 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371409 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371411 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:52:29.374699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371415 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:52:29.375186 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371419 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:52:29.375186 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371425 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:52:29.375186 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371431 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:52:29.375186 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371434 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:52:29.375186 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371437 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:52:29.375186 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371440 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:52:29.375186 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371443 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:52:29.375186 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371445 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:52:29.375186 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371448 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:52:29.375186 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371450 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:52:29.375186 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371453 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:52:29.375186 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371455 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:52:29.375186 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371457 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:52:29.375186 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:29.371460 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:52:29.375186 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.371465 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:52:29.375575 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.372124 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 16:52:29.375575 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.374196 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 16:52:29.375575 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.375146 2575 server.go:1019] "Starting client certificate rotation"
Apr 17 16:52:29.375575 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.375238 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:52:29.375973 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.375956 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:52:29.400901 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.400878 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:52:29.404320 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.404301 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:52:29.416799 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.416779 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 17 16:52:29.422283 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.422268 2575 log.go:25] "Validated CRI v1 image API"
Apr 17 16:52:29.423448 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.423433 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 16:52:29.427113 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.427093 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 f50ae135-63c4-408d-98b3-496dcff73c5f:/dev/nvme0n1p3 f851cefe-93e6-4e06-9f7c-bb5cb97071dd:/dev/nvme0n1p4]
Apr 17 16:52:29.427188 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.427112 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 16:52:29.432600 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.432488 2575 manager.go:217] Machine: {Timestamp:2026-04-17 16:52:29.430691951 +0000 UTC m=+0.339419039 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3107847 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e3c0db85661c686c57043554f3cdc SystemUUID:ec2e3c0d-b856-61c6-86c5-7043554f3cdc BootID:4fe11838-4ecd-425d-94c1-eae5da4e8c2a Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1b:5b:77:cc:9b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1b:5b:77:cc:9b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ca:cb:55:41:74:5c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 16:52:29.432600 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.432586 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 16:52:29.432702 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.432695 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 16:52:29.433633 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.433610 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 16:52:29.433772 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.433636 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-199.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 16:52:29.433817 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.433781 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 16:52:29.433817 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.433789 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 16:52:29.433817 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.433802 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:52:29.434520 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.434510 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:52:29.436017 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.436005 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:52:29.436244 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.436235 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 16:52:29.438970 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.438961 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 17 16:52:29.439011 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.438975 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 16:52:29.439011 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.438993 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 16:52:29.439011 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.439002 2575 kubelet.go:397] "Adding apiserver pod source" Apr 17 16:52:29.439128 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.439014 2575 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 17 16:52:29.440024 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.440014 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:52:29.440062 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.440031 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:52:29.442748 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.442720 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 16:52:29.443512 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.443491 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 16:52:29.445509 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.445492 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 16:52:29.447297 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.447285 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 16:52:29.447344 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.447303 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 16:52:29.447344 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.447310 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 16:52:29.447344 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.447315 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 16:52:29.447344 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.447320 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 16:52:29.447344 
ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.447328 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 16:52:29.447344 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.447336 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 16:52:29.447344 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.447345 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 16:52:29.447518 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.447354 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 16:52:29.447518 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.447359 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 16:52:29.447518 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.447368 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 16:52:29.447518 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.447377 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 16:52:29.448142 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.448130 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 16:52:29.448182 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.448143 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 16:52:29.451680 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.451669 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 16:52:29.451717 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.451702 2575 server.go:1295] "Started kubelet" Apr 17 16:52:29.451823 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.451794 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 16:52:29.451948 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.451791 2575 ratelimit.go:55] "Setting rate limiting for endpoint" 
service="podresources" qps=100 burstTokens=10 Apr 17 16:52:29.452014 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.451971 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 16:52:29.452532 ip-10-0-132-199 systemd[1]: Started Kubernetes Kubelet. Apr 17 16:52:29.453072 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.452445 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-199.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 16:52:29.453205 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.453174 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 16:52:29.454329 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.454315 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 17 16:52:29.454683 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.454656 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-199.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 16:52:29.454764 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.454721 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 16:52:29.457923 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.457883 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 16:52:29.458361 ip-10-0-132-199 kubenswrapper[2575]: I0417 
16:52:29.458342 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 16:52:29.458946 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.458926 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 16:52:29.458946 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.458945 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 16:52:29.459083 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.458941 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 16:52:29.459083 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.459040 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 16:52:29.459159 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.459107 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 17 16:52:29.459159 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.459121 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 17 16:52:29.459159 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.459108 2575 factory.go:55] Registering systemd factory Apr 17 16:52:29.459159 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.459155 2575 factory.go:223] Registration of the systemd container factory successfully Apr 17 16:52:29.459375 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.459358 2575 factory.go:153] Registering CRI-O factory Apr 17 16:52:29.459433 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.459354 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-199.ec2.internal\" not found" Apr 17 16:52:29.459433 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.459381 2575 factory.go:223] Registration of the crio container factory successfully Apr 17 16:52:29.459523 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.459492 2575 factory.go:221] Registration 
of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 16:52:29.459642 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.459630 2575 factory.go:103] Registering Raw factory Apr 17 16:52:29.459708 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.459649 2575 manager.go:1196] Started watching for new ooms in manager Apr 17 16:52:29.460354 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.460339 2575 manager.go:319] Starting recovery of all containers Apr 17 16:52:29.461160 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.461134 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 16:52:29.461750 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.461721 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-199.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 16:52:29.462186 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.461284 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-199.ec2.internal.18a733166e2cdb5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-199.ec2.internal,UID:ip-10-0-132-199.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-199.ec2.internal,},FirstTimestamp:2026-04-17 16:52:29.451680605 +0000 UTC m=+0.360407690,LastTimestamp:2026-04-17 16:52:29.451680605 +0000 UTC m=+0.360407690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-199.ec2.internal,}" Apr 17 16:52:29.469974 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.469959 2575 manager.go:324] Recovery completed Apr 17 16:52:29.474378 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.474361 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:52:29.478159 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.478142 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:52:29.478225 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.478168 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:52:29.478225 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.478178 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:52:29.478689 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.478675 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 16:52:29.478689 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.478689 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 16:52:29.478758 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.478703 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:52:29.480304 
ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.480246 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-199.ec2.internal.18a733166fc0d97a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-199.ec2.internal,UID:ip-10-0-132-199.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-132-199.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-132-199.ec2.internal,},FirstTimestamp:2026-04-17 16:52:29.478156666 +0000 UTC m=+0.386883748,LastTimestamp:2026-04-17 16:52:29.478156666 +0000 UTC m=+0.386883748,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-199.ec2.internal,}" Apr 17 16:52:29.482204 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.482192 2575 policy_none.go:49] "None policy: Start" Apr 17 16:52:29.482246 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.482209 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 16:52:29.482246 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.482218 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 17 16:52:29.495955 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.495890 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-199.ec2.internal.18a733166fc11a49 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-199.ec2.internal,UID:ip-10-0-132-199.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-132-199.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-132-199.ec2.internal,},FirstTimestamp:2026-04-17 16:52:29.478173257 +0000 UTC m=+0.386900340,LastTimestamp:2026-04-17 16:52:29.478173257 +0000 UTC m=+0.386900340,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-199.ec2.internal,}" Apr 17 16:52:29.504309 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.504290 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jrplh" Apr 17 16:52:29.506606 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.506540 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-199.ec2.internal.18a733166fc13ee7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-199.ec2.internal,UID:ip-10-0-132-199.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-132-199.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-132-199.ec2.internal,},FirstTimestamp:2026-04-17 16:52:29.478182631 +0000 UTC m=+0.386909715,LastTimestamp:2026-04-17 16:52:29.478182631 +0000 UTC m=+0.386909715,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-199.ec2.internal,}" Apr 17 16:52:29.512253 ip-10-0-132-199 
kubenswrapper[2575]: I0417 16:52:29.512235 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jrplh" Apr 17 16:52:29.539210 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.523175 2575 manager.go:341] "Starting Device Plugin manager" Apr 17 16:52:29.539210 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.523198 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 16:52:29.539210 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.523208 2575 server.go:85] "Starting device plugin registration server" Apr 17 16:52:29.539210 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.523413 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 16:52:29.539210 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.523426 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 16:52:29.539210 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.523521 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 16:52:29.539210 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.523607 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 16:52:29.539210 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.523617 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 16:52:29.539210 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.524110 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 16:52:29.539210 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.524144 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-199.ec2.internal\" not found" Apr 17 16:52:29.583642 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.583613 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 16:52:29.584742 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.584725 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 16:52:29.584826 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.584751 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 16:52:29.584826 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.584770 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 16:52:29.584826 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.584776 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 16:52:29.584826 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.584805 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 16:52:29.587156 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.587139 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:52:29.624277 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.624224 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:52:29.628081 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.628064 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:52:29.628150 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.628092 2575 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:52:29.628150 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.628102 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:52:29.628150 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.628127 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-199.ec2.internal" Apr 17 16:52:29.641425 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.641406 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-199.ec2.internal" Apr 17 16:52:29.641480 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.641429 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-199.ec2.internal\": node \"ip-10-0-132-199.ec2.internal\" not found" Apr 17 16:52:29.662604 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.662580 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-199.ec2.internal\" not found" Apr 17 16:52:29.685442 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.685422 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-199.ec2.internal"] Apr 17 16:52:29.685496 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.685478 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:52:29.686232 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.686218 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:52:29.686292 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.686245 2575 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:52:29.686292 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.686282 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:52:29.688683 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.688670 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:52:29.688834 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.688822 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal" Apr 17 16:52:29.688870 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.688850 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:52:29.689462 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.689319 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:52:29.689462 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.689348 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:52:29.689462 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.689351 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:52:29.689462 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.689380 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:52:29.689462 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.689432 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 17 16:52:29.689462 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.689463 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:52:29.691624 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.691602 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-199.ec2.internal" Apr 17 16:52:29.691624 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.691628 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:52:29.692294 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.692279 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:52:29.692370 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.692310 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:52:29.692370 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.692323 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:52:29.712170 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.712151 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-199.ec2.internal\" not found" node="ip-10-0-132-199.ec2.internal" Apr 17 16:52:29.715461 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.715447 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-199.ec2.internal\" not found" node="ip-10-0-132-199.ec2.internal" Apr 17 16:52:29.760959 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.760938 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1805319a9755b3230cda0406644f8c58-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal\" (UID: \"1805319a9755b3230cda0406644f8c58\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal"
Apr 17 16:52:29.761062 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.760964 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1805319a9755b3230cda0406644f8c58-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal\" (UID: \"1805319a9755b3230cda0406644f8c58\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal"
Apr 17 16:52:29.761062 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.760980 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/12719cc175836885556f66e7ce6f8019-config\") pod \"kube-apiserver-proxy-ip-10-0-132-199.ec2.internal\" (UID: \"12719cc175836885556f66e7ce6f8019\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-199.ec2.internal"
Apr 17 16:52:29.763031 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.763016 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-199.ec2.internal\" not found"
Apr 17 16:52:29.861181 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.861157 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1805319a9755b3230cda0406644f8c58-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal\" (UID: \"1805319a9755b3230cda0406644f8c58\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal"
Apr 17 16:52:29.861279 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.861184 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1805319a9755b3230cda0406644f8c58-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal\" (UID: \"1805319a9755b3230cda0406644f8c58\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal"
Apr 17 16:52:29.861279 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.861201 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/12719cc175836885556f66e7ce6f8019-config\") pod \"kube-apiserver-proxy-ip-10-0-132-199.ec2.internal\" (UID: \"12719cc175836885556f66e7ce6f8019\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-199.ec2.internal"
Apr 17 16:52:29.861279 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.861225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/12719cc175836885556f66e7ce6f8019-config\") pod \"kube-apiserver-proxy-ip-10-0-132-199.ec2.internal\" (UID: \"12719cc175836885556f66e7ce6f8019\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-199.ec2.internal"
Apr 17 16:52:29.861279 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.861256 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1805319a9755b3230cda0406644f8c58-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal\" (UID: \"1805319a9755b3230cda0406644f8c58\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal"
Apr 17 16:52:29.861279 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:29.861255 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1805319a9755b3230cda0406644f8c58-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal\" (UID: \"1805319a9755b3230cda0406644f8c58\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal"
Apr 17 16:52:29.863267 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.863255 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-199.ec2.internal\" not found"
Apr 17 16:52:29.964081 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:29.964030 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-199.ec2.internal\" not found"
Apr 17 16:52:30.014225 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.014195 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal"
Apr 17 16:52:30.017837 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.017813 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-199.ec2.internal"
Apr 17 16:52:30.064716 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:30.064685 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-199.ec2.internal\" not found"
Apr 17 16:52:30.165227 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:30.165201 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-199.ec2.internal\" not found"
Apr 17 16:52:30.265810 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:30.265744 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-199.ec2.internal\" not found"
Apr 17 16:52:30.366357 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:30.366329 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-199.ec2.internal\" not found"
Apr 17 16:52:30.375484 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.375470 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 16:52:30.375627 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.375610 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:52:30.457979 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.457956 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 16:52:30.466867 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:30.466848 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-199.ec2.internal\" not found"
Apr 17 16:52:30.475915 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.475899 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 16:52:30.513294 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.513271 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2xq84"
Apr 17 16:52:30.514351 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.514318 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 16:47:29 +0000 UTC" deadline="2028-01-23 04:13:29.879072518 +0000 UTC"
Apr 17 16:52:30.514409 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.514352 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15491h20m59.36472345s"
Apr 17 16:52:30.522681 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.522640 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2xq84"
Apr 17 16:52:30.567635 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:30.567613 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-199.ec2.internal\" not found"
Apr 17 16:52:30.668404 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:30.668372 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-199.ec2.internal\" not found"
Apr 17 16:52:30.722278 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:30.722242 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12719cc175836885556f66e7ce6f8019.slice/crio-d2ff513ca520fb60f31ee6be6e43d58fcc0b1aef7c16f2645886b6c2f1cc02aa WatchSource:0}: Error finding container d2ff513ca520fb60f31ee6be6e43d58fcc0b1aef7c16f2645886b6c2f1cc02aa: Status 404 returned error can't find the container with id d2ff513ca520fb60f31ee6be6e43d58fcc0b1aef7c16f2645886b6c2f1cc02aa
Apr 17 16:52:30.723090 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:30.723068 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1805319a9755b3230cda0406644f8c58.slice/crio-8bd41d95f9274fae14faad0e23dc7b2601335b97891369f32d69ae728baa5142 WatchSource:0}: Error finding container 8bd41d95f9274fae14faad0e23dc7b2601335b97891369f32d69ae728baa5142: Status 404 returned error can't find the container with id 8bd41d95f9274fae14faad0e23dc7b2601335b97891369f32d69ae728baa5142
Apr 17 16:52:30.727080 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.727058 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 16:52:30.769264 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:30.769242 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-199.ec2.internal\" not found"
Apr 17 16:52:30.793217 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.793170 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:52:30.869701 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:30.869681 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-199.ec2.internal\" not found"
Apr 17 16:52:30.938615 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.938583 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:52:30.958966 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.958952 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal"
Apr 17 16:52:30.964448 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.964422 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:52:30.972079 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.972064 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 16:52:30.972873 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.972861 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-199.ec2.internal"
Apr 17 16:52:30.981509 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:30.981493 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 16:52:31.440676 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.440648 2575 apiserver.go:52] "Watching apiserver"
Apr 17 16:52:31.448404 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.448381 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 16:52:31.448846 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.448812 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-pckwb","kube-system/kube-apiserver-proxy-ip-10-0-132-199.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w","openshift-cluster-node-tuning-operator/tuned-g4mwl","openshift-image-registry/node-ca-7br24","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal","openshift-dns/node-resolver-mbbjn","openshift-multus/multus-4wnr7","openshift-multus/multus-additional-cni-plugins-78nlz","openshift-multus/network-metrics-daemon-pm56t","openshift-network-diagnostics/network-check-target-fhjz7","openshift-network-operator/iptables-alerter-vcl9m","openshift-ovn-kubernetes/ovnkube-node-r8dsh"]
Apr 17 16:52:31.451849 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.451825 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mbbjn"
Apr 17 16:52:31.454239 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.454220 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w"
Apr 17 16:52:31.456827 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.456805 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 16:52:31.456922 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.456863 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 16:52:31.457014 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.456991 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-6mljv\""
Apr 17 16:52:31.458306 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.458263 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 16:52:31.458599 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.458574 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.459210 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.459190 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 16:52:31.459298 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.459197 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-66wr2\""
Apr 17 16:52:31.459475 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.459458 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 16:52:31.461210 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.461190 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:52:31.461336 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.461318 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 16:52:31.461404 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.461326 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gd9q2\""
Apr 17 16:52:31.462002 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.461976 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pckwb"
Apr 17 16:52:31.462098 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.462064 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.464253 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.464236 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wq4nq\""
Apr 17 16:52:31.464652 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.464506 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 16:52:31.464652 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.464505 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 16:52:31.465059 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.465029 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8kn9t\""
Apr 17 16:52:31.465243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.465174 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 16:52:31.465243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.465220 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 16:52:31.465372 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.465256 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 16:52:31.465372 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.465306 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 16:52:31.466908 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.466668 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-78nlz"
Apr 17 16:52:31.466908 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.466669 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:52:31.467041 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:31.466916 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pm56t" podUID="6064206c-e379-4668-9aa8-a2165341d497"
Apr 17 16:52:31.469125 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.469010 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 16:52:31.469341 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.469323 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 16:52:31.469431 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.469323 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-7dgfk\""
Apr 17 16:52:31.470636 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470517 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-device-dir\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w"
Apr 17 16:52:31.470636 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470549 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-sysctl-d\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.470636 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470570 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-sys\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.470837 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470674 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-multus-cni-dir\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.470837 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470708 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e1eaec4-d649-4750-ab80-f763a4edde6a-hosts-file\") pod \"node-resolver-mbbjn\" (UID: \"8e1eaec4-d649-4750-ab80-f763a4edde6a\") " pod="openshift-dns/node-resolver-mbbjn"
Apr 17 16:52:31.470837 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-kubelet-dir\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w"
Apr 17 16:52:31.470837 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470744 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-kubernetes\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.470837 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470764 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xg76\" (UniqueName: \"kubernetes.io/projected/1d2fa510-3384-4880-a957-6404588186c2-kube-api-access-9xg76\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.470837 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-cnibin\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.470837 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470801 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-var-lib-cni-bin\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.470837 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470820 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e1eaec4-d649-4750-ab80-f763a4edde6a-tmp-dir\") pod \"node-resolver-mbbjn\" (UID: \"8e1eaec4-d649-4750-ab80-f763a4edde6a\") " pod="openshift-dns/node-resolver-mbbjn"
Apr 17 16:52:31.470837 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470835 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-sys-fs\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w"
Apr 17 16:52:31.471260 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470858 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-modprobe-d\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.471260 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470871 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-multus-conf-dir\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.471260 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470885 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpfs6\" (UniqueName: \"kubernetes.io/projected/842dbf17-4840-4948-b464-a2890415d77a-kube-api-access-xpfs6\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.471260 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-socket-dir\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w"
Apr 17 16:52:31.471260 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470927 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-sysconfig\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.471260 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470948 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-lib-modules\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.471260 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470970 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-os-release\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.471260 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.470995 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-run-k8s-cni-cncf-io\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.471260 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.471040 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-hostroot\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.471260 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.471063 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-etc-kubernetes\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.471260 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.471127 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58tqr\" (UniqueName: \"kubernetes.io/projected/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-kube-api-access-58tqr\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w"
Apr 17 16:52:31.471260 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.471166 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:52:31.471260 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.471166 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-run\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.472540 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.472296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-var-lib-kubelet\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.472540 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.472343 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-systemd\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.472540 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.472384 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3b961c80-2452-4eb9-ba7f-bd5743b6253f-agent-certs\") pod \"konnectivity-agent-pckwb\" (UID: \"3b961c80-2452-4eb9-ba7f-bd5743b6253f\") " pod="kube-system/konnectivity-agent-pckwb"
Apr 17 16:52:31.472540 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.472415 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3b961c80-2452-4eb9-ba7f-bd5743b6253f-konnectivity-ca\") pod \"konnectivity-agent-pckwb\" (UID: \"3b961c80-2452-4eb9-ba7f-bd5743b6253f\") " pod="kube-system/konnectivity-agent-pckwb"
Apr 17 16:52:31.472540 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.472449 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/842dbf17-4840-4948-b464-a2890415d77a-cni-binary-copy\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.472540 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.472476 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-run-netns\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.472540 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.472506 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-var-lib-cni-multus\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.472540 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.472538 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-run-multus-certs\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.472953 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.472566 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-etc-selinux\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w"
Apr 17 16:52:31.472953 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.472608 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1d2fa510-3384-4880-a957-6404588186c2-tmp\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.472953 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.472635 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-system-cni-dir\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.472953 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.472667 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-multus-socket-dir-parent\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.472953 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:31.471516 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhjz7" podUID="134235c0-0964-4070-b83c-8e7e912a6f98"
Apr 17 16:52:31.473445 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.473210 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/842dbf17-4840-4948-b464-a2890415d77a-multus-daemon-config\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.473445 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.473283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq7ng\" (UniqueName: \"kubernetes.io/projected/8e1eaec4-d649-4750-ab80-f763a4edde6a-kube-api-access-qq7ng\") pod \"node-resolver-mbbjn\" (UID: \"8e1eaec4-d649-4750-ab80-f763a4edde6a\") " pod="openshift-dns/node-resolver-mbbjn"
Apr 17 16:52:31.473445 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.473341 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-registration-dir\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w"
Apr 17 16:52:31.473445 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.473371 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-sysctl-conf\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.473725 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.473436 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-var-lib-kubelet\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.473725 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.473484 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-host\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.473725 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.473515 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1d2fa510-3384-4880-a957-6404588186c2-etc-tuned\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.475651 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.475183 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vcl9m"
Apr 17 16:52:31.475651 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.475632 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:52:31.478494 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.477944 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7br24"
Apr 17 16:52:31.478494 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.478056 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.479666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.479002 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:52:31.479666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.479021 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 16:52:31.479666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.479489 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-54nsz\""
Apr 17 16:52:31.479666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.479564 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 16:52:31.480507 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.480489 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 16:52:31.481058 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.481033 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 16:52:31.481343 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.481322 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 16:52:31.481517 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.481501 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 16:52:31.481740 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.481719 2575 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vj7nk\"" Apr 17 16:52:31.481860 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.481839 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 16:52:31.482672 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.482271 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 16:52:31.482672 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.482325 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 16:52:31.482672 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.482352 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-z9bch\"" Apr 17 16:52:31.482672 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.482444 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 16:52:31.482672 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.482632 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 16:52:31.524962 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.524940 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:47:30 +0000 UTC" deadline="2027-10-01 03:21:37.686236092 +0000 UTC" Apr 17 16:52:31.524962 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.524961 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12754h29m6.161278137s" Apr 17 16:52:31.560921 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.560901 2575 
desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 16:52:31.574081 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574056 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-tuning-conf-dir\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz" Apr 17 16:52:31.574195 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574093 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e370d23-be74-44de-9ef0-318adb824e76-host\") pod \"node-ca-7br24\" (UID: \"6e370d23-be74-44de-9ef0-318adb824e76\") " pod="openshift-image-registry/node-ca-7br24" Apr 17 16:52:31.574195 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574134 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-run-systemd\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.574195 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574157 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6e370d23-be74-44de-9ef0-318adb824e76-serviceca\") pod \"node-ca-7br24\" (UID: \"6e370d23-be74-44de-9ef0-318adb824e76\") " pod="openshift-image-registry/node-ca-7br24" Apr 17 16:52:31.574351 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-sys\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl" Apr 17 16:52:31.574351 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574287 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-os-release\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz" Apr 17 16:52:31.574351 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574311 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-sys\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl" Apr 17 16:52:31.574351 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574320 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz" Apr 17 16:52:31.574514 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574393 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-cni-bin\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.574514 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574422 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-ovnkube-config\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.574514 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574447 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-ovn-node-metrics-cert\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.574514 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574480 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-kubelet-dir\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w" Apr 17 16:52:31.574514 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574505 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xg76\" (UniqueName: \"kubernetes.io/projected/1d2fa510-3384-4880-a957-6404588186c2-kube-api-access-9xg76\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl" Apr 17 16:52:31.574731 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574552 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-kubelet-dir\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w" Apr 17 16:52:31.574731 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574531 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-var-lib-cni-bin\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7" Apr 17 16:52:31.574731 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574615 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-var-lib-cni-bin\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7" Apr 17 16:52:31.574731 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574618 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e1eaec4-d649-4750-ab80-f763a4edde6a-tmp-dir\") pod \"node-resolver-mbbjn\" (UID: \"8e1eaec4-d649-4750-ab80-f763a4edde6a\") " pod="openshift-dns/node-resolver-mbbjn" Apr 17 16:52:31.574731 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574649 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz" Apr 17 16:52:31.574731 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574673 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-slash\") pod 
\"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.574731 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574726 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-etc-openvswitch\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.575017 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-sys-fs\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w" Apr 17 16:52:31.575017 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-modprobe-d\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl" Apr 17 16:52:31.575017 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574805 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-systemd-units\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.575017 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574811 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-sys-fs\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w" Apr 17 16:52:31.575017 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574819 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-ovnkube-script-lib\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.575017 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574847 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb8z9\" (UniqueName: \"kubernetes.io/projected/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-kube-api-access-gb8z9\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.575017 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e1eaec4-d649-4750-ab80-f763a4edde6a-tmp-dir\") pod \"node-resolver-mbbjn\" (UID: \"8e1eaec4-d649-4750-ab80-f763a4edde6a\") " pod="openshift-dns/node-resolver-mbbjn" Apr 17 16:52:31.575017 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574888 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-socket-dir\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w" Apr 17 16:52:31.575017 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574946 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-modprobe-d\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl" Apr 17 16:52:31.575017 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574957 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-sysconfig\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl" Apr 17 16:52:31.575017 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574979 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-os-release\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7" Apr 17 16:52:31.575017 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574983 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-socket-dir\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w" Apr 17 16:52:31.575017 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.574995 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-etc-kubernetes\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7" Apr 17 16:52:31.575017 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575008 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-sysconfig\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl" Apr 17 16:52:31.575503 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575031 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-etc-kubernetes\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7" Apr 17 16:52:31.575503 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575038 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.575503 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575050 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-os-release\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7" Apr 17 16:52:31.575503 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575057 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtjbr\" (UniqueName: \"kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr\") pod \"network-check-target-fhjz7\" (UID: \"134235c0-0964-4070-b83c-8e7e912a6f98\") " pod="openshift-network-diagnostics/network-check-target-fhjz7" Apr 17 16:52:31.575503 ip-10-0-132-199 
kubenswrapper[2575]: I0417 16:52:31.575089 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-run\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl" Apr 17 16:52:31.575503 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575130 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-var-lib-kubelet\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7" Apr 17 16:52:31.575503 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575136 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-run\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl" Apr 17 16:52:31.575503 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575161 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-var-lib-kubelet\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7" Apr 17 16:52:31.575503 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575184 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmbkg\" (UniqueName: \"kubernetes.io/projected/6e370d23-be74-44de-9ef0-318adb824e76-kube-api-access-bmbkg\") pod \"node-ca-7br24\" (UID: \"6e370d23-be74-44de-9ef0-318adb824e76\") " pod="openshift-image-registry/node-ca-7br24" Apr 17 16:52:31.575503 ip-10-0-132-199 kubenswrapper[2575]: I0417 
16:52:31.575224 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-node-log\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.575503 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575245 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-env-overrides\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.575503 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575260 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/330a6d22-074b-44ae-8b4f-50637cd24561-iptables-alerter-script\") pod \"iptables-alerter-vcl9m\" (UID: \"330a6d22-074b-44ae-8b4f-50637cd24561\") " pod="openshift-network-operator/iptables-alerter-vcl9m" Apr 17 16:52:31.575503 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575276 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3b961c80-2452-4eb9-ba7f-bd5743b6253f-konnectivity-ca\") pod \"konnectivity-agent-pckwb\" (UID: \"3b961c80-2452-4eb9-ba7f-bd5743b6253f\") " pod="kube-system/konnectivity-agent-pckwb" Apr 17 16:52:31.575503 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575293 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/842dbf17-4840-4948-b464-a2890415d77a-cni-binary-copy\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " 
pod="openshift-multus/multus-4wnr7" Apr 17 16:52:31.575503 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575307 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-run-netns\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7" Apr 17 16:52:31.575503 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575331 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-var-lib-cni-multus\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7" Apr 17 16:52:31.575503 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575345 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-log-socket\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.576171 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575378 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs\") pod \"network-metrics-daemon-pm56t\" (UID: \"6064206c-e379-4668-9aa8-a2165341d497\") " pod="openshift-multus/network-metrics-daemon-pm56t" Apr 17 16:52:31.576171 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575404 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-run-netns\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") 
" pod="openshift-multus/multus-4wnr7" Apr 17 16:52:31.576171 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575412 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-var-lib-cni-multus\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7" Apr 17 16:52:31.576171 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575491 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-777kw\" (UniqueName: \"kubernetes.io/projected/6064206c-e379-4668-9aa8-a2165341d497-kube-api-access-777kw\") pod \"network-metrics-daemon-pm56t\" (UID: \"6064206c-e379-4668-9aa8-a2165341d497\") " pod="openshift-multus/network-metrics-daemon-pm56t" Apr 17 16:52:31.576171 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575546 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-etc-selinux\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w" Apr 17 16:52:31.576171 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575577 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-system-cni-dir\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7" Apr 17 16:52:31.576171 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-multus-socket-dir-parent\") pod 
\"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.576171 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575636 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/842dbf17-4840-4948-b464-a2890415d77a-multus-daemon-config\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.576171 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575652 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-etc-selinux\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w"
Apr 17 16:52:31.576171 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575661 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-system-cni-dir\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.576171 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575665 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-var-lib-openvswitch\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.576171 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575721 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-run-ovn\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.576171 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575717 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-multus-socket-dir-parent\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.576171 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575777 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-registration-dir\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w"
Apr 17 16:52:31.576171 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-host\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.576171 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1d2fa510-3384-4880-a957-6404588186c2-etc-tuned\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.576171 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575853 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3b961c80-2452-4eb9-ba7f-bd5743b6253f-konnectivity-ca\") pod \"konnectivity-agent-pckwb\" (UID: \"3b961c80-2452-4eb9-ba7f-bd5743b6253f\") " pod="kube-system/konnectivity-agent-pckwb"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575912 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-host\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575913 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/842dbf17-4840-4948-b464-a2890415d77a-cni-binary-copy\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575942 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-registration-dir\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.575938 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-device-dir\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-sysctl-d\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576058 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-device-dir\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-multus-cni-dir\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576135 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e1eaec4-d649-4750-ab80-f763a4edde6a-hosts-file\") pod \"node-resolver-mbbjn\" (UID: \"8e1eaec4-d649-4750-ab80-f763a4edde6a\") " pod="openshift-dns/node-resolver-mbbjn"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576148 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/842dbf17-4840-4948-b464-a2890415d77a-multus-daemon-config\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576158 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-multus-cni-dir\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576179 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-sysctl-d\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576189 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576203 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-run-netns\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576264 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e1eaec4-d649-4750-ab80-f763a4edde6a-hosts-file\") pod \"node-resolver-mbbjn\" (UID: \"8e1eaec4-d649-4750-ab80-f763a4edde6a\") " pod="openshift-dns/node-resolver-mbbjn"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576300 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-kubernetes\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576324 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-cnibin\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576367 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-cnibin\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.576845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576372 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-run-ovn-kubernetes\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.577529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576385 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-kubernetes\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.577529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576417 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-multus-conf-dir\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.577529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpfs6\" (UniqueName: \"kubernetes.io/projected/842dbf17-4840-4948-b464-a2890415d77a-kube-api-access-xpfs6\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.577529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576468 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-multus-conf-dir\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.577529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576474 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-cnibin\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz"
Apr 17 16:52:31.577529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-lib-modules\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.577529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576611 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-run-k8s-cni-cncf-io\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.577529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576630 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-hostroot\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.577529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576646 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-kubelet\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.577529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576660 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-cni-netd\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.577529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576672 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-lib-modules\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.577529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576682 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58tqr\" (UniqueName: \"kubernetes.io/projected/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-kube-api-access-58tqr\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w"
Apr 17 16:52:31.577529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576717 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-hostroot\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.577529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576717 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-system-cni-dir\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz"
Apr 17 16:52:31.577529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576761 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdhzj\" (UniqueName: \"kubernetes.io/projected/330a6d22-074b-44ae-8b4f-50637cd24561-kube-api-access-vdhzj\") pod \"iptables-alerter-vcl9m\" (UID: \"330a6d22-074b-44ae-8b4f-50637cd24561\") " pod="openshift-network-operator/iptables-alerter-vcl9m"
Apr 17 16:52:31.577529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576770 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-run-k8s-cni-cncf-io\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.577529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-systemd\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.578036 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576819 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3b961c80-2452-4eb9-ba7f-bd5743b6253f-agent-certs\") pod \"konnectivity-agent-pckwb\" (UID: \"3b961c80-2452-4eb9-ba7f-bd5743b6253f\") " pod="kube-system/konnectivity-agent-pckwb"
Apr 17 16:52:31.578036 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576844 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-run-multus-certs\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.578036 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-systemd\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.578036 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576870 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-run-openvswitch\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.578036 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576894 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/842dbf17-4840-4948-b464-a2890415d77a-host-run-multus-certs\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.578036 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576897 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/330a6d22-074b-44ae-8b4f-50637cd24561-host-slash\") pod \"iptables-alerter-vcl9m\" (UID: \"330a6d22-074b-44ae-8b4f-50637cd24561\") " pod="openshift-network-operator/iptables-alerter-vcl9m"
Apr 17 16:52:31.578036 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576937 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1d2fa510-3384-4880-a957-6404588186c2-tmp\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.578036 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.576970 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qq7ng\" (UniqueName: \"kubernetes.io/projected/8e1eaec4-d649-4750-ab80-f763a4edde6a-kube-api-access-qq7ng\") pod \"node-resolver-mbbjn\" (UID: \"8e1eaec4-d649-4750-ab80-f763a4edde6a\") " pod="openshift-dns/node-resolver-mbbjn"
Apr 17 16:52:31.578036 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.577002 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g9hn\" (UniqueName: \"kubernetes.io/projected/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-kube-api-access-8g9hn\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz"
Apr 17 16:52:31.578036 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.577032 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-sysctl-conf\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.578036 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.577057 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-var-lib-kubelet\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.578036 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.577087 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-cni-binary-copy\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz"
Apr 17 16:52:31.578036 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.577175 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-etc-sysctl-conf\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.578036 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.577190 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d2fa510-3384-4880-a957-6404588186c2-var-lib-kubelet\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.578894 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.578876 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1d2fa510-3384-4880-a957-6404588186c2-etc-tuned\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.579316 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.579301 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1d2fa510-3384-4880-a957-6404588186c2-tmp\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.579645 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.579628 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3b961c80-2452-4eb9-ba7f-bd5743b6253f-agent-certs\") pod \"konnectivity-agent-pckwb\" (UID: \"3b961c80-2452-4eb9-ba7f-bd5743b6253f\") " pod="kube-system/konnectivity-agent-pckwb"
Apr 17 16:52:31.583666 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.583635 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xg76\" (UniqueName: \"kubernetes.io/projected/1d2fa510-3384-4880-a957-6404588186c2-kube-api-access-9xg76\") pod \"tuned-g4mwl\" (UID: \"1d2fa510-3384-4880-a957-6404588186c2\") " pod="openshift-cluster-node-tuning-operator/tuned-g4mwl"
Apr 17 16:52:31.584403 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.584380 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58tqr\" (UniqueName: \"kubernetes.io/projected/f3ac56b0-5d8a-47e5-a66b-12a3994a9837-kube-api-access-58tqr\") pod \"aws-ebs-csi-driver-node-shr6w\" (UID: \"f3ac56b0-5d8a-47e5-a66b-12a3994a9837\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w"
Apr 17 16:52:31.584753 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.584731 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpfs6\" (UniqueName: \"kubernetes.io/projected/842dbf17-4840-4948-b464-a2890415d77a-kube-api-access-xpfs6\") pod \"multus-4wnr7\" (UID: \"842dbf17-4840-4948-b464-a2890415d77a\") " pod="openshift-multus/multus-4wnr7"
Apr 17 16:52:31.585098 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.585079 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq7ng\" (UniqueName: \"kubernetes.io/projected/8e1eaec4-d649-4750-ab80-f763a4edde6a-kube-api-access-qq7ng\") pod \"node-resolver-mbbjn\" (UID: \"8e1eaec4-d649-4750-ab80-f763a4edde6a\") " pod="openshift-dns/node-resolver-mbbjn"
Apr 17 16:52:31.589141 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.589100 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal" event={"ID":"1805319a9755b3230cda0406644f8c58","Type":"ContainerStarted","Data":"8bd41d95f9274fae14faad0e23dc7b2601335b97891369f32d69ae728baa5142"}
Apr 17 16:52:31.589935 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.589918 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-199.ec2.internal" event={"ID":"12719cc175836885556f66e7ce6f8019","Type":"ContainerStarted","Data":"d2ff513ca520fb60f31ee6be6e43d58fcc0b1aef7c16f2645886b6c2f1cc02aa"}
Apr 17 16:52:31.677550 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677519 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz"
Apr 17 16:52:31.677704 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-cni-bin\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.677704 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677603 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-ovnkube-config\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.677704 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677664 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-cni-bin\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.677878 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677716 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-ovn-node-metrics-cert\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.677878 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677749 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz"
Apr 17 16:52:31.677878 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677775 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-slash\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.677878 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677790 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-etc-openvswitch\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.677878 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-systemd-units\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.677878 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-ovnkube-script-lib\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.677878 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677865 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gb8z9\" (UniqueName: \"kubernetes.io/projected/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-kube-api-access-gb8z9\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.677878 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677873 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-slash\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.678243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.678243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtjbr\" (UniqueName: \"kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr\") pod \"network-check-target-fhjz7\" (UID: \"134235c0-0964-4070-b83c-8e7e912a6f98\") " pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:52:31.678243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmbkg\" (UniqueName: \"kubernetes.io/projected/6e370d23-be74-44de-9ef0-318adb824e76-kube-api-access-bmbkg\") pod \"node-ca-7br24\" (UID: \"6e370d23-be74-44de-9ef0-318adb824e76\") " pod="openshift-image-registry/node-ca-7br24"
Apr 17 16:52:31.678243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-node-log\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.678243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677997 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-env-overrides\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.678243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678020 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/330a6d22-074b-44ae-8b4f-50637cd24561-iptables-alerter-script\") pod \"iptables-alerter-vcl9m\" (UID: \"330a6d22-074b-44ae-8b4f-50637cd24561\") " pod="openshift-network-operator/iptables-alerter-vcl9m"
Apr 17 16:52:31.678243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678069 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-log-socket\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.678243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678118 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-log-socket\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.678243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678119 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-systemd-units\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.678243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs\") pod \"network-metrics-daemon-pm56t\" (UID: \"6064206c-e379-4668-9aa8-a2165341d497\") " pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:52:31.678243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678174 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz"
Apr 17 16:52:31.678243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.678243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.677893 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-etc-openvswitch\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.678243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678202 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-ovnkube-config\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.678243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678232 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-777kw\" (UniqueName: \"kubernetes.io/projected/6064206c-e379-4668-9aa8-a2165341d497-kube-api-access-777kw\") pod \"network-metrics-daemon-pm56t\" (UID: \"6064206c-e379-4668-9aa8-a2165341d497\") " pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:52:31.678243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678248 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-node-log\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.678974 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678286 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-var-lib-openvswitch\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.678974 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:31.678317 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:52:31.678974 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678361 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-run-ovn\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:31.678974 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678371 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz"
Apr 17 16:52:31.678974 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:31.678397 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs podName:6064206c-e379-4668-9aa8-a2165341d497 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:32.1783769 +0000 UTC m=+3.087103998 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs") pod "network-metrics-daemon-pm56t" (UID: "6064206c-e379-4668-9aa8-a2165341d497") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:31.678974 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678320 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-run-ovn\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.678974 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678405 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-ovnkube-script-lib\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.678974 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678490 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-var-lib-openvswitch\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.678974 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-run-netns\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.678974 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678582 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-run-ovn-kubernetes\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.678974 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678630 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-cnibin\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz" Apr 17 16:52:31.678974 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678635 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-run-netns\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.678974 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678676 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-cnibin\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz" Apr 17 16:52:31.678974 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678669 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-kubelet\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.678974 ip-10-0-132-199 
kubenswrapper[2575]: I0417 16:52:31.678732 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-cni-netd\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.678974 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678735 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-run-ovn-kubernetes\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.678974 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678787 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-kubelet\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.679750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678786 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-env-overrides\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.679750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-system-cni-dir\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz" Apr 17 
16:52:31.679750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678849 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdhzj\" (UniqueName: \"kubernetes.io/projected/330a6d22-074b-44ae-8b4f-50637cd24561-kube-api-access-vdhzj\") pod \"iptables-alerter-vcl9m\" (UID: \"330a6d22-074b-44ae-8b4f-50637cd24561\") " pod="openshift-network-operator/iptables-alerter-vcl9m" Apr 17 16:52:31.679750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678855 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-host-cni-netd\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.679750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678852 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-system-cni-dir\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz" Apr 17 16:52:31.679750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678887 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-run-openvswitch\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.679750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/330a6d22-074b-44ae-8b4f-50637cd24561-iptables-alerter-script\") pod \"iptables-alerter-vcl9m\" (UID: 
\"330a6d22-074b-44ae-8b4f-50637cd24561\") " pod="openshift-network-operator/iptables-alerter-vcl9m" Apr 17 16:52:31.679750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/330a6d22-074b-44ae-8b4f-50637cd24561-host-slash\") pod \"iptables-alerter-vcl9m\" (UID: \"330a6d22-074b-44ae-8b4f-50637cd24561\") " pod="openshift-network-operator/iptables-alerter-vcl9m" Apr 17 16:52:31.679750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8g9hn\" (UniqueName: \"kubernetes.io/projected/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-kube-api-access-8g9hn\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz" Apr 17 16:52:31.679750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678947 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-run-openvswitch\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.679750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678968 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/330a6d22-074b-44ae-8b4f-50637cd24561-host-slash\") pod \"iptables-alerter-vcl9m\" (UID: \"330a6d22-074b-44ae-8b4f-50637cd24561\") " pod="openshift-network-operator/iptables-alerter-vcl9m" Apr 17 16:52:31.679750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678972 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-cni-binary-copy\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz" Apr 17 16:52:31.679750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.678998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-tuning-conf-dir\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz" Apr 17 16:52:31.679750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.679022 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e370d23-be74-44de-9ef0-318adb824e76-host\") pod \"node-ca-7br24\" (UID: \"6e370d23-be74-44de-9ef0-318adb824e76\") " pod="openshift-image-registry/node-ca-7br24" Apr 17 16:52:31.679750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.679048 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-run-systemd\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.679750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.679106 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6e370d23-be74-44de-9ef0-318adb824e76-serviceca\") pod \"node-ca-7br24\" (UID: \"6e370d23-be74-44de-9ef0-318adb824e76\") " pod="openshift-image-registry/node-ca-7br24" Apr 17 16:52:31.679750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.679140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-os-release\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz" Apr 17 16:52:31.680537 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.679205 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-tuning-conf-dir\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz" Apr 17 16:52:31.680537 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.679232 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-os-release\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz" Apr 17 16:52:31.680537 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.679247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-run-systemd\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.680537 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.679285 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e370d23-be74-44de-9ef0-318adb824e76-host\") pod \"node-ca-7br24\" (UID: \"6e370d23-be74-44de-9ef0-318adb824e76\") " pod="openshift-image-registry/node-ca-7br24" Apr 17 16:52:31.680537 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.679420 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-cni-binary-copy\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz" Apr 17 16:52:31.680537 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.679549 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6e370d23-be74-44de-9ef0-318adb824e76-serviceca\") pod \"node-ca-7br24\" (UID: \"6e370d23-be74-44de-9ef0-318adb824e76\") " pod="openshift-image-registry/node-ca-7br24" Apr 17 16:52:31.680772 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.680562 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-ovn-node-metrics-cert\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.692530 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:31.692475 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:52:31.692530 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.692473 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb8z9\" (UniqueName: \"kubernetes.io/projected/b2fc59c3-8d03-4d73-94bf-91312f60a7c5-kube-api-access-gb8z9\") pod \"ovnkube-node-r8dsh\" (UID: \"b2fc59c3-8d03-4d73-94bf-91312f60a7c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:31.692530 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:31.692502 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:52:31.692530 
ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:31.692515 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dtjbr for pod openshift-network-diagnostics/network-check-target-fhjz7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:31.692819 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:31.692580 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr podName:134235c0-0964-4070-b83c-8e7e912a6f98 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:32.192563951 +0000 UTC m=+3.101291029 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dtjbr" (UniqueName: "kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr") pod "network-check-target-fhjz7" (UID: "134235c0-0964-4070-b83c-8e7e912a6f98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:31.693727 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.693624 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g9hn\" (UniqueName: \"kubernetes.io/projected/89263846-e4e9-4dd4-bdb0-dfa9ae5e8995-kube-api-access-8g9hn\") pod \"multus-additional-cni-plugins-78nlz\" (UID: \"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995\") " pod="openshift-multus/multus-additional-cni-plugins-78nlz" Apr 17 16:52:31.694062 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.694042 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-777kw\" (UniqueName: \"kubernetes.io/projected/6064206c-e379-4668-9aa8-a2165341d497-kube-api-access-777kw\") pod \"network-metrics-daemon-pm56t\" (UID: \"6064206c-e379-4668-9aa8-a2165341d497\") " 
pod="openshift-multus/network-metrics-daemon-pm56t" Apr 17 16:52:31.694835 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.694809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdhzj\" (UniqueName: \"kubernetes.io/projected/330a6d22-074b-44ae-8b4f-50637cd24561-kube-api-access-vdhzj\") pod \"iptables-alerter-vcl9m\" (UID: \"330a6d22-074b-44ae-8b4f-50637cd24561\") " pod="openshift-network-operator/iptables-alerter-vcl9m" Apr 17 16:52:31.695211 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.695192 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmbkg\" (UniqueName: \"kubernetes.io/projected/6e370d23-be74-44de-9ef0-318adb824e76-kube-api-access-bmbkg\") pod \"node-ca-7br24\" (UID: \"6e370d23-be74-44de-9ef0-318adb824e76\") " pod="openshift-image-registry/node-ca-7br24" Apr 17 16:52:31.764638 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.764614 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mbbjn" Apr 17 16:52:31.773348 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.773319 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w" Apr 17 16:52:31.782126 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.782101 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-g4mwl" Apr 17 16:52:31.789395 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.789376 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pckwb" Apr 17 16:52:31.795923 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.795901 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4wnr7" Apr 17 16:52:31.803481 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.803463 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-78nlz" Apr 17 16:52:31.812023 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.812005 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vcl9m" Apr 17 16:52:31.818538 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.818520 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7br24" Apr 17 16:52:31.824117 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:31.824101 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:52:32.182208 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:32.182119 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs\") pod \"network-metrics-daemon-pm56t\" (UID: \"6064206c-e379-4668-9aa8-a2165341d497\") " pod="openshift-multus/network-metrics-daemon-pm56t" Apr 17 16:52:32.182348 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:32.182247 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:32.182348 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:32.182308 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs podName:6064206c-e379-4668-9aa8-a2165341d497 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:33.182290804 +0000 UTC m=+4.091017896 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs") pod "network-metrics-daemon-pm56t" (UID: "6064206c-e379-4668-9aa8-a2165341d497") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:32.282622 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:32.282575 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtjbr\" (UniqueName: \"kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr\") pod \"network-check-target-fhjz7\" (UID: \"134235c0-0964-4070-b83c-8e7e912a6f98\") " pod="openshift-network-diagnostics/network-check-target-fhjz7" Apr 17 16:52:32.282803 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:32.282748 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:52:32.282803 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:32.282774 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:52:32.282803 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:32.282785 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dtjbr for pod openshift-network-diagnostics/network-check-target-fhjz7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:32.282919 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:32.282839 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr podName:134235c0-0964-4070-b83c-8e7e912a6f98 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:52:33.282825055 +0000 UTC m=+4.191552126 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-dtjbr" (UniqueName: "kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr") pod "network-check-target-fhjz7" (UID: "134235c0-0964-4070-b83c-8e7e912a6f98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:32.436827 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:32.436611 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod842dbf17_4840_4948_b464_a2890415d77a.slice/crio-3349e2236b33679c81e3875c15194fd020adb0fa8736a1876c0b55319236bf79 WatchSource:0}: Error finding container 3349e2236b33679c81e3875c15194fd020adb0fa8736a1876c0b55319236bf79: Status 404 returned error can't find the container with id 3349e2236b33679c81e3875c15194fd020adb0fa8736a1876c0b55319236bf79 Apr 17 16:52:32.437507 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:32.437481 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e1eaec4_d649_4750_ab80_f763a4edde6a.slice/crio-9b74245dc1501f4232c5690781432bee9d4df98f04607a374e360ff3ac9ad5d5 WatchSource:0}: Error finding container 9b74245dc1501f4232c5690781432bee9d4df98f04607a374e360ff3ac9ad5d5: Status 404 returned error can't find the container with id 9b74245dc1501f4232c5690781432bee9d4df98f04607a374e360ff3ac9ad5d5 Apr 17 16:52:32.438699 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:32.438584 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b961c80_2452_4eb9_ba7f_bd5743b6253f.slice/crio-db48daac6fda4594db517ed0a8ae247f8881d3212b08979aeb3014cbfff1b713 WatchSource:0}: Error finding container 
db48daac6fda4594db517ed0a8ae247f8881d3212b08979aeb3014cbfff1b713: Status 404 returned error can't find the container with id db48daac6fda4594db517ed0a8ae247f8881d3212b08979aeb3014cbfff1b713
Apr 17 16:52:32.439371 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:32.439304 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d2fa510_3384_4880_a957_6404588186c2.slice/crio-fd7687dd71dae52920c6cb91fd34415b6846336c77dace3cc3abce9f7c8f36a5 WatchSource:0}: Error finding container fd7687dd71dae52920c6cb91fd34415b6846336c77dace3cc3abce9f7c8f36a5: Status 404 returned error can't find the container with id fd7687dd71dae52920c6cb91fd34415b6846336c77dace3cc3abce9f7c8f36a5
Apr 17 16:52:32.442142 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:32.442047 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2fc59c3_8d03_4d73_94bf_91312f60a7c5.slice/crio-91fc134c02f638f967e29f080d0810a872210e0590f295e7f959003819b881e9 WatchSource:0}: Error finding container 91fc134c02f638f967e29f080d0810a872210e0590f295e7f959003819b881e9: Status 404 returned error can't find the container with id 91fc134c02f638f967e29f080d0810a872210e0590f295e7f959003819b881e9
Apr 17 16:52:32.444464 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:32.444438 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod330a6d22_074b_44ae_8b4f_50637cd24561.slice/crio-c29bbe391f6b2fe34c30140333a757d11e07d91734a858bc30f637fe5c888183 WatchSource:0}: Error finding container c29bbe391f6b2fe34c30140333a757d11e07d91734a858bc30f637fe5c888183: Status 404 returned error can't find the container with id c29bbe391f6b2fe34c30140333a757d11e07d91734a858bc30f637fe5c888183
Apr 17 16:52:32.445721 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:52:32.445448 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3ac56b0_5d8a_47e5_a66b_12a3994a9837.slice/crio-db07b242ec48c3070e1e33fb1afab530229922ae40e2564ced07e3ce2bb47149 WatchSource:0}: Error finding container db07b242ec48c3070e1e33fb1afab530229922ae40e2564ced07e3ce2bb47149: Status 404 returned error can't find the container with id db07b242ec48c3070e1e33fb1afab530229922ae40e2564ced07e3ce2bb47149
Apr 17 16:52:32.525330 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:32.525299 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:47:30 +0000 UTC" deadline="2027-10-21 19:20:19.137459958 +0000 UTC"
Apr 17 16:52:32.525330 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:32.525328 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13250h27m46.612134619s"
Apr 17 16:52:32.592199 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:32.592161 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vcl9m" event={"ID":"330a6d22-074b-44ae-8b4f-50637cd24561","Type":"ContainerStarted","Data":"c29bbe391f6b2fe34c30140333a757d11e07d91734a858bc30f637fe5c888183"}
Apr 17 16:52:32.593096 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:32.593072 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w" event={"ID":"f3ac56b0-5d8a-47e5-a66b-12a3994a9837","Type":"ContainerStarted","Data":"db07b242ec48c3070e1e33fb1afab530229922ae40e2564ced07e3ce2bb47149"}
Apr 17 16:52:32.594008 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:32.593981 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-g4mwl" event={"ID":"1d2fa510-3384-4880-a957-6404588186c2","Type":"ContainerStarted","Data":"fd7687dd71dae52920c6cb91fd34415b6846336c77dace3cc3abce9f7c8f36a5"}
Apr 17 16:52:32.595054 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:32.595031 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mbbjn" event={"ID":"8e1eaec4-d649-4750-ab80-f763a4edde6a","Type":"ContainerStarted","Data":"9b74245dc1501f4232c5690781432bee9d4df98f04607a374e360ff3ac9ad5d5"}
Apr 17 16:52:32.596444 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:32.596423 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-199.ec2.internal" event={"ID":"12719cc175836885556f66e7ce6f8019","Type":"ContainerStarted","Data":"00c45cd571848946bd68b337d34d890ba956f2dbfb17aba5c171a77a8092993d"}
Apr 17 16:52:32.597393 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:32.597368 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7br24" event={"ID":"6e370d23-be74-44de-9ef0-318adb824e76","Type":"ContainerStarted","Data":"3cd750ead027984f82a0b03cb280e69f7fe0bf42a20cf3339b013e37ffd10728"}
Apr 17 16:52:32.598181 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:32.598162 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-78nlz" event={"ID":"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995","Type":"ContainerStarted","Data":"6fe75dba9586c353d5cf4913acd94c1bc37c3433c91c8d092e154b032b51b031"}
Apr 17 16:52:32.599120 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:32.599079 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" event={"ID":"b2fc59c3-8d03-4d73-94bf-91312f60a7c5","Type":"ContainerStarted","Data":"91fc134c02f638f967e29f080d0810a872210e0590f295e7f959003819b881e9"}
Apr 17 16:52:32.600231 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:32.600192 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pckwb" event={"ID":"3b961c80-2452-4eb9-ba7f-bd5743b6253f","Type":"ContainerStarted","Data":"db48daac6fda4594db517ed0a8ae247f8881d3212b08979aeb3014cbfff1b713"}
Apr 17 16:52:32.601079 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:32.601059 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4wnr7" event={"ID":"842dbf17-4840-4948-b464-a2890415d77a","Type":"ContainerStarted","Data":"3349e2236b33679c81e3875c15194fd020adb0fa8736a1876c0b55319236bf79"}
Apr 17 16:52:32.612344 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:32.612297 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-199.ec2.internal" podStartSLOduration=2.61228323 podStartE2EDuration="2.61228323s" podCreationTimestamp="2026-04-17 16:52:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:52:32.611783299 +0000 UTC m=+3.520510393" watchObservedRunningTime="2026-04-17 16:52:32.61228323 +0000 UTC m=+3.521010324"
Apr 17 16:52:33.193653 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:33.193614 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs\") pod \"network-metrics-daemon-pm56t\" (UID: \"6064206c-e379-4668-9aa8-a2165341d497\") " pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:52:33.193872 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:33.193769 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:52:33.193872 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:33.193833 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs podName:6064206c-e379-4668-9aa8-a2165341d497 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:35.193814203 +0000 UTC m=+6.102541278 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs") pod "network-metrics-daemon-pm56t" (UID: "6064206c-e379-4668-9aa8-a2165341d497") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:52:33.293965 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:33.293938 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtjbr\" (UniqueName: \"kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr\") pod \"network-check-target-fhjz7\" (UID: \"134235c0-0964-4070-b83c-8e7e912a6f98\") " pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:52:33.294132 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:33.294113 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:52:33.294207 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:33.294142 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:52:33.294207 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:33.294155 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dtjbr for pod openshift-network-diagnostics/network-check-target-fhjz7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:52:33.294303 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:33.294206 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr podName:134235c0-0964-4070-b83c-8e7e912a6f98 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:35.294188861 +0000 UTC m=+6.202915945 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dtjbr" (UniqueName: "kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr") pod "network-check-target-fhjz7" (UID: "134235c0-0964-4070-b83c-8e7e912a6f98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:52:33.587418 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:33.587345 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:52:33.587835 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:33.587483 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pm56t" podUID="6064206c-e379-4668-9aa8-a2165341d497"
Apr 17 16:52:33.587835 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:33.587564 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:52:33.587835 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:33.587647 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhjz7" podUID="134235c0-0964-4070-b83c-8e7e912a6f98"
Apr 17 16:52:33.634616 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:33.634504 2575 generic.go:358] "Generic (PLEG): container finished" podID="1805319a9755b3230cda0406644f8c58" containerID="1a5604e7cc69f20091265e7c5424f0fff65f93ae13f62e162b8d0520cd06d72e" exitCode=0
Apr 17 16:52:33.635192 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:33.635162 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal" event={"ID":"1805319a9755b3230cda0406644f8c58","Type":"ContainerDied","Data":"1a5604e7cc69f20091265e7c5424f0fff65f93ae13f62e162b8d0520cd06d72e"}
Apr 17 16:52:34.641659 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:34.641402 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal" event={"ID":"1805319a9755b3230cda0406644f8c58","Type":"ContainerStarted","Data":"cf042be196d430c5f518294ec464552efc07c197bf44f1b10189a4f89e4c4f80"}
Apr 17 16:52:34.659787 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:34.659738 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-199.ec2.internal" podStartSLOduration=4.659722274 podStartE2EDuration="4.659722274s" podCreationTimestamp="2026-04-17 16:52:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:52:34.659064228 +0000 UTC m=+5.567791320" watchObservedRunningTime="2026-04-17 16:52:34.659722274 +0000 UTC m=+5.568449368"
Apr 17 16:52:35.207242 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:35.207097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs\") pod \"network-metrics-daemon-pm56t\" (UID: \"6064206c-e379-4668-9aa8-a2165341d497\") " pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:52:35.207427 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:35.207263 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:52:35.207427 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:35.207341 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs podName:6064206c-e379-4668-9aa8-a2165341d497 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:39.207320564 +0000 UTC m=+10.116047648 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs") pod "network-metrics-daemon-pm56t" (UID: "6064206c-e379-4668-9aa8-a2165341d497") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:52:35.307944 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:35.307891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtjbr\" (UniqueName: \"kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr\") pod \"network-check-target-fhjz7\" (UID: \"134235c0-0964-4070-b83c-8e7e912a6f98\") " pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:52:35.308132 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:35.308115 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:52:35.308197 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:35.308140 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:52:35.308197 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:35.308153 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dtjbr for pod openshift-network-diagnostics/network-check-target-fhjz7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:52:35.308291 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:35.308210 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr podName:134235c0-0964-4070-b83c-8e7e912a6f98 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:39.308193447 +0000 UTC m=+10.216920522 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dtjbr" (UniqueName: "kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr") pod "network-check-target-fhjz7" (UID: "134235c0-0964-4070-b83c-8e7e912a6f98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:52:35.585198 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:35.585131 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:52:35.585341 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:35.585269 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pm56t" podUID="6064206c-e379-4668-9aa8-a2165341d497"
Apr 17 16:52:35.585518 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:35.585496 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:52:35.585648 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:35.585620 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhjz7" podUID="134235c0-0964-4070-b83c-8e7e912a6f98"
Apr 17 16:52:35.858809 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:35.858045 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5pjdb"]
Apr 17 16:52:35.860881 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:35.860858 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:35.861011 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:35.860933 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5pjdb" podUID="897a5f4a-9096-4195-a831-b7c123ac02f5"
Apr 17 16:52:35.912730 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:35.912647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret\") pod \"global-pull-secret-syncer-5pjdb\" (UID: \"897a5f4a-9096-4195-a831-b7c123ac02f5\") " pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:35.912730 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:35.912718 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/897a5f4a-9096-4195-a831-b7c123ac02f5-dbus\") pod \"global-pull-secret-syncer-5pjdb\" (UID: \"897a5f4a-9096-4195-a831-b7c123ac02f5\") " pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:35.912936 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:35.912751 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/897a5f4a-9096-4195-a831-b7c123ac02f5-kubelet-config\") pod \"global-pull-secret-syncer-5pjdb\" (UID: \"897a5f4a-9096-4195-a831-b7c123ac02f5\") " pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:36.013976 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:36.013915 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/897a5f4a-9096-4195-a831-b7c123ac02f5-dbus\") pod \"global-pull-secret-syncer-5pjdb\" (UID: \"897a5f4a-9096-4195-a831-b7c123ac02f5\") " pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:36.013976 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:36.013964 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/897a5f4a-9096-4195-a831-b7c123ac02f5-kubelet-config\") pod \"global-pull-secret-syncer-5pjdb\" (UID: \"897a5f4a-9096-4195-a831-b7c123ac02f5\") " pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:36.014188 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:36.014047 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret\") pod \"global-pull-secret-syncer-5pjdb\" (UID: \"897a5f4a-9096-4195-a831-b7c123ac02f5\") " pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:36.014188 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:36.014094 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/897a5f4a-9096-4195-a831-b7c123ac02f5-kubelet-config\") pod \"global-pull-secret-syncer-5pjdb\" (UID: \"897a5f4a-9096-4195-a831-b7c123ac02f5\") " pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:36.014188 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:36.014137 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/897a5f4a-9096-4195-a831-b7c123ac02f5-dbus\") pod \"global-pull-secret-syncer-5pjdb\" (UID: \"897a5f4a-9096-4195-a831-b7c123ac02f5\") " pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:36.014188 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:36.014197 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:52:36.014188 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:36.014255 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret podName:897a5f4a-9096-4195-a831-b7c123ac02f5 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:36.514238128 +0000 UTC m=+7.422965203 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret") pod "global-pull-secret-syncer-5pjdb" (UID: "897a5f4a-9096-4195-a831-b7c123ac02f5") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:52:36.517447 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:36.517417 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret\") pod \"global-pull-secret-syncer-5pjdb\" (UID: \"897a5f4a-9096-4195-a831-b7c123ac02f5\") " pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:36.517642 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:36.517582 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:52:36.517706 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:36.517671 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret podName:897a5f4a-9096-4195-a831-b7c123ac02f5 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:37.517652444 +0000 UTC m=+8.426379518 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret") pod "global-pull-secret-syncer-5pjdb" (UID: "897a5f4a-9096-4195-a831-b7c123ac02f5") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:52:37.526573 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:37.526509 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret\") pod \"global-pull-secret-syncer-5pjdb\" (UID: \"897a5f4a-9096-4195-a831-b7c123ac02f5\") " pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:37.527192 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:37.526697 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:52:37.527192 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:37.526779 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret podName:897a5f4a-9096-4195-a831-b7c123ac02f5 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:39.52675941 +0000 UTC m=+10.435486484 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret") pod "global-pull-secret-syncer-5pjdb" (UID: "897a5f4a-9096-4195-a831-b7c123ac02f5") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:52:37.586741 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:37.585722 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:52:37.586741 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:37.585853 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pm56t" podUID="6064206c-e379-4668-9aa8-a2165341d497"
Apr 17 16:52:37.586741 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:37.586234 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:52:37.586741 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:37.586318 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhjz7" podUID="134235c0-0964-4070-b83c-8e7e912a6f98"
Apr 17 16:52:37.586741 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:37.586406 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:37.586741 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:37.586487 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5pjdb" podUID="897a5f4a-9096-4195-a831-b7c123ac02f5"
Apr 17 16:52:39.241547 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:39.241470 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs\") pod \"network-metrics-daemon-pm56t\" (UID: \"6064206c-e379-4668-9aa8-a2165341d497\") " pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:52:39.242112 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:39.241630 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:52:39.242112 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:39.241717 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs podName:6064206c-e379-4668-9aa8-a2165341d497 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:47.241699847 +0000 UTC m=+18.150426920 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs") pod "network-metrics-daemon-pm56t" (UID: "6064206c-e379-4668-9aa8-a2165341d497") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:52:39.343275 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:39.342705 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtjbr\" (UniqueName: \"kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr\") pod \"network-check-target-fhjz7\" (UID: \"134235c0-0964-4070-b83c-8e7e912a6f98\") " pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:52:39.343275 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:39.342865 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:52:39.343275 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:39.342887 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:52:39.343275 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:39.342899 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dtjbr for pod openshift-network-diagnostics/network-check-target-fhjz7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:52:39.343275 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:39.342951 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr podName:134235c0-0964-4070-b83c-8e7e912a6f98 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:47.342935039 +0000 UTC m=+18.251662116 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-dtjbr" (UniqueName: "kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr") pod "network-check-target-fhjz7" (UID: "134235c0-0964-4070-b83c-8e7e912a6f98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:52:39.543890 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:39.543757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret\") pod \"global-pull-secret-syncer-5pjdb\" (UID: \"897a5f4a-9096-4195-a831-b7c123ac02f5\") " pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:39.544064 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:39.543938 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:52:39.544064 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:39.544047 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret podName:897a5f4a-9096-4195-a831-b7c123ac02f5 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:43.544028857 +0000 UTC m=+14.452755942 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret") pod "global-pull-secret-syncer-5pjdb" (UID: "897a5f4a-9096-4195-a831-b7c123ac02f5") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:52:39.587039 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:39.586717 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:39.587039 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:39.586828 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:52:39.587039 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:39.586833 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5pjdb" podUID="897a5f4a-9096-4195-a831-b7c123ac02f5"
Apr 17 16:52:39.587039 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:39.586935 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:52:39.587039 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:39.586999 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhjz7" podUID="134235c0-0964-4070-b83c-8e7e912a6f98"
Apr 17 16:52:39.588822 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:39.586975 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pm56t" podUID="6064206c-e379-4668-9aa8-a2165341d497"
Apr 17 16:52:41.585549 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:41.585466 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:52:41.585990 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:41.585572 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhjz7" podUID="134235c0-0964-4070-b83c-8e7e912a6f98"
Apr 17 16:52:41.585990 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:41.585612 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:52:41.585990 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:41.585713 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pm56t" podUID="6064206c-e379-4668-9aa8-a2165341d497"
Apr 17 16:52:41.585990 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:41.585750 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:41.585990 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:41.585816 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5pjdb" podUID="897a5f4a-9096-4195-a831-b7c123ac02f5"
Apr 17 16:52:43.578012 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:43.577971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret\") pod \"global-pull-secret-syncer-5pjdb\" (UID: \"897a5f4a-9096-4195-a831-b7c123ac02f5\") " pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:43.578385 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:43.578106 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:52:43.578385 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:43.578178 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret podName:897a5f4a-9096-4195-a831-b7c123ac02f5 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:51.578158457 +0000 UTC m=+22.486885544 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret") pod "global-pull-secret-syncer-5pjdb" (UID: "897a5f4a-9096-4195-a831-b7c123ac02f5") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:52:43.585827 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:43.585804 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:43.585827 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:43.585815 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:52:43.586008 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:43.585819 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:52:43.586008 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:43.585915 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5pjdb" podUID="897a5f4a-9096-4195-a831-b7c123ac02f5"
Apr 17 16:52:43.586105 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:43.586035 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-pm56t" podUID="6064206c-e379-4668-9aa8-a2165341d497" Apr 17 16:52:43.586199 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:43.586110 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhjz7" podUID="134235c0-0964-4070-b83c-8e7e912a6f98" Apr 17 16:52:45.585304 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:45.585262 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5pjdb" Apr 17 16:52:45.585304 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:45.585288 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t" Apr 17 16:52:45.585822 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:45.585262 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7" Apr 17 16:52:45.585822 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:45.585387 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5pjdb" podUID="897a5f4a-9096-4195-a831-b7c123ac02f5" Apr 17 16:52:45.585822 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:45.585482 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pm56t" podUID="6064206c-e379-4668-9aa8-a2165341d497" Apr 17 16:52:45.585822 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:45.585556 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhjz7" podUID="134235c0-0964-4070-b83c-8e7e912a6f98" Apr 17 16:52:47.303012 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:47.302972 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs\") pod \"network-metrics-daemon-pm56t\" (UID: \"6064206c-e379-4668-9aa8-a2165341d497\") " pod="openshift-multus/network-metrics-daemon-pm56t" Apr 17 16:52:47.303526 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:47.303090 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:47.303526 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:47.303149 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs podName:6064206c-e379-4668-9aa8-a2165341d497 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:53:03.30313347 +0000 UTC m=+34.211860547 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs") pod "network-metrics-daemon-pm56t" (UID: "6064206c-e379-4668-9aa8-a2165341d497") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:47.403839 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:47.403802 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtjbr\" (UniqueName: \"kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr\") pod \"network-check-target-fhjz7\" (UID: \"134235c0-0964-4070-b83c-8e7e912a6f98\") " pod="openshift-network-diagnostics/network-check-target-fhjz7" Apr 17 16:52:47.404021 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:47.403991 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:52:47.404021 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:47.404015 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:52:47.404129 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:47.404030 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dtjbr for pod openshift-network-diagnostics/network-check-target-fhjz7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:47.404129 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:47.404096 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr 
podName:134235c0-0964-4070-b83c-8e7e912a6f98 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:03.404077787 +0000 UTC m=+34.312804861 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dtjbr" (UniqueName: "kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr") pod "network-check-target-fhjz7" (UID: "134235c0-0964-4070-b83c-8e7e912a6f98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:47.585954 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:47.585778 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7" Apr 17 16:52:47.585954 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:47.585778 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5pjdb" Apr 17 16:52:47.586159 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:47.585917 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhjz7" podUID="134235c0-0964-4070-b83c-8e7e912a6f98" Apr 17 16:52:47.586159 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:47.586044 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t" Apr 17 16:52:47.586159 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:47.586039 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5pjdb" podUID="897a5f4a-9096-4195-a831-b7c123ac02f5" Apr 17 16:52:47.586159 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:47.586150 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pm56t" podUID="6064206c-e379-4668-9aa8-a2165341d497" Apr 17 16:52:49.589644 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:49.586924 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t" Apr 17 16:52:49.589644 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:49.587051 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pm56t" podUID="6064206c-e379-4668-9aa8-a2165341d497" Apr 17 16:52:49.589644 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:49.587135 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7" Apr 17 16:52:49.589644 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:49.587201 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhjz7" podUID="134235c0-0964-4070-b83c-8e7e912a6f98" Apr 17 16:52:49.589644 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:49.588303 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5pjdb" Apr 17 16:52:49.589644 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:49.588381 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5pjdb" podUID="897a5f4a-9096-4195-a831-b7c123ac02f5" Apr 17 16:52:49.668630 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:49.668396 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-78nlz" event={"ID":"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995","Type":"ContainerStarted","Data":"5b93dadf23fabde5b3e9fbc416a323398dcf43d1671041c14f4880ff12cac544"} Apr 17 16:52:49.670126 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:49.670099 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" event={"ID":"b2fc59c3-8d03-4d73-94bf-91312f60a7c5","Type":"ContainerStarted","Data":"ec4eb5086a80cc863a5bc1d29a62f0868616d5631e61b455d1a6fea54dda215c"} Apr 17 16:52:49.671859 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:49.671832 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pckwb" event={"ID":"3b961c80-2452-4eb9-ba7f-bd5743b6253f","Type":"ContainerStarted","Data":"b561e37d28aedc5890cc94f92ec4d8133a3d3dbc2dddc625b02cc1cf6129df76"} Apr 17 16:52:49.674152 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:49.674128 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4wnr7" event={"ID":"842dbf17-4840-4948-b464-a2890415d77a","Type":"ContainerStarted","Data":"fa1be3c48147b0d83f7452dcbced4aec9235e19de0515090594566e05c613699"} Apr 17 16:52:49.675614 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:49.675582 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w" event={"ID":"f3ac56b0-5d8a-47e5-a66b-12a3994a9837","Type":"ContainerStarted","Data":"17b16c4b90b7b38baa8f7626902a605b929a466a435cfa0da0af535355807b55"} Apr 17 16:52:49.676626 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:49.676606 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-g4mwl" 
event={"ID":"1d2fa510-3384-4880-a957-6404588186c2","Type":"ContainerStarted","Data":"c063d816dfd744db465acf445efcb2c36ceb17496853989ce040ac72049130cc"} Apr 17 16:52:49.706188 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:49.706151 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-pckwb" podStartSLOduration=8.507312831 podStartE2EDuration="20.706127162s" podCreationTimestamp="2026-04-17 16:52:29 +0000 UTC" firstStartedPulling="2026-04-17 16:52:32.441216684 +0000 UTC m=+3.349943769" lastFinishedPulling="2026-04-17 16:52:44.640031012 +0000 UTC m=+15.548758100" observedRunningTime="2026-04-17 16:52:49.70554698 +0000 UTC m=+20.614274073" watchObservedRunningTime="2026-04-17 16:52:49.706127162 +0000 UTC m=+20.614854255" Apr 17 16:52:49.722248 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:49.722208 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-g4mwl" podStartSLOduration=3.780435831 podStartE2EDuration="20.722196372s" podCreationTimestamp="2026-04-17 16:52:29 +0000 UTC" firstStartedPulling="2026-04-17 16:52:32.441970626 +0000 UTC m=+3.350697698" lastFinishedPulling="2026-04-17 16:52:49.383731168 +0000 UTC m=+20.292458239" observedRunningTime="2026-04-17 16:52:49.721523076 +0000 UTC m=+20.630250171" watchObservedRunningTime="2026-04-17 16:52:49.722196372 +0000 UTC m=+20.630923465" Apr 17 16:52:49.750924 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:49.750888 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4wnr7" podStartSLOduration=3.795137126 podStartE2EDuration="20.750875648s" podCreationTimestamp="2026-04-17 16:52:29 +0000 UTC" firstStartedPulling="2026-04-17 16:52:32.438145403 +0000 UTC m=+3.346872474" lastFinishedPulling="2026-04-17 16:52:49.393883926 +0000 UTC m=+20.302610996" observedRunningTime="2026-04-17 16:52:49.750712095 +0000 UTC m=+20.659439188" 
watchObservedRunningTime="2026-04-17 16:52:49.750875648 +0000 UTC m=+20.659602741" Apr 17 16:52:50.679071 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:50.678867 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7br24" event={"ID":"6e370d23-be74-44de-9ef0-318adb824e76","Type":"ContainerStarted","Data":"5426e533dfc6efbf90208bc716bbc59b44733365a7d95eb302224df913d9aa65"} Apr 17 16:52:50.680256 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:50.680205 2575 generic.go:358] "Generic (PLEG): container finished" podID="89263846-e4e9-4dd4-bdb0-dfa9ae5e8995" containerID="5b93dadf23fabde5b3e9fbc416a323398dcf43d1671041c14f4880ff12cac544" exitCode=0 Apr 17 16:52:50.680329 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:50.680271 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-78nlz" event={"ID":"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995","Type":"ContainerDied","Data":"5b93dadf23fabde5b3e9fbc416a323398dcf43d1671041c14f4880ff12cac544"} Apr 17 16:52:50.682765 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:50.682748 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/ovn-acl-logging/0.log" Apr 17 16:52:50.683074 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:50.683053 2575 generic.go:358] "Generic (PLEG): container finished" podID="b2fc59c3-8d03-4d73-94bf-91312f60a7c5" containerID="1e9ffbd0e3bb6e9eb9872db383c4781611b908f8c0c1217c944d8bd86ca64fe5" exitCode=1 Apr 17 16:52:50.683128 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:50.683117 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" event={"ID":"b2fc59c3-8d03-4d73-94bf-91312f60a7c5","Type":"ContainerStarted","Data":"ebc3a6434bea80f280552245ab9e3b65a50a9074cbb924cc0962b4b015557112"} Apr 17 16:52:50.683194 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:50.683139 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" event={"ID":"b2fc59c3-8d03-4d73-94bf-91312f60a7c5","Type":"ContainerStarted","Data":"aebc0815dd8089a6aa0973928663e985c731cb41f65f1a0d6a9e56a55a82c516"} Apr 17 16:52:50.683194 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:50.683152 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" event={"ID":"b2fc59c3-8d03-4d73-94bf-91312f60a7c5","Type":"ContainerStarted","Data":"2a72fdbcbe948539200d6af0b5975a158e676d664e645ae27bd8260d0ed4eb68"} Apr 17 16:52:50.683194 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:50.683160 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" event={"ID":"b2fc59c3-8d03-4d73-94bf-91312f60a7c5","Type":"ContainerStarted","Data":"2c22b69dc2921f257d05274896b540487a984de831bfb30b5564459a5081184e"} Apr 17 16:52:50.683194 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:50.683168 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" event={"ID":"b2fc59c3-8d03-4d73-94bf-91312f60a7c5","Type":"ContainerDied","Data":"1e9ffbd0e3bb6e9eb9872db383c4781611b908f8c0c1217c944d8bd86ca64fe5"} Apr 17 16:52:50.684401 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:50.684383 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vcl9m" event={"ID":"330a6d22-074b-44ae-8b4f-50637cd24561","Type":"ContainerStarted","Data":"7c3e02b824e4381ee3437fcbe007a7623aba92a89a7ac7340df9f07279cd131f"} Apr 17 16:52:50.685541 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:50.685510 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mbbjn" event={"ID":"8e1eaec4-d649-4750-ab80-f763a4edde6a","Type":"ContainerStarted","Data":"e390d62e7bb88e0657ccd39ffd8f021b5dd279611f6de918dcde4f4e1c93dfb9"} Apr 17 16:52:50.693769 ip-10-0-132-199 kubenswrapper[2575]: 
I0417 16:52:50.693736 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7br24" podStartSLOduration=3.806356924 podStartE2EDuration="20.693726297s" podCreationTimestamp="2026-04-17 16:52:30 +0000 UTC" firstStartedPulling="2026-04-17 16:52:32.448368026 +0000 UTC m=+3.357095111" lastFinishedPulling="2026-04-17 16:52:49.335737413 +0000 UTC m=+20.244464484" observedRunningTime="2026-04-17 16:52:50.693378381 +0000 UTC m=+21.602105473" watchObservedRunningTime="2026-04-17 16:52:50.693726297 +0000 UTC m=+21.602453389" Apr 17 16:52:50.736897 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:50.735955 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mbbjn" podStartSLOduration=4.789939309 podStartE2EDuration="21.735939249s" podCreationTimestamp="2026-04-17 16:52:29 +0000 UTC" firstStartedPulling="2026-04-17 16:52:32.439736358 +0000 UTC m=+3.348463437" lastFinishedPulling="2026-04-17 16:52:49.385736299 +0000 UTC m=+20.294463377" observedRunningTime="2026-04-17 16:52:50.706973869 +0000 UTC m=+21.615700963" watchObservedRunningTime="2026-04-17 16:52:50.735939249 +0000 UTC m=+21.644666347" Apr 17 16:52:50.753514 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:50.753475 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-vcl9m" podStartSLOduration=4.816896221 podStartE2EDuration="21.753464247s" podCreationTimestamp="2026-04-17 16:52:29 +0000 UTC" firstStartedPulling="2026-04-17 16:52:32.447161624 +0000 UTC m=+3.355888701" lastFinishedPulling="2026-04-17 16:52:49.383729652 +0000 UTC m=+20.292456727" observedRunningTime="2026-04-17 16:52:50.753291167 +0000 UTC m=+21.662018261" watchObservedRunningTime="2026-04-17 16:52:50.753464247 +0000 UTC m=+21.662191340" Apr 17 16:52:51.000955 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:51.000922 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="kube-system/konnectivity-agent-pckwb" Apr 17 16:52:51.001801 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:51.001775 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-pckwb" Apr 17 16:52:51.096561 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:51.096536 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 16:52:51.537864 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:51.537651 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T16:52:51.096555523Z","UUID":"3c4caca9-0e42-4628-ae18-d92f70a0e6e7","Handler":null,"Name":"","Endpoint":""} Apr 17 16:52:51.539881 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:51.539852 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 16:52:51.540018 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:51.539889 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 16:52:51.585584 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:51.585555 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7" Apr 17 16:52:51.585765 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:51.585556 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t" Apr 17 16:52:51.585851 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:51.585795 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pm56t" podUID="6064206c-e379-4668-9aa8-a2165341d497" Apr 17 16:52:51.585851 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:51.585677 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhjz7" podUID="134235c0-0964-4070-b83c-8e7e912a6f98" Apr 17 16:52:51.585851 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:51.585832 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5pjdb" Apr 17 16:52:51.585997 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:51.585901 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5pjdb" podUID="897a5f4a-9096-4195-a831-b7c123ac02f5" Apr 17 16:52:51.637489 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:51.637459 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret\") pod \"global-pull-secret-syncer-5pjdb\" (UID: \"897a5f4a-9096-4195-a831-b7c123ac02f5\") " pod="kube-system/global-pull-secret-syncer-5pjdb" Apr 17 16:52:51.637659 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:51.637571 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:52:51.637659 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:51.637634 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret podName:897a5f4a-9096-4195-a831-b7c123ac02f5 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:07.637620774 +0000 UTC m=+38.546347849 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret") pod "global-pull-secret-syncer-5pjdb" (UID: "897a5f4a-9096-4195-a831-b7c123ac02f5") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:52:51.687966 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:51.687934 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w" event={"ID":"f3ac56b0-5d8a-47e5-a66b-12a3994a9837","Type":"ContainerStarted","Data":"dddce28cfd144ba295b308381fcf8589c69114915ebbe007f9683c394be5f8a8"} Apr 17 16:52:51.688340 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:51.688274 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-pckwb" Apr 17 16:52:51.688822 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:51.688804 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-pckwb" Apr 17 16:52:52.692118 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:52.692044 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w" event={"ID":"f3ac56b0-5d8a-47e5-a66b-12a3994a9837","Type":"ContainerStarted","Data":"9ee3285582ff7f680f636bdb27ffac033511b4d2ae2c0732cb1272cabce471ab"} Apr 17 16:52:52.694971 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:52.694950 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/ovn-acl-logging/0.log" Apr 17 16:52:52.695327 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:52.695306 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" 
event={"ID":"b2fc59c3-8d03-4d73-94bf-91312f60a7c5","Type":"ContainerStarted","Data":"955bad69b5a3bce0d0e952b0fa64f532b9fc2b462fe2131b2d34cf3fd13870f5"}
Apr 17 16:52:52.715408 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:52.715350 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shr6w" podStartSLOduration=3.9225098689999998 podStartE2EDuration="23.715330403s" podCreationTimestamp="2026-04-17 16:52:29 +0000 UTC" firstStartedPulling="2026-04-17 16:52:32.448703754 +0000 UTC m=+3.357430840" lastFinishedPulling="2026-04-17 16:52:52.2415243 +0000 UTC m=+23.150251374" observedRunningTime="2026-04-17 16:52:52.711393452 +0000 UTC m=+23.620120560" watchObservedRunningTime="2026-04-17 16:52:52.715330403 +0000 UTC m=+23.624057497"
Apr 17 16:52:53.585408 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:53.585367 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:53.585610 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:53.585425 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:52:53.585610 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:53.585367 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:52:53.585610 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:53.585499 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5pjdb" podUID="897a5f4a-9096-4195-a831-b7c123ac02f5"
Apr 17 16:52:53.585767 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:53.585640 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pm56t" podUID="6064206c-e379-4668-9aa8-a2165341d497"
Apr 17 16:52:53.585767 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:53.585718 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhjz7" podUID="134235c0-0964-4070-b83c-8e7e912a6f98"
Apr 17 16:52:55.585140 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:55.585096 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:52:55.585811 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:55.585149 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:52:55.585811 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:55.585246 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pm56t" podUID="6064206c-e379-4668-9aa8-a2165341d497"
Apr 17 16:52:55.585811 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:55.585290 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhjz7" podUID="134235c0-0964-4070-b83c-8e7e912a6f98"
Apr 17 16:52:55.585811 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:55.585335 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:55.585811 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:55.585439 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5pjdb" podUID="897a5f4a-9096-4195-a831-b7c123ac02f5"
Apr 17 16:52:56.704450 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:56.704252 2575 generic.go:358] "Generic (PLEG): container finished" podID="89263846-e4e9-4dd4-bdb0-dfa9ae5e8995" containerID="d686399b0174110d069c7d68eb782e7c0b77f353f1338aeed799dafa4bc896c5" exitCode=0
Apr 17 16:52:56.704994 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:56.704288 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-78nlz" event={"ID":"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995","Type":"ContainerDied","Data":"d686399b0174110d069c7d68eb782e7c0b77f353f1338aeed799dafa4bc896c5"}
Apr 17 16:52:56.707571 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:56.707554 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/ovn-acl-logging/0.log"
Apr 17 16:52:56.707930 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:56.707910 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" event={"ID":"b2fc59c3-8d03-4d73-94bf-91312f60a7c5","Type":"ContainerStarted","Data":"5ddc3865f8e44d3d3642526c85736372e1801fbc6d9f898582f2a235485340c9"}
Apr 17 16:52:56.708194 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:56.708181 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:56.708194 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:56.708196 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:56.708314 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:56.708204 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:56.708360 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:56.708334 2575 scope.go:117] "RemoveContainer" containerID="1e9ffbd0e3bb6e9eb9872db383c4781611b908f8c0c1217c944d8bd86ca64fe5"
Apr 17 16:52:56.723573 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:56.723551 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:56.724870 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:56.724854 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh"
Apr 17 16:52:57.584987 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:57.584965 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:52:57.585073 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:57.584965 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:57.585110 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:57.585069 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pm56t" podUID="6064206c-e379-4668-9aa8-a2165341d497"
Apr 17 16:52:57.585174 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:57.585155 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5pjdb" podUID="897a5f4a-9096-4195-a831-b7c123ac02f5"
Apr 17 16:52:57.585213 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:57.584974 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:52:57.585254 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:57.585239 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhjz7" podUID="134235c0-0964-4070-b83c-8e7e912a6f98"
Apr 17 16:52:57.663504 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:57.663476 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5pjdb"]
Apr 17 16:52:57.666686 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:57.666656 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fhjz7"]
Apr 17 16:52:57.669566 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:57.669543 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pm56t"]
Apr 17 16:52:57.713351 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:57.713290 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/ovn-acl-logging/0.log"
Apr 17 16:52:57.713783 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:57.713655 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" event={"ID":"b2fc59c3-8d03-4d73-94bf-91312f60a7c5","Type":"ContainerStarted","Data":"6d5fc9e7f70c3e3a90edfe4d3d9d3623da596ef26f21bc6f36a3268b7f7f88bd"}
Apr 17 16:52:57.715416 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:57.715393 2575 generic.go:358] "Generic (PLEG): container finished" podID="89263846-e4e9-4dd4-bdb0-dfa9ae5e8995" containerID="06952f754910727648c75e67c78dfc0d956f04b925ce66ad9115dba916507e4d" exitCode=0
Apr 17 16:52:57.715509 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:57.715451 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:52:57.715509 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:57.715467 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-78nlz" event={"ID":"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995","Type":"ContainerDied","Data":"06952f754910727648c75e67c78dfc0d956f04b925ce66ad9115dba916507e4d"}
Apr 17 16:52:57.715611 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:57.715553 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhjz7" podUID="134235c0-0964-4070-b83c-8e7e912a6f98"
Apr 17 16:52:57.715611 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:57.715581 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:52:57.715736 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:57.715716 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pm56t" podUID="6064206c-e379-4668-9aa8-a2165341d497"
Apr 17 16:52:57.715788 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:57.715752 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:57.715849 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:57.715830 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5pjdb" podUID="897a5f4a-9096-4195-a831-b7c123ac02f5"
Apr 17 16:52:57.744843 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:57.744801 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" podStartSLOduration=10.722709143 podStartE2EDuration="27.744787673s" podCreationTimestamp="2026-04-17 16:52:30 +0000 UTC" firstStartedPulling="2026-04-17 16:52:32.44595777 +0000 UTC m=+3.354684842" lastFinishedPulling="2026-04-17 16:52:49.4680363 +0000 UTC m=+20.376763372" observedRunningTime="2026-04-17 16:52:57.744355355 +0000 UTC m=+28.653082448" watchObservedRunningTime="2026-04-17 16:52:57.744787673 +0000 UTC m=+28.653514766"
Apr 17 16:52:58.719273 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:58.719240 2575 generic.go:358] "Generic (PLEG): container finished" podID="89263846-e4e9-4dd4-bdb0-dfa9ae5e8995" containerID="f6d99bb4bb2757e9c895b7853cd947d89c9941b0c505409de05e10e3b7be8ec1" exitCode=0
Apr 17 16:52:58.719566 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:58.719281 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-78nlz" event={"ID":"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995","Type":"ContainerDied","Data":"f6d99bb4bb2757e9c895b7853cd947d89c9941b0c505409de05e10e3b7be8ec1"}
Apr 17 16:52:59.587101 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:59.586749 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:52:59.587101 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:59.586817 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:52:59.587101 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:59.587063 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pm56t" podUID="6064206c-e379-4668-9aa8-a2165341d497"
Apr 17 16:52:59.587497 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:52:59.586835 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:52:59.587497 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:59.587151 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhjz7" podUID="134235c0-0964-4070-b83c-8e7e912a6f98"
Apr 17 16:52:59.587497 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:52:59.587189 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5pjdb" podUID="897a5f4a-9096-4195-a831-b7c123ac02f5"
Apr 17 16:53:01.585046 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:01.585011 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:53:01.585621 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:01.585021 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:53:01.585621 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:01.585150 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5pjdb" podUID="897a5f4a-9096-4195-a831-b7c123ac02f5"
Apr 17 16:53:01.585621 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:01.585210 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pm56t" podUID="6064206c-e379-4668-9aa8-a2165341d497"
Apr 17 16:53:01.585621 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:01.585245 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:53:01.585621 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:01.585361 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhjz7" podUID="134235c0-0964-4070-b83c-8e7e912a6f98"
Apr 17 16:53:02.447347 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.447273 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-199.ec2.internal" event="NodeReady"
Apr 17 16:53:02.447491 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.447410 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 16:53:02.491236 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.491205 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6f8ff56558-nn2d2"]
Apr 17 16:53:02.495580 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.495557 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"]
Apr 17 16:53:02.495843 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.495816 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.498902 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.498881 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8"]
Apr 17 16:53:02.499060 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.499036 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.502698 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.502648 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-567d8777f9-5xdhc"]
Apr 17 16:53:02.502795 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.502778 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8"
Apr 17 16:53:02.505585 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.505423 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 17 16:53:02.505755 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.505739 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-567d8777f9-5xdhc"
Apr 17 16:53:02.507256 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.507232 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 16:53:02.507342 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:02.507319 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"work-manager-hub-kubeconfig\" is forbidden: User \"system:node:ip-10-0-132-199.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"open-cluster-management-agent-addon\": no relationship found between node 'ip-10-0-132-199.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" type="*v1.Secret"
Apr 17 16:53:02.507879 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.507851 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 17 16:53:02.507879 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.507869 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 17 16:53:02.508024 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.507876 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 17 16:53:02.508577 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.508557 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 16:53:02.508577 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.508569 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 16:53:02.509018 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.508918 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 17 16:53:02.509375 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.509353 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 17 16:53:02.509505 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.509490 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-97wlb\""
Apr 17 16:53:02.509575 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.509535 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 17 16:53:02.513272 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:02.513246 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"managed-serviceaccount-hub-kubeconfig\" is forbidden: User \"system:node:ip-10-0-132-199.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"open-cluster-management-agent-addon\": no relationship found between node 'ip-10-0-132-199.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" type="*v1.Secret"
Apr 17 16:53:02.517637 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:02.517611 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"managed-serviceaccount-dockercfg-d9r9v\" is forbidden: User \"system:node:ip-10-0-132-199.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"open-cluster-management-agent-addon\": no relationship found between node 'ip-10-0-132-199.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-d9r9v\"" type="*v1.Secret"
Apr 17 16:53:02.522089 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.522067 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f8ff56558-nn2d2"]
Apr 17 16:53:02.522222 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.522177 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 16:53:02.527399 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.527377 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hhhcz"]
Apr 17 16:53:02.530751 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.530729 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8"]
Apr 17 16:53:02.530871 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.530853 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:53:02.534206 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.534188 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 16:53:02.534786 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.534755 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8mbmc\""
Apr 17 16:53:02.534974 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.534947 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 16:53:02.551874 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.551847 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"]
Apr 17 16:53:02.553517 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.553496 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-567d8777f9-5xdhc"]
Apr 17 16:53:02.579831 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.579799 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hhhcz"]
Apr 17 16:53:02.611774 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.611741 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-khd9q"]
Apr 17 16:53:02.614895 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.614874 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-khd9q"
Apr 17 16:53:02.617688 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.617657 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 16:53:02.618484 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.618350 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 16:53:02.618484 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.618366 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 16:53:02.618484 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.618360 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bkp6b\""
Apr 17 16:53:02.623078 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623025 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5c5f1815-93bf-45d0-9519-bd8768ff8182-tmp\") pod \"klusterlet-addon-workmgr-56676d479c-2gzc8\" (UID: \"5c5f1815-93bf-45d0-9519-bd8768ff8182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8"
Apr 17 16:53:02.623078 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623061 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b97bfbb7-1c20-4c13-a110-b6ef178cf124-trusted-ca\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.623235 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623083 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/5c5f1815-93bf-45d0-9519-bd8768ff8182-klusterlet-config\") pod \"klusterlet-addon-workmgr-56676d479c-2gzc8\" (UID: \"5c5f1815-93bf-45d0-9519-bd8768ff8182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8"
Apr 17 16:53:02.623235 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f0e1678f-0432-409a-8f70-dde3c3bb6e48-tmp-dir\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:53:02.623235 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623169 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b97bfbb7-1c20-4c13-a110-b6ef178cf124-image-registry-private-configuration\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.623235 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623226 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/00acb16e-9010-42aa-9571-0bd0faf78721-hub\") pod \"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.623463 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623254 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/00acb16e-9010-42aa-9571-0bd0faf78721-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.623463 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623312 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt4jk\" (UniqueName: \"kubernetes.io/projected/f0e1678f-0432-409a-8f70-dde3c3bb6e48-kube-api-access-vt4jk\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:53:02.623463 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623353 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b97bfbb7-1c20-4c13-a110-b6ef178cf124-installation-pull-secrets\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.623463 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623390 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/838b96cd-3475-4982-8c03-e9fe3ad0b069-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-567d8777f9-5xdhc\" (UID: \"838b96cd-3475-4982-8c03-e9fe3ad0b069\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-567d8777f9-5xdhc"
Apr 17 16:53:02.623463 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623416 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b97bfbb7-1c20-4c13-a110-b6ef178cf124-ca-trust-extracted\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.623463 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623441 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs2ln\" (UniqueName: \"kubernetes.io/projected/5c5f1815-93bf-45d0-9519-bd8768ff8182-kube-api-access-fs2ln\") pod \"klusterlet-addon-workmgr-56676d479c-2gzc8\" (UID: \"5c5f1815-93bf-45d0-9519-bd8768ff8182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8"
Apr 17 16:53:02.623813 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623487 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-certificates\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.623813 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623534 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-bound-sa-token\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.623813 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623649 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0e1678f-0432-409a-8f70-dde3c3bb6e48-config-volume\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:53:02.623813 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623690 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:53:02.623813 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623720 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/00acb16e-9010-42aa-9571-0bd0faf78721-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.623813 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623764 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/00acb16e-9010-42aa-9571-0bd0faf78721-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.623813 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623795 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/00acb16e-9010-42aa-9571-0bd0faf78721-ca\") pod \"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.624147 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvb9h\" (UniqueName: \"kubernetes.io/projected/00acb16e-9010-42aa-9571-0bd0faf78721-kube-api-access-rvb9h\") pod \"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.624147 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623868 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2g76\" (UniqueName: \"kubernetes.io/projected/838b96cd-3475-4982-8c03-e9fe3ad0b069-kube-api-access-w2g76\") pod \"managed-serviceaccount-addon-agent-567d8777f9-5xdhc\" (UID: \"838b96cd-3475-4982-8c03-e9fe3ad0b069\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-567d8777f9-5xdhc"
Apr 17 16:53:02.624147 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6b4d\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-kube-api-access-x6b4d\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.624147 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.623931 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.624955 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.624935 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-khd9q"]
Apr 17 16:53:02.724520 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724435 2575 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5c5f1815-93bf-45d0-9519-bd8768ff8182-tmp\") pod \"klusterlet-addon-workmgr-56676d479c-2gzc8\" (UID: \"5c5f1815-93bf-45d0-9519-bd8768ff8182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8"
Apr 17 16:53:02.724520 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724475 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b97bfbb7-1c20-4c13-a110-b6ef178cf124-trusted-ca\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.724520 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724499 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/5c5f1815-93bf-45d0-9519-bd8768ff8182-klusterlet-config\") pod \"klusterlet-addon-workmgr-56676d479c-2gzc8\" (UID: \"5c5f1815-93bf-45d0-9519-bd8768ff8182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8"
Apr 17 16:53:02.724810 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724528 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f0e1678f-0432-409a-8f70-dde3c3bb6e48-tmp-dir\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:53:02.724810 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724551 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b97bfbb7-1c20-4c13-a110-b6ef178cf124-image-registry-private-configuration\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.724810 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724576 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/00acb16e-9010-42aa-9571-0bd0faf78721-hub\") pod \"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.724810 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724621 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/00acb16e-9010-42aa-9571-0bd0faf78721-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.724810 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724654 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjl5g\" (UniqueName: \"kubernetes.io/projected/e1156e13-0a5f-456f-9c4b-7491034e6fa0-kube-api-access-jjl5g\") pod \"ingress-canary-khd9q\" (UID: \"e1156e13-0a5f-456f-9c4b-7491034e6fa0\") " pod="openshift-ingress-canary/ingress-canary-khd9q"
Apr 17 16:53:02.724810 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724686 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt4jk\" (UniqueName: \"kubernetes.io/projected/f0e1678f-0432-409a-8f70-dde3c3bb6e48-kube-api-access-vt4jk\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:53:02.724810 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724719 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b97bfbb7-1c20-4c13-a110-b6ef178cf124-installation-pull-secrets\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.724810 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724752 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/838b96cd-3475-4982-8c03-e9fe3ad0b069-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-567d8777f9-5xdhc\" (UID: \"838b96cd-3475-4982-8c03-e9fe3ad0b069\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-567d8777f9-5xdhc"
Apr 17 16:53:02.724810 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b97bfbb7-1c20-4c13-a110-b6ef178cf124-ca-trust-extracted\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.724810 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fs2ln\" (UniqueName: \"kubernetes.io/projected/5c5f1815-93bf-45d0-9519-bd8768ff8182-kube-api-access-fs2ln\") pod \"klusterlet-addon-workmgr-56676d479c-2gzc8\" (UID: \"5c5f1815-93bf-45d0-9519-bd8768ff8182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8"
Apr 17 16:53:02.725283 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-certificates\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.725283 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-bound-sa-token\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.725283 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724919 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0e1678f-0432-409a-8f70-dde3c3bb6e48-config-volume\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:53:02.725283 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724954 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f0e1678f-0432-409a-8f70-dde3c3bb6e48-tmp-dir\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:53:02.725283 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:53:02.725283 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.725025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/00acb16e-9010-42aa-9571-0bd0faf78721-hub-kubeconfig\") pod
\"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.725283 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:02.725042 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:53:02.725283 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.725059 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/00acb16e-9010-42aa-9571-0bd0faf78721-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.725283 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:02.725096 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls podName:f0e1678f-0432-409a-8f70-dde3c3bb6e48 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:03.225079506 +0000 UTC m=+34.133806596 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls") pod "dns-default-hhhcz" (UID: "f0e1678f-0432-409a-8f70-dde3c3bb6e48") : secret "dns-default-metrics-tls" not found
Apr 17 16:53:02.725283 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.725116 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/00acb16e-9010-42aa-9571-0bd0faf78721-ca\") pod \"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.725283 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.725144 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvb9h\" (UniqueName: \"kubernetes.io/projected/00acb16e-9010-42aa-9571-0bd0faf78721-kube-api-access-rvb9h\") pod \"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.725283 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.725173 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2g76\" (UniqueName: \"kubernetes.io/projected/838b96cd-3475-4982-8c03-e9fe3ad0b069-kube-api-access-w2g76\") pod \"managed-serviceaccount-addon-agent-567d8777f9-5xdhc\" (UID: \"838b96cd-3475-4982-8c03-e9fe3ad0b069\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-567d8777f9-5xdhc"
Apr 17 16:53:02.725283 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.725197 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6b4d\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-kube-api-access-x6b4d\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.725283 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.725228 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.725283 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.725261 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert\") pod \"ingress-canary-khd9q\" (UID: \"e1156e13-0a5f-456f-9c4b-7491034e6fa0\") " pod="openshift-ingress-canary/ingress-canary-khd9q"
Apr 17 16:53:02.726018 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.725432 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b97bfbb7-1c20-4c13-a110-b6ef178cf124-trusted-ca\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.726018 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.724983 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5c5f1815-93bf-45d0-9519-bd8768ff8182-tmp\") pod \"klusterlet-addon-workmgr-56676d479c-2gzc8\" (UID: \"5c5f1815-93bf-45d0-9519-bd8768ff8182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8"
Apr 17 16:53:02.726018 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.725845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-certificates\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.726803 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:02.726577 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 16:53:02.726803 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:02.726619 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f8ff56558-nn2d2: secret "image-registry-tls" not found
Apr 17 16:53:02.726803 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:02.726662 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls podName:b97bfbb7-1c20-4c13-a110-b6ef178cf124 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:03.226647735 +0000 UTC m=+34.135374810 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls") pod "image-registry-6f8ff56558-nn2d2" (UID: "b97bfbb7-1c20-4c13-a110-b6ef178cf124") : secret "image-registry-tls" not found
Apr 17 16:53:02.727031 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.726914 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/00acb16e-9010-42aa-9571-0bd0faf78721-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.727031 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.726999 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b97bfbb7-1c20-4c13-a110-b6ef178cf124-ca-trust-extracted\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.727242 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.727220 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0e1678f-0432-409a-8f70-dde3c3bb6e48-config-volume\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:53:02.730155 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.730129 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/00acb16e-9010-42aa-9571-0bd0faf78721-ca\") pod \"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.730155 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.730142 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/00acb16e-9010-42aa-9571-0bd0faf78721-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.730291 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.730143 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b97bfbb7-1c20-4c13-a110-b6ef178cf124-image-registry-private-configuration\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.730326 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.730282 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/00acb16e-9010-42aa-9571-0bd0faf78721-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.730326 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.730289 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b97bfbb7-1c20-4c13-a110-b6ef178cf124-installation-pull-secrets\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.730769 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.730753 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/00acb16e-9010-42aa-9571-0bd0faf78721-hub\") pod \"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.736233 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.736196 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt4jk\" (UniqueName: \"kubernetes.io/projected/f0e1678f-0432-409a-8f70-dde3c3bb6e48-kube-api-access-vt4jk\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:53:02.737897 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.737873 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-bound-sa-token\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.738965 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.738938 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs2ln\" (UniqueName: \"kubernetes.io/projected/5c5f1815-93bf-45d0-9519-bd8768ff8182-kube-api-access-fs2ln\") pod \"klusterlet-addon-workmgr-56676d479c-2gzc8\" (UID: \"5c5f1815-93bf-45d0-9519-bd8768ff8182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8"
Apr 17 16:53:02.739427 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.739279 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2g76\" (UniqueName: \"kubernetes.io/projected/838b96cd-3475-4982-8c03-e9fe3ad0b069-kube-api-access-w2g76\") pod \"managed-serviceaccount-addon-agent-567d8777f9-5xdhc\" (UID: \"838b96cd-3475-4982-8c03-e9fe3ad0b069\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-567d8777f9-5xdhc"
Apr 17 16:53:02.739515 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.739472 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6b4d\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-kube-api-access-x6b4d\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:02.740535 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.740513 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvb9h\" (UniqueName: \"kubernetes.io/projected/00acb16e-9010-42aa-9571-0bd0faf78721-kube-api-access-rvb9h\") pod \"cluster-proxy-proxy-agent-78686898c4-qh2t8\" (UID: \"00acb16e-9010-42aa-9571-0bd0faf78721\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.826116 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.826076 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert\") pod \"ingress-canary-khd9q\" (UID: \"e1156e13-0a5f-456f-9c4b-7491034e6fa0\") " pod="openshift-ingress-canary/ingress-canary-khd9q"
Apr 17 16:53:02.826295 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.826144 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjl5g\" (UniqueName: \"kubernetes.io/projected/e1156e13-0a5f-456f-9c4b-7491034e6fa0-kube-api-access-jjl5g\") pod \"ingress-canary-khd9q\" (UID: \"e1156e13-0a5f-456f-9c4b-7491034e6fa0\") " pod="openshift-ingress-canary/ingress-canary-khd9q"
Apr 17 16:53:02.826295 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:02.826234 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert"
not found
Apr 17 16:53:02.826407 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:02.826302 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert podName:e1156e13-0a5f-456f-9c4b-7491034e6fa0 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:03.326280821 +0000 UTC m=+34.235007908 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert") pod "ingress-canary-khd9q" (UID: "e1156e13-0a5f-456f-9c4b-7491034e6fa0") : secret "canary-serving-cert" not found
Apr 17 16:53:02.829190 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.829164 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"
Apr 17 16:53:02.836952 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.836930 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjl5g\" (UniqueName: \"kubernetes.io/projected/e1156e13-0a5f-456f-9c4b-7491034e6fa0-kube-api-access-jjl5g\") pod \"ingress-canary-khd9q\" (UID: \"e1156e13-0a5f-456f-9c4b-7491034e6fa0\") " pod="openshift-ingress-canary/ingress-canary-khd9q"
Apr 17 16:53:02.966026 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:02.965829 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8"]
Apr 17 16:53:03.231455 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:03.231417 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:53:03.231665 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:03.231552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:53:03.231665 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:03.231605 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 16:53:03.231665 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:03.231625 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f8ff56558-nn2d2: secret "image-registry-tls" not found
Apr 17 16:53:03.231828 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:03.231677 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:53:03.231828 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:03.231689 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls podName:b97bfbb7-1c20-4c13-a110-b6ef178cf124 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:04.231667805 +0000 UTC m=+35.140394882 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls") pod "image-registry-6f8ff56558-nn2d2" (UID: "b97bfbb7-1c20-4c13-a110-b6ef178cf124") : secret "image-registry-tls" not found
Apr 17 16:53:03.231828 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:03.231728 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls podName:f0e1678f-0432-409a-8f70-dde3c3bb6e48 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:04.231711923 +0000 UTC m=+35.140438998 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls") pod "dns-default-hhhcz" (UID: "f0e1678f-0432-409a-8f70-dde3c3bb6e48") : secret "dns-default-metrics-tls" not found
Apr 17 16:53:03.331926 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:03.331887 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs\") pod \"network-metrics-daemon-pm56t\" (UID: \"6064206c-e379-4668-9aa8-a2165341d497\") " pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:53:03.332090 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:03.332051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert\") pod \"ingress-canary-khd9q\" (UID: \"e1156e13-0a5f-456f-9c4b-7491034e6fa0\") " pod="openshift-ingress-canary/ingress-canary-khd9q"
Apr 17 16:53:03.332090 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:03.332070 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:53:03.332195 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:03.332141 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs podName:6064206c-e379-4668-9aa8-a2165341d497 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:35.332124141 +0000 UTC m=+66.240851211 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs") pod "network-metrics-daemon-pm56t" (UID: "6064206c-e379-4668-9aa8-a2165341d497") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:53:03.332195 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:03.332174 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:53:03.332284 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:03.332232 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert podName:e1156e13-0a5f-456f-9c4b-7491034e6fa0 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:04.332221509 +0000 UTC m=+35.240948581 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert") pod "ingress-canary-khd9q" (UID: "e1156e13-0a5f-456f-9c4b-7491034e6fa0") : secret "canary-serving-cert" not found
Apr 17 16:53:03.433064 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:03.432999 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtjbr\" (UniqueName: \"kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr\") pod \"network-check-target-fhjz7\" (UID: \"134235c0-0964-4070-b83c-8e7e912a6f98\") " pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:53:03.433246 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:03.433182 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:53:03.433246 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:03.433207 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:53:03.433246 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:03.433220 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dtjbr for pod openshift-network-diagnostics/network-check-target-fhjz7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:53:03.433389 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:03.433285 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr podName:134235c0-0964-4070-b83c-8e7e912a6f98 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:35.433265653 +0000 UTC m=+66.341992731 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-dtjbr" (UniqueName: "kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr") pod "network-check-target-fhjz7" (UID: "134235c0-0964-4070-b83c-8e7e912a6f98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:53:03.585141 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:03.585102 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:53:03.585335 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:03.585103 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:53:03.585335 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:03.585260 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5pjdb"
Apr 17 16:53:03.588175 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:03.588156 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bkwcz\""
Apr 17 16:53:03.588683 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:03.588507 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 16:53:03.588683 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:03.588526 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 16:53:03.588683 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:03.588538 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wm7r2\""
Apr 17 16:53:03.588683 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:03.588543 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 16:53:03.588683 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:03.588543 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 16:53:03.608539 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:03.608520 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 17 16:53:03.621471 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:03.621424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/838b96cd-3475-4982-8c03-e9fe3ad0b069-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-567d8777f9-5xdhc\" (UID: \"838b96cd-3475-4982-8c03-e9fe3ad0b069\") "
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-567d8777f9-5xdhc" Apr 17 16:53:03.724988 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:03.724952 2575 secret.go:189] Couldn't get secret open-cluster-management-agent-addon/work-manager-hub-kubeconfig: failed to sync secret cache: timed out waiting for the condition Apr 17 16:53:03.725151 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:03.725054 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c5f1815-93bf-45d0-9519-bd8768ff8182-klusterlet-config podName:5c5f1815-93bf-45d0-9519-bd8768ff8182 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:04.225036792 +0000 UTC m=+35.133763869 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "klusterlet-config" (UniqueName: "kubernetes.io/secret/5c5f1815-93bf-45d0-9519-bd8768ff8182-klusterlet-config") pod "klusterlet-addon-workmgr-56676d479c-2gzc8" (UID: "5c5f1815-93bf-45d0-9519-bd8768ff8182") : failed to sync secret cache: timed out waiting for the condition Apr 17 16:53:04.044477 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:04.044441 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-d9r9v\"" Apr 17 16:53:04.044693 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:04.044676 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-567d8777f9-5xdhc" Apr 17 16:53:04.100977 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:04.100949 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 16:53:04.240495 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:04.240460 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz" Apr 17 16:53:04.240692 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:04.240522 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2" Apr 17 16:53:04.240692 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:04.240575 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/5c5f1815-93bf-45d0-9519-bd8768ff8182-klusterlet-config\") pod \"klusterlet-addon-workmgr-56676d479c-2gzc8\" (UID: \"5c5f1815-93bf-45d0-9519-bd8768ff8182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8" Apr 17 16:53:04.240692 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:04.240633 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:53:04.240692 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:04.240649 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: 
secret "image-registry-tls" not found Apr 17 16:53:04.240692 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:04.240665 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f8ff56558-nn2d2: secret "image-registry-tls" not found Apr 17 16:53:04.240928 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:04.240711 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls podName:f0e1678f-0432-409a-8f70-dde3c3bb6e48 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:06.240693047 +0000 UTC m=+37.149420118 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls") pod "dns-default-hhhcz" (UID: "f0e1678f-0432-409a-8f70-dde3c3bb6e48") : secret "dns-default-metrics-tls" not found Apr 17 16:53:04.240928 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:04.240730 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls podName:b97bfbb7-1c20-4c13-a110-b6ef178cf124 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:06.240721942 +0000 UTC m=+37.149449014 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls") pod "image-registry-6f8ff56558-nn2d2" (UID: "b97bfbb7-1c20-4c13-a110-b6ef178cf124") : secret "image-registry-tls" not found Apr 17 16:53:04.243315 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:04.243291 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/5c5f1815-93bf-45d0-9519-bd8768ff8182-klusterlet-config\") pod \"klusterlet-addon-workmgr-56676d479c-2gzc8\" (UID: \"5c5f1815-93bf-45d0-9519-bd8768ff8182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8" Apr 17 16:53:04.338247 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:04.338163 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8" Apr 17 16:53:04.341220 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:04.341195 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert\") pod \"ingress-canary-khd9q\" (UID: \"e1156e13-0a5f-456f-9c4b-7491034e6fa0\") " pod="openshift-ingress-canary/ingress-canary-khd9q" Apr 17 16:53:04.341341 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:04.341318 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:53:04.341433 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:04.341382 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert podName:e1156e13-0a5f-456f-9c4b-7491034e6fa0 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:06.341364388 +0000 UTC m=+37.250091472 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert") pod "ingress-canary-khd9q" (UID: "e1156e13-0a5f-456f-9c4b-7491034e6fa0") : secret "canary-serving-cert" not found Apr 17 16:53:05.046979 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:53:05.046941 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00acb16e_9010_42aa_9571_0bd0faf78721.slice/crio-994c505651c8ff2f33a61f76accb65e40d7965bb1cd9b733a5b1ce00fef20b45 WatchSource:0}: Error finding container 994c505651c8ff2f33a61f76accb65e40d7965bb1cd9b733a5b1ce00fef20b45: Status 404 returned error can't find the container with id 994c505651c8ff2f33a61f76accb65e40d7965bb1cd9b733a5b1ce00fef20b45 Apr 17 16:53:05.187929 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:05.187907 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8"] Apr 17 16:53:05.190362 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:53:05.190330 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c5f1815_93bf_45d0_9519_bd8768ff8182.slice/crio-6b549ba728e0e9e4003359ec6eb5ee48317ca269fbd695eea71f1d6ab5a61431 WatchSource:0}: Error finding container 6b549ba728e0e9e4003359ec6eb5ee48317ca269fbd695eea71f1d6ab5a61431: Status 404 returned error can't find the container with id 6b549ba728e0e9e4003359ec6eb5ee48317ca269fbd695eea71f1d6ab5a61431 Apr 17 16:53:05.193261 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:05.193214 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-567d8777f9-5xdhc"] Apr 17 16:53:05.195774 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:53:05.195741 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod838b96cd_3475_4982_8c03_e9fe3ad0b069.slice/crio-fc66eb5dd479eb3b2136f6f022cefc3a4d1c16ee75e4711ae33713b0ae1cf478 WatchSource:0}: Error finding container fc66eb5dd479eb3b2136f6f022cefc3a4d1c16ee75e4711ae33713b0ae1cf478: Status 404 returned error can't find the container with id fc66eb5dd479eb3b2136f6f022cefc3a4d1c16ee75e4711ae33713b0ae1cf478 Apr 17 16:53:05.733548 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:05.733511 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8" event={"ID":"5c5f1815-93bf-45d0-9519-bd8768ff8182","Type":"ContainerStarted","Data":"6b549ba728e0e9e4003359ec6eb5ee48317ca269fbd695eea71f1d6ab5a61431"} Apr 17 16:53:05.734511 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:05.734489 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8" event={"ID":"00acb16e-9010-42aa-9571-0bd0faf78721","Type":"ContainerStarted","Data":"994c505651c8ff2f33a61f76accb65e40d7965bb1cd9b733a5b1ce00fef20b45"} Apr 17 16:53:05.736605 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:05.736566 2575 generic.go:358] "Generic (PLEG): container finished" podID="89263846-e4e9-4dd4-bdb0-dfa9ae5e8995" containerID="e4913800a0b290e14e6ff54be90cf9a00f4388f9e43c3383f49a23bb3971ea97" exitCode=0 Apr 17 16:53:05.736707 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:05.736649 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-78nlz" event={"ID":"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995","Type":"ContainerDied","Data":"e4913800a0b290e14e6ff54be90cf9a00f4388f9e43c3383f49a23bb3971ea97"} Apr 17 16:53:05.737612 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:05.737579 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-567d8777f9-5xdhc" event={"ID":"838b96cd-3475-4982-8c03-e9fe3ad0b069","Type":"ContainerStarted","Data":"fc66eb5dd479eb3b2136f6f022cefc3a4d1c16ee75e4711ae33713b0ae1cf478"} Apr 17 16:53:06.258248 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:06.258161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz" Apr 17 16:53:06.258248 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:06.258221 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2" Apr 17 16:53:06.258814 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:06.258388 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:53:06.258814 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:06.258404 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f8ff56558-nn2d2: secret "image-registry-tls" not found Apr 17 16:53:06.258814 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:06.258458 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls podName:b97bfbb7-1c20-4c13-a110-b6ef178cf124 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:10.258441836 +0000 UTC m=+41.167168922 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls") pod "image-registry-6f8ff56558-nn2d2" (UID: "b97bfbb7-1c20-4c13-a110-b6ef178cf124") : secret "image-registry-tls" not found Apr 17 16:53:06.259010 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:06.258992 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:53:06.259080 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:06.259067 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls podName:f0e1678f-0432-409a-8f70-dde3c3bb6e48 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:10.259028632 +0000 UTC m=+41.167755726 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls") pod "dns-default-hhhcz" (UID: "f0e1678f-0432-409a-8f70-dde3c3bb6e48") : secret "dns-default-metrics-tls" not found Apr 17 16:53:06.359217 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:06.359180 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert\") pod \"ingress-canary-khd9q\" (UID: \"e1156e13-0a5f-456f-9c4b-7491034e6fa0\") " pod="openshift-ingress-canary/ingress-canary-khd9q" Apr 17 16:53:06.359389 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:06.359345 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:53:06.359446 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:06.359406 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert podName:e1156e13-0a5f-456f-9c4b-7491034e6fa0 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:53:10.359387161 +0000 UTC m=+41.268114237 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert") pod "ingress-canary-khd9q" (UID: "e1156e13-0a5f-456f-9c4b-7491034e6fa0") : secret "canary-serving-cert" not found Apr 17 16:53:06.748649 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:06.747834 2575 generic.go:358] "Generic (PLEG): container finished" podID="89263846-e4e9-4dd4-bdb0-dfa9ae5e8995" containerID="b7920ab8ac13e5ecf02b8ea81a6dfc02b3c6952cae6deb72e0d493644121d077" exitCode=0 Apr 17 16:53:06.748649 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:06.747883 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-78nlz" event={"ID":"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995","Type":"ContainerDied","Data":"b7920ab8ac13e5ecf02b8ea81a6dfc02b3c6952cae6deb72e0d493644121d077"} Apr 17 16:53:07.672739 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:07.672640 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret\") pod \"global-pull-secret-syncer-5pjdb\" (UID: \"897a5f4a-9096-4195-a831-b7c123ac02f5\") " pod="kube-system/global-pull-secret-syncer-5pjdb" Apr 17 16:53:07.681516 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:07.681470 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/897a5f4a-9096-4195-a831-b7c123ac02f5-original-pull-secret\") pod \"global-pull-secret-syncer-5pjdb\" (UID: \"897a5f4a-9096-4195-a831-b7c123ac02f5\") " pod="kube-system/global-pull-secret-syncer-5pjdb" Apr 17 16:53:07.753393 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:07.753364 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-78nlz" event={"ID":"89263846-e4e9-4dd4-bdb0-dfa9ae5e8995","Type":"ContainerStarted","Data":"ef8b0923150dd6fb1582112b885ca60729cb053216dfad2751d9635bc4f489eb"} Apr 17 16:53:07.785407 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:07.785356 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-78nlz" podStartSLOduration=5.934910234 podStartE2EDuration="38.785340606s" podCreationTimestamp="2026-04-17 16:52:29 +0000 UTC" firstStartedPulling="2026-04-17 16:52:32.449537196 +0000 UTC m=+3.358264267" lastFinishedPulling="2026-04-17 16:53:05.29996755 +0000 UTC m=+36.208694639" observedRunningTime="2026-04-17 16:53:07.784786104 +0000 UTC m=+38.693513196" watchObservedRunningTime="2026-04-17 16:53:07.785340606 +0000 UTC m=+38.694067701" Apr 17 16:53:07.812088 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:07.812055 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5pjdb" Apr 17 16:53:10.290235 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:10.290198 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz" Apr 17 16:53:10.290913 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:10.290255 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2" Apr 17 16:53:10.290913 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:10.290378 2575 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:53:10.290913 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:10.290433 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:53:10.290913 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:10.290451 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls podName:f0e1678f-0432-409a-8f70-dde3c3bb6e48 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:18.29043477 +0000 UTC m=+49.199161846 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls") pod "dns-default-hhhcz" (UID: "f0e1678f-0432-409a-8f70-dde3c3bb6e48") : secret "dns-default-metrics-tls" not found Apr 17 16:53:10.290913 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:10.290454 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f8ff56558-nn2d2: secret "image-registry-tls" not found Apr 17 16:53:10.290913 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:10.290520 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls podName:b97bfbb7-1c20-4c13-a110-b6ef178cf124 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:18.290502108 +0000 UTC m=+49.199229180 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls") pod "image-registry-6f8ff56558-nn2d2" (UID: "b97bfbb7-1c20-4c13-a110-b6ef178cf124") : secret "image-registry-tls" not found Apr 17 16:53:10.391191 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:10.391163 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert\") pod \"ingress-canary-khd9q\" (UID: \"e1156e13-0a5f-456f-9c4b-7491034e6fa0\") " pod="openshift-ingress-canary/ingress-canary-khd9q" Apr 17 16:53:10.391352 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:10.391272 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:53:10.391352 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:10.391319 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert podName:e1156e13-0a5f-456f-9c4b-7491034e6fa0 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:18.391304213 +0000 UTC m=+49.300031283 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert") pod "ingress-canary-khd9q" (UID: "e1156e13-0a5f-456f-9c4b-7491034e6fa0") : secret "canary-serving-cert" not found Apr 17 16:53:11.489019 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:11.488993 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5pjdb"] Apr 17 16:53:11.492044 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:53:11.492021 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod897a5f4a_9096_4195_a831_b7c123ac02f5.slice/crio-58426b92ff28bfe1fc5f327d27af2ac77ee4729663d9e8991ba5b8dc62e8d395 WatchSource:0}: Error finding container 58426b92ff28bfe1fc5f327d27af2ac77ee4729663d9e8991ba5b8dc62e8d395: Status 404 returned error can't find the container with id 58426b92ff28bfe1fc5f327d27af2ac77ee4729663d9e8991ba5b8dc62e8d395 Apr 17 16:53:11.762199 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:11.762122 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8" event={"ID":"00acb16e-9010-42aa-9571-0bd0faf78721","Type":"ContainerStarted","Data":"704e8a9ee6402a4ec0a14065b3cf2b346130d8cc8c7a428b336951ffb2eff53d"} Apr 17 16:53:11.763411 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:11.763381 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-567d8777f9-5xdhc" event={"ID":"838b96cd-3475-4982-8c03-e9fe3ad0b069","Type":"ContainerStarted","Data":"c4cd6c69987d8cad15720549988156e520e697e4a961d6eb8f5bad954c70a0aa"} Apr 17 16:53:11.764605 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:11.764570 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8" 
event={"ID":"5c5f1815-93bf-45d0-9519-bd8768ff8182","Type":"ContainerStarted","Data":"4f345a40baa9a6bccc92993442d7da16ba1bd7dbce9bd7b76f54da75171fbb27"} Apr 17 16:53:11.764780 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:11.764762 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8" Apr 17 16:53:11.765630 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:11.765611 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5pjdb" event={"ID":"897a5f4a-9096-4195-a831-b7c123ac02f5","Type":"ContainerStarted","Data":"58426b92ff28bfe1fc5f327d27af2ac77ee4729663d9e8991ba5b8dc62e8d395"} Apr 17 16:53:11.766469 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:11.766454 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8" Apr 17 16:53:11.781672 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:11.781633 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-567d8777f9-5xdhc" podStartSLOduration=36.684814151 podStartE2EDuration="42.781620614s" podCreationTimestamp="2026-04-17 16:52:29 +0000 UTC" firstStartedPulling="2026-04-17 16:53:05.278338678 +0000 UTC m=+36.187065749" lastFinishedPulling="2026-04-17 16:53:11.375145127 +0000 UTC m=+42.283872212" observedRunningTime="2026-04-17 16:53:11.779980427 +0000 UTC m=+42.688707522" watchObservedRunningTime="2026-04-17 16:53:11.781620614 +0000 UTC m=+42.690347703" Apr 17 16:53:11.796662 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:11.796588 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-56676d479c-2gzc8" podStartSLOduration=36.599413586 podStartE2EDuration="42.796580051s" 
podCreationTimestamp="2026-04-17 16:52:29 +0000 UTC" firstStartedPulling="2026-04-17 16:53:05.192347313 +0000 UTC m=+36.101074387" lastFinishedPulling="2026-04-17 16:53:11.389513776 +0000 UTC m=+42.298240852" observedRunningTime="2026-04-17 16:53:11.796299852 +0000 UTC m=+42.705026955" watchObservedRunningTime="2026-04-17 16:53:11.796580051 +0000 UTC m=+42.705307145" Apr 17 16:53:14.774805 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:14.774770 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8" event={"ID":"00acb16e-9010-42aa-9571-0bd0faf78721","Type":"ContainerStarted","Data":"4c41a126adbdd5b87fe119bd3f4522fc77cea99534ff3f4cdb1abfcc7454518f"} Apr 17 16:53:14.774805 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:14.774810 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8" event={"ID":"00acb16e-9010-42aa-9571-0bd0faf78721","Type":"ContainerStarted","Data":"c8cc10c10e9cb89bf971a45b42d9326380c4c0faab2daec459a641514deb24f3"} Apr 17 16:53:14.795381 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:14.795329 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8" podStartSLOduration=37.011491587 podStartE2EDuration="45.795312942s" podCreationTimestamp="2026-04-17 16:52:29 +0000 UTC" firstStartedPulling="2026-04-17 16:53:05.05061107 +0000 UTC m=+35.959338158" lastFinishedPulling="2026-04-17 16:53:13.834432428 +0000 UTC m=+44.743159513" observedRunningTime="2026-04-17 16:53:14.793395309 +0000 UTC m=+45.702122403" watchObservedRunningTime="2026-04-17 16:53:14.795312942 +0000 UTC m=+45.704040037" Apr 17 16:53:16.780886 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:16.780853 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5pjdb" 
event={"ID":"897a5f4a-9096-4195-a831-b7c123ac02f5","Type":"ContainerStarted","Data":"ecece4c994c180024af2fafbc22afb823c07595f0550d16fecf8d4c428bb166a"} Apr 17 16:53:16.797778 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:16.797731 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5pjdb" podStartSLOduration=37.475629061 podStartE2EDuration="41.797717484s" podCreationTimestamp="2026-04-17 16:52:35 +0000 UTC" firstStartedPulling="2026-04-17 16:53:11.494100564 +0000 UTC m=+42.402827635" lastFinishedPulling="2026-04-17 16:53:15.816188973 +0000 UTC m=+46.724916058" observedRunningTime="2026-04-17 16:53:16.797145809 +0000 UTC m=+47.705872903" watchObservedRunningTime="2026-04-17 16:53:16.797717484 +0000 UTC m=+47.706444576" Apr 17 16:53:18.352282 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:18.352241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz" Apr 17 16:53:18.352701 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:18.352293 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2" Apr 17 16:53:18.352701 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:18.352387 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:53:18.352701 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:18.352435 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret 
"image-registry-tls" not found Apr 17 16:53:18.352701 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:18.352450 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f8ff56558-nn2d2: secret "image-registry-tls" not found Apr 17 16:53:18.352701 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:18.352452 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls podName:f0e1678f-0432-409a-8f70-dde3c3bb6e48 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:34.352437239 +0000 UTC m=+65.261164315 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls") pod "dns-default-hhhcz" (UID: "f0e1678f-0432-409a-8f70-dde3c3bb6e48") : secret "dns-default-metrics-tls" not found Apr 17 16:53:18.352701 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:18.352495 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls podName:b97bfbb7-1c20-4c13-a110-b6ef178cf124 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:34.352481554 +0000 UTC m=+65.261208628 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls") pod "image-registry-6f8ff56558-nn2d2" (UID: "b97bfbb7-1c20-4c13-a110-b6ef178cf124") : secret "image-registry-tls" not found Apr 17 16:53:18.453459 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:18.453422 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert\") pod \"ingress-canary-khd9q\" (UID: \"e1156e13-0a5f-456f-9c4b-7491034e6fa0\") " pod="openshift-ingress-canary/ingress-canary-khd9q" Apr 17 16:53:18.453587 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:18.453556 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:53:18.453649 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:18.453639 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert podName:e1156e13-0a5f-456f-9c4b-7491034e6fa0 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:34.453623344 +0000 UTC m=+65.362350416 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert") pod "ingress-canary-khd9q" (UID: "e1156e13-0a5f-456f-9c4b-7491034e6fa0") : secret "canary-serving-cert" not found Apr 17 16:53:27.171834 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:27.171807 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mbbjn_8e1eaec4-d649-4750-ab80-f763a4edde6a/dns-node-resolver/0.log" Apr 17 16:53:28.173349 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:28.173321 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7br24_6e370d23-be74-44de-9ef0-318adb824e76/node-ca/0.log" Apr 17 16:53:28.729733 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:28.729705 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r8dsh" Apr 17 16:53:34.374210 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:34.374155 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz" Apr 17 16:53:34.374210 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:34.374215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2" Apr 17 16:53:34.374670 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:34.374311 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:53:34.374670 ip-10-0-132-199 
kubenswrapper[2575]: E0417 16:53:34.374310 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:53:34.374670 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:34.374381 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls podName:f0e1678f-0432-409a-8f70-dde3c3bb6e48 nodeName:}" failed. No retries permitted until 2026-04-17 16:54:06.37436584 +0000 UTC m=+97.283092916 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls") pod "dns-default-hhhcz" (UID: "f0e1678f-0432-409a-8f70-dde3c3bb6e48") : secret "dns-default-metrics-tls" not found Apr 17 16:53:34.374670 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:34.374322 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f8ff56558-nn2d2: secret "image-registry-tls" not found Apr 17 16:53:34.374670 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:34.374443 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls podName:b97bfbb7-1c20-4c13-a110-b6ef178cf124 nodeName:}" failed. No retries permitted until 2026-04-17 16:54:06.374430866 +0000 UTC m=+97.283157937 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls") pod "image-registry-6f8ff56558-nn2d2" (UID: "b97bfbb7-1c20-4c13-a110-b6ef178cf124") : secret "image-registry-tls" not found Apr 17 16:53:34.475186 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:34.475133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert\") pod \"ingress-canary-khd9q\" (UID: \"e1156e13-0a5f-456f-9c4b-7491034e6fa0\") " pod="openshift-ingress-canary/ingress-canary-khd9q" Apr 17 16:53:34.475345 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:34.475283 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:53:34.475383 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:34.475350 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert podName:e1156e13-0a5f-456f-9c4b-7491034e6fa0 nodeName:}" failed. No retries permitted until 2026-04-17 16:54:06.47533319 +0000 UTC m=+97.384060267 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert") pod "ingress-canary-khd9q" (UID: "e1156e13-0a5f-456f-9c4b-7491034e6fa0") : secret "canary-serving-cert" not found Apr 17 16:53:35.382990 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:35.382948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs\") pod \"network-metrics-daemon-pm56t\" (UID: \"6064206c-e379-4668-9aa8-a2165341d497\") " pod="openshift-multus/network-metrics-daemon-pm56t" Apr 17 16:53:35.385875 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:35.385855 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:53:35.394194 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:35.394164 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:53:35.394261 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:53:35.394237 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs podName:6064206c-e379-4668-9aa8-a2165341d497 nodeName:}" failed. No retries permitted until 2026-04-17 16:54:39.394220802 +0000 UTC m=+130.302947873 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs") pod "network-metrics-daemon-pm56t" (UID: "6064206c-e379-4668-9aa8-a2165341d497") : secret "metrics-daemon-secret" not found Apr 17 16:53:35.483531 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:35.483496 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtjbr\" (UniqueName: \"kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr\") pod \"network-check-target-fhjz7\" (UID: \"134235c0-0964-4070-b83c-8e7e912a6f98\") " pod="openshift-network-diagnostics/network-check-target-fhjz7" Apr 17 16:53:35.486392 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:35.486374 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:53:35.497346 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:35.497323 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:53:35.508334 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:35.508298 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtjbr\" (UniqueName: \"kubernetes.io/projected/134235c0-0964-4070-b83c-8e7e912a6f98-kube-api-access-dtjbr\") pod \"network-check-target-fhjz7\" (UID: \"134235c0-0964-4070-b83c-8e7e912a6f98\") " pod="openshift-network-diagnostics/network-check-target-fhjz7" Apr 17 16:53:35.708585 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:35.708507 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wm7r2\"" Apr 17 16:53:35.716875 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:35.716845 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhjz7" Apr 17 16:53:35.875153 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:35.875129 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fhjz7"] Apr 17 16:53:35.878372 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:53:35.878345 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod134235c0_0964_4070_b83c_8e7e912a6f98.slice/crio-edef051c75346f3cb62b571ab272c9f4102d025c89ded5afe62cbeabfb2c93c2 WatchSource:0}: Error finding container edef051c75346f3cb62b571ab272c9f4102d025c89ded5afe62cbeabfb2c93c2: Status 404 returned error can't find the container with id edef051c75346f3cb62b571ab272c9f4102d025c89ded5afe62cbeabfb2c93c2 Apr 17 16:53:36.833467 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:36.833423 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fhjz7" event={"ID":"134235c0-0964-4070-b83c-8e7e912a6f98","Type":"ContainerStarted","Data":"edef051c75346f3cb62b571ab272c9f4102d025c89ded5afe62cbeabfb2c93c2"} Apr 17 16:53:39.841750 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:39.841716 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fhjz7" event={"ID":"134235c0-0964-4070-b83c-8e7e912a6f98","Type":"ContainerStarted","Data":"c1d678f7e8d53f3283f2e9f385fce32a0db9136abf70978e83608da1ee331afe"} Apr 17 16:53:39.842150 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:39.841854 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fhjz7" Apr 17 16:53:39.861051 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:39.861002 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-fhjz7" 
podStartSLOduration=67.946214719 podStartE2EDuration="1m10.860982499s" podCreationTimestamp="2026-04-17 16:52:29 +0000 UTC" firstStartedPulling="2026-04-17 16:53:35.880445459 +0000 UTC m=+66.789172529" lastFinishedPulling="2026-04-17 16:53:38.795213238 +0000 UTC m=+69.703940309" observedRunningTime="2026-04-17 16:53:39.859819012 +0000 UTC m=+70.768546108" watchObservedRunningTime="2026-04-17 16:53:39.860982499 +0000 UTC m=+70.769709594" Apr 17 16:53:51.116669 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.116634 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-7c82q"] Apr 17 16:53:51.121743 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.121727 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7c82q" Apr 17 16:53:51.124463 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.124440 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 16:53:51.124570 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.124503 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 16:53:51.125535 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.125519 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 16:53:51.125829 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.125812 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 16:53:51.125879 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.125844 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-whh7q\"" Apr 17 16:53:51.140375 ip-10-0-132-199 kubenswrapper[2575]: I0417 
16:53:51.140353 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7c82q"] Apr 17 16:53:51.206292 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.206252 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0ccf217d-8ec0-486b-806c-87d885bd71dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7c82q\" (UID: \"0ccf217d-8ec0-486b-806c-87d885bd71dd\") " pod="openshift-insights/insights-runtime-extractor-7c82q" Apr 17 16:53:51.206494 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.206351 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0ccf217d-8ec0-486b-806c-87d885bd71dd-crio-socket\") pod \"insights-runtime-extractor-7c82q\" (UID: \"0ccf217d-8ec0-486b-806c-87d885bd71dd\") " pod="openshift-insights/insights-runtime-extractor-7c82q" Apr 17 16:53:51.206494 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.206376 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0ccf217d-8ec0-486b-806c-87d885bd71dd-data-volume\") pod \"insights-runtime-extractor-7c82q\" (UID: \"0ccf217d-8ec0-486b-806c-87d885bd71dd\") " pod="openshift-insights/insights-runtime-extractor-7c82q" Apr 17 16:53:51.206494 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.206470 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6f4b\" (UniqueName: \"kubernetes.io/projected/0ccf217d-8ec0-486b-806c-87d885bd71dd-kube-api-access-j6f4b\") pod \"insights-runtime-extractor-7c82q\" (UID: \"0ccf217d-8ec0-486b-806c-87d885bd71dd\") " pod="openshift-insights/insights-runtime-extractor-7c82q" Apr 17 16:53:51.206699 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.206565 
2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0ccf217d-8ec0-486b-806c-87d885bd71dd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7c82q\" (UID: \"0ccf217d-8ec0-486b-806c-87d885bd71dd\") " pod="openshift-insights/insights-runtime-extractor-7c82q" Apr 17 16:53:51.306939 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.306904 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0ccf217d-8ec0-486b-806c-87d885bd71dd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7c82q\" (UID: \"0ccf217d-8ec0-486b-806c-87d885bd71dd\") " pod="openshift-insights/insights-runtime-extractor-7c82q" Apr 17 16:53:51.307121 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.306960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0ccf217d-8ec0-486b-806c-87d885bd71dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7c82q\" (UID: \"0ccf217d-8ec0-486b-806c-87d885bd71dd\") " pod="openshift-insights/insights-runtime-extractor-7c82q" Apr 17 16:53:51.307121 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.306997 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0ccf217d-8ec0-486b-806c-87d885bd71dd-crio-socket\") pod \"insights-runtime-extractor-7c82q\" (UID: \"0ccf217d-8ec0-486b-806c-87d885bd71dd\") " pod="openshift-insights/insights-runtime-extractor-7c82q" Apr 17 16:53:51.307121 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.307014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0ccf217d-8ec0-486b-806c-87d885bd71dd-data-volume\") pod 
\"insights-runtime-extractor-7c82q\" (UID: \"0ccf217d-8ec0-486b-806c-87d885bd71dd\") " pod="openshift-insights/insights-runtime-extractor-7c82q" Apr 17 16:53:51.307121 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.307036 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6f4b\" (UniqueName: \"kubernetes.io/projected/0ccf217d-8ec0-486b-806c-87d885bd71dd-kube-api-access-j6f4b\") pod \"insights-runtime-extractor-7c82q\" (UID: \"0ccf217d-8ec0-486b-806c-87d885bd71dd\") " pod="openshift-insights/insights-runtime-extractor-7c82q" Apr 17 16:53:51.307250 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.307115 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0ccf217d-8ec0-486b-806c-87d885bd71dd-crio-socket\") pod \"insights-runtime-extractor-7c82q\" (UID: \"0ccf217d-8ec0-486b-806c-87d885bd71dd\") " pod="openshift-insights/insights-runtime-extractor-7c82q" Apr 17 16:53:51.307490 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.307467 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0ccf217d-8ec0-486b-806c-87d885bd71dd-data-volume\") pod \"insights-runtime-extractor-7c82q\" (UID: \"0ccf217d-8ec0-486b-806c-87d885bd71dd\") " pod="openshift-insights/insights-runtime-extractor-7c82q" Apr 17 16:53:51.307668 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.307653 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0ccf217d-8ec0-486b-806c-87d885bd71dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7c82q\" (UID: \"0ccf217d-8ec0-486b-806c-87d885bd71dd\") " pod="openshift-insights/insights-runtime-extractor-7c82q" Apr 17 16:53:51.309321 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.309307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0ccf217d-8ec0-486b-806c-87d885bd71dd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7c82q\" (UID: \"0ccf217d-8ec0-486b-806c-87d885bd71dd\") " pod="openshift-insights/insights-runtime-extractor-7c82q" Apr 17 16:53:51.320153 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.320133 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6f4b\" (UniqueName: \"kubernetes.io/projected/0ccf217d-8ec0-486b-806c-87d885bd71dd-kube-api-access-j6f4b\") pod \"insights-runtime-extractor-7c82q\" (UID: \"0ccf217d-8ec0-486b-806c-87d885bd71dd\") " pod="openshift-insights/insights-runtime-extractor-7c82q" Apr 17 16:53:51.430538 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.430458 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7c82q" Apr 17 16:53:51.544707 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.544673 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7c82q"] Apr 17 16:53:51.548532 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:53:51.548509 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ccf217d_8ec0_486b_806c_87d885bd71dd.slice/crio-85f9a7d5442a6b3e2c4775e7f5718f409bb3afa14f86e43cf36953f43d355f6a WatchSource:0}: Error finding container 85f9a7d5442a6b3e2c4775e7f5718f409bb3afa14f86e43cf36953f43d355f6a: Status 404 returned error can't find the container with id 85f9a7d5442a6b3e2c4775e7f5718f409bb3afa14f86e43cf36953f43d355f6a Apr 17 16:53:51.871760 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.871722 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7c82q" 
event={"ID":"0ccf217d-8ec0-486b-806c-87d885bd71dd","Type":"ContainerStarted","Data":"3f9e77d5586bc7339f763bad89d7f10e03f675aa11c9dad53ff8a29cc14353dc"} Apr 17 16:53:51.871760 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:51.871761 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7c82q" event={"ID":"0ccf217d-8ec0-486b-806c-87d885bd71dd","Type":"ContainerStarted","Data":"85f9a7d5442a6b3e2c4775e7f5718f409bb3afa14f86e43cf36953f43d355f6a"} Apr 17 16:53:52.876444 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:52.876409 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7c82q" event={"ID":"0ccf217d-8ec0-486b-806c-87d885bd71dd","Type":"ContainerStarted","Data":"447735977b885de136bc2d62c56bb6539916112135e538572b94bec2a8a2f3d4"} Apr 17 16:53:53.884204 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:53.884170 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7c82q" event={"ID":"0ccf217d-8ec0-486b-806c-87d885bd71dd","Type":"ContainerStarted","Data":"9aa98e350953ae31be4fded457a25a74b1d897cc9a349e1fe458f374b6e612f4"} Apr 17 16:53:53.902281 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:53:53.902189 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-7c82q" podStartSLOduration=0.871712932 podStartE2EDuration="2.902176976s" podCreationTimestamp="2026-04-17 16:53:51 +0000 UTC" firstStartedPulling="2026-04-17 16:53:51.606116567 +0000 UTC m=+82.514843638" lastFinishedPulling="2026-04-17 16:53:53.636580612 +0000 UTC m=+84.545307682" observedRunningTime="2026-04-17 16:53:53.900780053 +0000 UTC m=+84.809507142" watchObservedRunningTime="2026-04-17 16:53:53.902176976 +0000 UTC m=+84.810904069" Apr 17 16:54:03.446444 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.446411 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-frx24"] Apr 17 16:54:03.450646 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.450623 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-frx24" Apr 17 16:54:03.453509 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.453489 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 16:54:03.454654 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.454630 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 16:54:03.454654 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.454646 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 16:54:03.454840 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.454666 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 16:54:03.454840 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.454672 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 16:54:03.454840 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.454672 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 16:54:03.454840 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.454665 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-z9mwd\"" Apr 17 16:54:03.599267 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.599241 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/f23f7a47-f779-461b-a627-6aa06ced398c-sys\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24" Apr 17 16:54:03.599417 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.599285 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f23f7a47-f779-461b-a627-6aa06ced398c-root\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24" Apr 17 16:54:03.599417 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.599309 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f23f7a47-f779-461b-a627-6aa06ced398c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24" Apr 17 16:54:03.599417 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.599353 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f23f7a47-f779-461b-a627-6aa06ced398c-node-exporter-textfile\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24" Apr 17 16:54:03.599417 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.599377 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f23f7a47-f779-461b-a627-6aa06ced398c-metrics-client-ca\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24" Apr 17 16:54:03.599417 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.599415 
2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f23f7a47-f779-461b-a627-6aa06ced398c-node-exporter-accelerators-collector-config\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.599618 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.599431 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4mkg\" (UniqueName: \"kubernetes.io/projected/f23f7a47-f779-461b-a627-6aa06ced398c-kube-api-access-n4mkg\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.599618 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.599458 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f23f7a47-f779-461b-a627-6aa06ced398c-node-exporter-tls\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.599618 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.599526 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f23f7a47-f779-461b-a627-6aa06ced398c-node-exporter-wtmp\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.700351 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.700285 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f23f7a47-f779-461b-a627-6aa06ced398c-root\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.700351 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.700321 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f23f7a47-f779-461b-a627-6aa06ced398c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.700351 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.700349 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f23f7a47-f779-461b-a627-6aa06ced398c-node-exporter-textfile\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.700560 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.700378 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f23f7a47-f779-461b-a627-6aa06ced398c-metrics-client-ca\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.700560 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.700386 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f23f7a47-f779-461b-a627-6aa06ced398c-root\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.700560 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.700435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f23f7a47-f779-461b-a627-6aa06ced398c-node-exporter-accelerators-collector-config\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.700560 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.700459 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4mkg\" (UniqueName: \"kubernetes.io/projected/f23f7a47-f779-461b-a627-6aa06ced398c-kube-api-access-n4mkg\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.700560 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.700504 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f23f7a47-f779-461b-a627-6aa06ced398c-node-exporter-tls\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.700560 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.700548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f23f7a47-f779-461b-a627-6aa06ced398c-node-exporter-wtmp\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.700996 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.700579 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f23f7a47-f779-461b-a627-6aa06ced398c-sys\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.700996 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.700688 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f23f7a47-f779-461b-a627-6aa06ced398c-sys\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.700996 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.700790 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f23f7a47-f779-461b-a627-6aa06ced398c-node-exporter-textfile\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.700996 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.700837 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f23f7a47-f779-461b-a627-6aa06ced398c-node-exporter-wtmp\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.701195 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.701081 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f23f7a47-f779-461b-a627-6aa06ced398c-metrics-client-ca\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.701195 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.701127 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f23f7a47-f779-461b-a627-6aa06ced398c-node-exporter-accelerators-collector-config\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.702534 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.702514 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f23f7a47-f779-461b-a627-6aa06ced398c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.702834 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.702814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f23f7a47-f779-461b-a627-6aa06ced398c-node-exporter-tls\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.709502 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.709483 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4mkg\" (UniqueName: \"kubernetes.io/projected/f23f7a47-f779-461b-a627-6aa06ced398c-kube-api-access-n4mkg\") pod \"node-exporter-frx24\" (UID: \"f23f7a47-f779-461b-a627-6aa06ced398c\") " pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.759507 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.759489 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-frx24"
Apr 17 16:54:03.767163 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:54:03.767139 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf23f7a47_f779_461b_a627_6aa06ced398c.slice/crio-872663761d06bb65f033d7ff35111d4a661108fedf0e797617acebbc62c23239 WatchSource:0}: Error finding container 872663761d06bb65f033d7ff35111d4a661108fedf0e797617acebbc62c23239: Status 404 returned error can't find the container with id 872663761d06bb65f033d7ff35111d4a661108fedf0e797617acebbc62c23239
Apr 17 16:54:03.909520 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:03.909483 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-frx24" event={"ID":"f23f7a47-f779-461b-a627-6aa06ced398c","Type":"ContainerStarted","Data":"872663761d06bb65f033d7ff35111d4a661108fedf0e797617acebbc62c23239"}
Apr 17 16:54:04.913022 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:04.912936 2575 generic.go:358] "Generic (PLEG): container finished" podID="f23f7a47-f779-461b-a627-6aa06ced398c" containerID="275394775cd264c728df5e5ea14735de900fc2350c001f8534592b1634a0df95" exitCode=0
Apr 17 16:54:04.913022 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:04.912988 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-frx24" event={"ID":"f23f7a47-f779-461b-a627-6aa06ced398c","Type":"ContainerDied","Data":"275394775cd264c728df5e5ea14735de900fc2350c001f8534592b1634a0df95"}
Apr 17 16:54:05.917510 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:05.917440 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-frx24" event={"ID":"f23f7a47-f779-461b-a627-6aa06ced398c","Type":"ContainerStarted","Data":"8a31adab63b21ebbd73767789a7827ab3b3cf977d100fd1e738f452cd6b40341"}
Apr 17 16:54:05.917510 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:05.917479 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-frx24" event={"ID":"f23f7a47-f779-461b-a627-6aa06ced398c","Type":"ContainerStarted","Data":"fba21fbb9f261c381dfff179be9d6802f08b11ce2b975b610c1f2dca32b88b1f"}
Apr 17 16:54:05.944081 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:05.944035 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-frx24" podStartSLOduration=2.118615048 podStartE2EDuration="2.944023951s" podCreationTimestamp="2026-04-17 16:54:03 +0000 UTC" firstStartedPulling="2026-04-17 16:54:03.768895632 +0000 UTC m=+94.677622703" lastFinishedPulling="2026-04-17 16:54:04.594304518 +0000 UTC m=+95.503031606" observedRunningTime="2026-04-17 16:54:05.943918814 +0000 UTC m=+96.852645917" watchObservedRunningTime="2026-04-17 16:54:05.944023951 +0000 UTC m=+96.852751043"
Apr 17 16:54:06.421166 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.421131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:54:06.421325 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.421174 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:54:06.423416 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.423392 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0e1678f-0432-409a-8f70-dde3c3bb6e48-metrics-tls\") pod \"dns-default-hhhcz\" (UID: \"f0e1678f-0432-409a-8f70-dde3c3bb6e48\") " pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:54:06.423567 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.423549 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls\") pod \"image-registry-6f8ff56558-nn2d2\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") " pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:54:06.462959 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.462939 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8mbmc\""
Apr 17 16:54:06.470314 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.470293 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:54:06.521619 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.521570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert\") pod \"ingress-canary-khd9q\" (UID: \"e1156e13-0a5f-456f-9c4b-7491034e6fa0\") " pod="openshift-ingress-canary/ingress-canary-khd9q"
Apr 17 16:54:06.523879 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.523856 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1156e13-0a5f-456f-9c4b-7491034e6fa0-cert\") pod \"ingress-canary-khd9q\" (UID: \"e1156e13-0a5f-456f-9c4b-7491034e6fa0\") " pod="openshift-ingress-canary/ingress-canary-khd9q"
Apr 17 16:54:06.528250 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.527932 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bkp6b\""
Apr 17 16:54:06.535576 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.535457 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-khd9q"
Apr 17 16:54:06.614152 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.614040 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hhhcz"]
Apr 17 16:54:06.618015 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:54:06.617984 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0e1678f_0432_409a_8f70_dde3c3bb6e48.slice/crio-e87ca0dbf569a45eee67c3ae9c84ccee9500f9f728a61325a32dd7c0865fbd4d WatchSource:0}: Error finding container e87ca0dbf569a45eee67c3ae9c84ccee9500f9f728a61325a32dd7c0865fbd4d: Status 404 returned error can't find the container with id e87ca0dbf569a45eee67c3ae9c84ccee9500f9f728a61325a32dd7c0865fbd4d
Apr 17 16:54:06.648868 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.648843 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-khd9q"]
Apr 17 16:54:06.651364 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:54:06.651341 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1156e13_0a5f_456f_9c4b_7491034e6fa0.slice/crio-efc218f20d68054ece3ec6bd7140be6285ebae229fc0751a0619f139c4d4c6cf WatchSource:0}: Error finding container efc218f20d68054ece3ec6bd7140be6285ebae229fc0751a0619f139c4d4c6cf: Status 404 returned error can't find the container with id efc218f20d68054ece3ec6bd7140be6285ebae229fc0751a0619f139c4d4c6cf
Apr 17 16:54:06.713316 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.713241 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-97wlb\""
Apr 17 16:54:06.720864 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.720843 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:54:06.848231 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.848198 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f8ff56558-nn2d2"]
Apr 17 16:54:06.850999 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:54:06.850973 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb97bfbb7_1c20_4c13_a110_b6ef178cf124.slice/crio-8860abf28977aab4845d5ddeef4a2ebf1eb5e3f1a1f3cde529243db3a10ef5e4 WatchSource:0}: Error finding container 8860abf28977aab4845d5ddeef4a2ebf1eb5e3f1a1f3cde529243db3a10ef5e4: Status 404 returned error can't find the container with id 8860abf28977aab4845d5ddeef4a2ebf1eb5e3f1a1f3cde529243db3a10ef5e4
Apr 17 16:54:06.921091 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.921051 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-khd9q" event={"ID":"e1156e13-0a5f-456f-9c4b-7491034e6fa0","Type":"ContainerStarted","Data":"efc218f20d68054ece3ec6bd7140be6285ebae229fc0751a0619f139c4d4c6cf"}
Apr 17 16:54:06.922133 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.922110 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hhhcz" event={"ID":"f0e1678f-0432-409a-8f70-dde3c3bb6e48","Type":"ContainerStarted","Data":"e87ca0dbf569a45eee67c3ae9c84ccee9500f9f728a61325a32dd7c0865fbd4d"}
Apr 17 16:54:06.923363 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.923338 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2" event={"ID":"b97bfbb7-1c20-4c13-a110-b6ef178cf124","Type":"ContainerStarted","Data":"550f617fe542498fae9cdb594f11ffaf588b47b1b841d9137bed6e038defa13d"}
Apr 17 16:54:06.923467 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.923372 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2" event={"ID":"b97bfbb7-1c20-4c13-a110-b6ef178cf124","Type":"ContainerStarted","Data":"8860abf28977aab4845d5ddeef4a2ebf1eb5e3f1a1f3cde529243db3a10ef5e4"}
Apr 17 16:54:06.923467 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.923404 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:54:06.953695 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:06.953651 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2" podStartSLOduration=96.953638477 podStartE2EDuration="1m36.953638477s" podCreationTimestamp="2026-04-17 16:52:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:54:06.953109853 +0000 UTC m=+97.861836947" watchObservedRunningTime="2026-04-17 16:54:06.953638477 +0000 UTC m=+97.862365569"
Apr 17 16:54:08.931273 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:08.931247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-khd9q" event={"ID":"e1156e13-0a5f-456f-9c4b-7491034e6fa0","Type":"ContainerStarted","Data":"3f8ad20515895012a47366688d046a79683477f97c78589f101fef47470fa4c6"}
Apr 17 16:54:08.932715 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:08.932689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hhhcz" event={"ID":"f0e1678f-0432-409a-8f70-dde3c3bb6e48","Type":"ContainerStarted","Data":"5d24692e06176e3cecf25cad4597b2fa932b7656999c80302af27b8c9bde90c5"}
Apr 17 16:54:08.954706 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:08.954664 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-khd9q" podStartSLOduration=64.761306681 podStartE2EDuration="1m6.954648909s" podCreationTimestamp="2026-04-17 16:53:02 +0000 UTC" firstStartedPulling="2026-04-17 16:54:06.653006074 +0000 UTC m=+97.561733146" lastFinishedPulling="2026-04-17 16:54:08.8463483 +0000 UTC m=+99.755075374" observedRunningTime="2026-04-17 16:54:08.953130864 +0000 UTC m=+99.861857958" watchObservedRunningTime="2026-04-17 16:54:08.954648909 +0000 UTC m=+99.863375993"
Apr 17 16:54:09.937089 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:09.937057 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hhhcz" event={"ID":"f0e1678f-0432-409a-8f70-dde3c3bb6e48","Type":"ContainerStarted","Data":"8e78e53443016869db7f2bf25220e20ec8ebb449017ada7c43981a35130629a2"}
Apr 17 16:54:09.937587 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:09.937147 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:54:09.954952 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:09.954911 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hhhcz" podStartSLOduration=66.015485235 podStartE2EDuration="1m7.954901052s" podCreationTimestamp="2026-04-17 16:53:02 +0000 UTC" firstStartedPulling="2026-04-17 16:54:06.619853173 +0000 UTC m=+97.528580245" lastFinishedPulling="2026-04-17 16:54:08.55926898 +0000 UTC m=+99.467996062" observedRunningTime="2026-04-17 16:54:09.95432585 +0000 UTC m=+100.863052948" watchObservedRunningTime="2026-04-17 16:54:09.954901052 +0000 UTC m=+100.863628145"
Apr 17 16:54:10.846787 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:10.846754 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fhjz7"
Apr 17 16:54:13.619096 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:13.619062 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6f8ff56558-nn2d2"]
Apr 17 16:54:19.941187 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:19.941159 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hhhcz"
Apr 17 16:54:22.830608 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:22.830542 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8" podUID="00acb16e-9010-42aa-9571-0bd0faf78721" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 16:54:23.624708 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:23.624680 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:54:32.830613 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:32.830554 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8" podUID="00acb16e-9010-42aa-9571-0bd0faf78721" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 16:54:38.637387 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.637319 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2" podUID="b97bfbb7-1c20-4c13-a110-b6ef178cf124" containerName="registry" containerID="cri-o://550f617fe542498fae9cdb594f11ffaf588b47b1b841d9137bed6e038defa13d" gracePeriod=30
Apr 17 16:54:38.863826 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.863806 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:54:38.938563 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.938497 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b97bfbb7-1c20-4c13-a110-b6ef178cf124-installation-pull-secrets\") pod \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") "
Apr 17 16:54:38.938563 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.938532 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-certificates\") pod \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") "
Apr 17 16:54:38.938563 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.938557 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6b4d\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-kube-api-access-x6b4d\") pod \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") "
Apr 17 16:54:38.938829 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.938575 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls\") pod \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") "
Apr 17 16:54:38.938829 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.938620 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b97bfbb7-1c20-4c13-a110-b6ef178cf124-image-registry-private-configuration\") pod \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") "
Apr 17 16:54:38.938829 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.938655 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b97bfbb7-1c20-4c13-a110-b6ef178cf124-trusted-ca\") pod \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") "
Apr 17 16:54:38.938829 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.938682 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-bound-sa-token\") pod \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") "
Apr 17 16:54:38.938829 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.938764 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b97bfbb7-1c20-4c13-a110-b6ef178cf124-ca-trust-extracted\") pod \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\" (UID: \"b97bfbb7-1c20-4c13-a110-b6ef178cf124\") "
Apr 17 16:54:38.939063 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.939024 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b97bfbb7-1c20-4c13-a110-b6ef178cf124" (UID: "b97bfbb7-1c20-4c13-a110-b6ef178cf124"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:54:38.939508 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.939413 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b97bfbb7-1c20-4c13-a110-b6ef178cf124-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b97bfbb7-1c20-4c13-a110-b6ef178cf124" (UID: "b97bfbb7-1c20-4c13-a110-b6ef178cf124"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:54:38.941005 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.940946 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b97bfbb7-1c20-4c13-a110-b6ef178cf124" (UID: "b97bfbb7-1c20-4c13-a110-b6ef178cf124"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:54:38.941114 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.941034 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b97bfbb7-1c20-4c13-a110-b6ef178cf124-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b97bfbb7-1c20-4c13-a110-b6ef178cf124" (UID: "b97bfbb7-1c20-4c13-a110-b6ef178cf124"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:54:38.941177 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.941148 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-kube-api-access-x6b4d" (OuterVolumeSpecName: "kube-api-access-x6b4d") pod "b97bfbb7-1c20-4c13-a110-b6ef178cf124" (UID: "b97bfbb7-1c20-4c13-a110-b6ef178cf124"). InnerVolumeSpecName "kube-api-access-x6b4d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:54:38.941301 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.941281 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b97bfbb7-1c20-4c13-a110-b6ef178cf124-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b97bfbb7-1c20-4c13-a110-b6ef178cf124" (UID: "b97bfbb7-1c20-4c13-a110-b6ef178cf124"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:54:38.941366 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.941344 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b97bfbb7-1c20-4c13-a110-b6ef178cf124" (UID: "b97bfbb7-1c20-4c13-a110-b6ef178cf124"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:54:38.947410 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:38.947386 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b97bfbb7-1c20-4c13-a110-b6ef178cf124-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b97bfbb7-1c20-4c13-a110-b6ef178cf124" (UID: "b97bfbb7-1c20-4c13-a110-b6ef178cf124"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:54:39.010423 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.010397 2575 generic.go:358] "Generic (PLEG): container finished" podID="b97bfbb7-1c20-4c13-a110-b6ef178cf124" containerID="550f617fe542498fae9cdb594f11ffaf588b47b1b841d9137bed6e038defa13d" exitCode=0
Apr 17 16:54:39.010520 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.010449 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2" event={"ID":"b97bfbb7-1c20-4c13-a110-b6ef178cf124","Type":"ContainerDied","Data":"550f617fe542498fae9cdb594f11ffaf588b47b1b841d9137bed6e038defa13d"}
Apr 17 16:54:39.010520 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.010455 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2"
Apr 17 16:54:39.010520 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.010482 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f8ff56558-nn2d2" event={"ID":"b97bfbb7-1c20-4c13-a110-b6ef178cf124","Type":"ContainerDied","Data":"8860abf28977aab4845d5ddeef4a2ebf1eb5e3f1a1f3cde529243db3a10ef5e4"}
Apr 17 16:54:39.010520 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.010503 2575 scope.go:117] "RemoveContainer" containerID="550f617fe542498fae9cdb594f11ffaf588b47b1b841d9137bed6e038defa13d"
Apr 17 16:54:39.018510 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.018497 2575 scope.go:117] "RemoveContainer" containerID="550f617fe542498fae9cdb594f11ffaf588b47b1b841d9137bed6e038defa13d"
Apr 17 16:54:39.018767 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:54:39.018749 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"550f617fe542498fae9cdb594f11ffaf588b47b1b841d9137bed6e038defa13d\": container with ID starting with 550f617fe542498fae9cdb594f11ffaf588b47b1b841d9137bed6e038defa13d not found: ID does not exist" containerID="550f617fe542498fae9cdb594f11ffaf588b47b1b841d9137bed6e038defa13d"
Apr 17 16:54:39.018804 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.018774 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550f617fe542498fae9cdb594f11ffaf588b47b1b841d9137bed6e038defa13d"} err="failed to get container status \"550f617fe542498fae9cdb594f11ffaf588b47b1b841d9137bed6e038defa13d\": rpc error: code = NotFound desc = could not find container \"550f617fe542498fae9cdb594f11ffaf588b47b1b841d9137bed6e038defa13d\": container with ID starting with 550f617fe542498fae9cdb594f11ffaf588b47b1b841d9137bed6e038defa13d not found: ID does not exist"
Apr 17 16:54:39.035884 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.035865 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6f8ff56558-nn2d2"]
Apr 17 16:54:39.040132 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.040116 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b97bfbb7-1c20-4c13-a110-b6ef178cf124-installation-pull-secrets\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\""
Apr 17 16:54:39.040177 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.040136 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-certificates\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\""
Apr 17 16:54:39.040177 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.040146 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x6b4d\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-kube-api-access-x6b4d\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\""
Apr 17 16:54:39.040177 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.040154 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-registry-tls\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\""
Apr 17 16:54:39.040177 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.040167 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b97bfbb7-1c20-4c13-a110-b6ef178cf124-image-registry-private-configuration\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\""
Apr 17 16:54:39.040296 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.040180 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b97bfbb7-1c20-4c13-a110-b6ef178cf124-trusted-ca\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\""
Apr 17 16:54:39.040296 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.040189 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b97bfbb7-1c20-4c13-a110-b6ef178cf124-bound-sa-token\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\""
Apr 17 16:54:39.040296 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.040197 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b97bfbb7-1c20-4c13-a110-b6ef178cf124-ca-trust-extracted\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\""
Apr 17 16:54:39.043858 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.043842 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6f8ff56558-nn2d2"]
Apr 17 16:54:39.443651 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.443626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs\") pod \"network-metrics-daemon-pm56t\" (UID: \"6064206c-e379-4668-9aa8-a2165341d497\") " pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:54:39.445869 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.445848 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6064206c-e379-4668-9aa8-a2165341d497-metrics-certs\") pod \"network-metrics-daemon-pm56t\" (UID: \"6064206c-e379-4668-9aa8-a2165341d497\") " pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:54:39.588765 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.588737 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b97bfbb7-1c20-4c13-a110-b6ef178cf124" path="/var/lib/kubelet/pods/b97bfbb7-1c20-4c13-a110-b6ef178cf124/volumes"
Apr 17 16:54:39.600323 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.600305 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bkwcz\""
Apr 17 16:54:39.608074 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.608060 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pm56t"
Apr 17 16:54:39.721141 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:39.721074 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pm56t"]
Apr 17 16:54:39.723741 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:54:39.723711 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6064206c_e379_4668_9aa8_a2165341d497.slice/crio-93accb4283d923d60916159c13a717e25b8806190180b14fe697abe07b7c9bbb WatchSource:0}: Error finding container 93accb4283d923d60916159c13a717e25b8806190180b14fe697abe07b7c9bbb: Status 404 returned error can't find the container with id 93accb4283d923d60916159c13a717e25b8806190180b14fe697abe07b7c9bbb
Apr 17 16:54:40.014584 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:40.014502 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pm56t" event={"ID":"6064206c-e379-4668-9aa8-a2165341d497","Type":"ContainerStarted","Data":"93accb4283d923d60916159c13a717e25b8806190180b14fe697abe07b7c9bbb"}
Apr 17 16:54:41.018628 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:41.018576 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pm56t" event={"ID":"6064206c-e379-4668-9aa8-a2165341d497","Type":"ContainerStarted","Data":"2524a874f73b7e4243b4a417c3acc575dbb3f502c3bf2691c65e69435b4be1b3"}
Apr 17 16:54:41.018914 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:41.018631 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pm56t" event={"ID":"6064206c-e379-4668-9aa8-a2165341d497","Type":"ContainerStarted","Data":"ec04474fd5539a979251d30b8c220b1f54c0e0fcf5e89ffb58aaf5bd467e8960"}
Apr 17 16:54:41.042862 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:41.042780 2575 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-pm56t" podStartSLOduration=130.980346688 podStartE2EDuration="2m12.042767736s" podCreationTimestamp="2026-04-17 16:52:29 +0000 UTC" firstStartedPulling="2026-04-17 16:54:39.725554869 +0000 UTC m=+130.634281940" lastFinishedPulling="2026-04-17 16:54:40.787975914 +0000 UTC m=+131.696702988" observedRunningTime="2026-04-17 16:54:41.042560967 +0000 UTC m=+131.951288060" watchObservedRunningTime="2026-04-17 16:54:41.042767736 +0000 UTC m=+131.951494830" Apr 17 16:54:42.830671 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:42.830630 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8" podUID="00acb16e-9010-42aa-9571-0bd0faf78721" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 16:54:42.831003 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:42.830697 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8" Apr 17 16:54:42.831142 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:42.831125 2575 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"4c41a126adbdd5b87fe119bd3f4522fc77cea99534ff3f4cdb1abfcc7454518f"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 16:54:42.831178 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:42.831161 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8" podUID="00acb16e-9010-42aa-9571-0bd0faf78721" containerName="service-proxy" containerID="cri-o://4c41a126adbdd5b87fe119bd3f4522fc77cea99534ff3f4cdb1abfcc7454518f" gracePeriod=30 Apr 
17 16:54:43.025197 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:43.025168 2575 generic.go:358] "Generic (PLEG): container finished" podID="00acb16e-9010-42aa-9571-0bd0faf78721" containerID="4c41a126adbdd5b87fe119bd3f4522fc77cea99534ff3f4cdb1abfcc7454518f" exitCode=2 Apr 17 16:54:43.025314 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:43.025238 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8" event={"ID":"00acb16e-9010-42aa-9571-0bd0faf78721","Type":"ContainerDied","Data":"4c41a126adbdd5b87fe119bd3f4522fc77cea99534ff3f4cdb1abfcc7454518f"} Apr 17 16:54:43.025314 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:54:43.025277 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78686898c4-qh2t8" event={"ID":"00acb16e-9010-42aa-9571-0bd0faf78721","Type":"ContainerStarted","Data":"c33e2e206f6309fe954552c3db4b3ce7ff3447393594bcef559f6cea8a9e322f"} Apr 17 16:57:29.471669 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:57:29.471641 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/ovn-acl-logging/0.log" Apr 17 16:57:29.472153 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:57:29.471675 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/ovn-acl-logging/0.log" Apr 17 16:57:29.474944 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:57:29.474924 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 16:58:06.735994 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:06.735952 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mhgf8"] Apr 17 16:58:06.738335 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:06.736277 2575 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="b97bfbb7-1c20-4c13-a110-b6ef178cf124" containerName="registry" Apr 17 16:58:06.738335 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:06.736294 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b97bfbb7-1c20-4c13-a110-b6ef178cf124" containerName="registry" Apr 17 16:58:06.738335 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:06.736351 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b97bfbb7-1c20-4c13-a110-b6ef178cf124" containerName="registry" Apr 17 16:58:06.739163 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:06.739149 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mhgf8" Apr 17 16:58:06.741868 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:06.741842 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 17 16:58:06.742005 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:06.741907 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-cz7tl\"" Apr 17 16:58:06.742052 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:06.742010 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:58:06.751002 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:06.750958 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mhgf8"] Apr 17 16:58:06.908076 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:06.908039 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a39f5b9-ccd0-41e6-b0c1-70338cda18e5-tmp\") pod 
\"cert-manager-operator-controller-manager-7ccfb878b5-mhgf8\" (UID: \"9a39f5b9-ccd0-41e6-b0c1-70338cda18e5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mhgf8" Apr 17 16:58:06.908241 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:06.908090 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw5v8\" (UniqueName: \"kubernetes.io/projected/9a39f5b9-ccd0-41e6-b0c1-70338cda18e5-kube-api-access-hw5v8\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mhgf8\" (UID: \"9a39f5b9-ccd0-41e6-b0c1-70338cda18e5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mhgf8" Apr 17 16:58:07.009140 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:07.009041 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a39f5b9-ccd0-41e6-b0c1-70338cda18e5-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mhgf8\" (UID: \"9a39f5b9-ccd0-41e6-b0c1-70338cda18e5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mhgf8" Apr 17 16:58:07.009140 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:07.009075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hw5v8\" (UniqueName: \"kubernetes.io/projected/9a39f5b9-ccd0-41e6-b0c1-70338cda18e5-kube-api-access-hw5v8\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mhgf8\" (UID: \"9a39f5b9-ccd0-41e6-b0c1-70338cda18e5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mhgf8" Apr 17 16:58:07.009499 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:07.009473 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a39f5b9-ccd0-41e6-b0c1-70338cda18e5-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mhgf8\" (UID: 
\"9a39f5b9-ccd0-41e6-b0c1-70338cda18e5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mhgf8" Apr 17 16:58:07.017737 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:07.017706 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw5v8\" (UniqueName: \"kubernetes.io/projected/9a39f5b9-ccd0-41e6-b0c1-70338cda18e5-kube-api-access-hw5v8\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mhgf8\" (UID: \"9a39f5b9-ccd0-41e6-b0c1-70338cda18e5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mhgf8" Apr 17 16:58:07.047863 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:07.047834 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mhgf8" Apr 17 16:58:07.163110 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:07.163075 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mhgf8"] Apr 17 16:58:07.167167 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:58:07.167137 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a39f5b9_ccd0_41e6_b0c1_70338cda18e5.slice/crio-07c51f95978fea7170791b51f469b5a447b4b06cea411916d38285de2c43edc5 WatchSource:0}: Error finding container 07c51f95978fea7170791b51f469b5a447b4b06cea411916d38285de2c43edc5: Status 404 returned error can't find the container with id 07c51f95978fea7170791b51f469b5a447b4b06cea411916d38285de2c43edc5 Apr 17 16:58:07.169409 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:07.169394 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:58:07.532811 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:07.532767 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mhgf8" event={"ID":"9a39f5b9-ccd0-41e6-b0c1-70338cda18e5","Type":"ContainerStarted","Data":"07c51f95978fea7170791b51f469b5a447b4b06cea411916d38285de2c43edc5"} Apr 17 16:58:10.542232 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:10.542197 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mhgf8" event={"ID":"9a39f5b9-ccd0-41e6-b0c1-70338cda18e5","Type":"ContainerStarted","Data":"5394e352eef36ae7f3b604819e4685e3d3ea66dda032d9d28c90711e3a5bf98d"} Apr 17 16:58:10.563875 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:10.563811 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mhgf8" podStartSLOduration=1.976914563 podStartE2EDuration="4.563793558s" podCreationTimestamp="2026-04-17 16:58:06 +0000 UTC" firstStartedPulling="2026-04-17 16:58:07.169867784 +0000 UTC m=+338.078594859" lastFinishedPulling="2026-04-17 16:58:09.756746772 +0000 UTC m=+340.665473854" observedRunningTime="2026-04-17 16:58:10.561854217 +0000 UTC m=+341.470581324" watchObservedRunningTime="2026-04-17 16:58:10.563793558 +0000 UTC m=+341.472520652" Apr 17 16:58:12.814739 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:12.814683 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb"] Apr 17 16:58:12.820745 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:12.820712 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" Apr 17 16:58:12.823375 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:12.823350 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wj58v\"" Apr 17 16:58:12.824193 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:12.824172 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 16:58:12.824503 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:12.824484 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 16:58:12.826096 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:12.826072 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb"] Apr 17 16:58:12.950291 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:12.950251 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb\" (UID: \"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" Apr 17 16:58:12.950291 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:12.950290 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb\" (UID: \"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" Apr 17 16:58:12.950508 
ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:12.950378 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7kjq\" (UniqueName: \"kubernetes.io/projected/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-kube-api-access-n7kjq\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb\" (UID: \"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" Apr 17 16:58:13.051245 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:13.051210 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb\" (UID: \"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" Apr 17 16:58:13.051245 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:13.051244 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb\" (UID: \"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" Apr 17 16:58:13.051441 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:13.051293 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7kjq\" (UniqueName: \"kubernetes.io/projected/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-kube-api-access-n7kjq\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb\" (UID: \"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" Apr 17 16:58:13.051630 ip-10-0-132-199 
kubenswrapper[2575]: I0417 16:58:13.051611 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb\" (UID: \"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" Apr 17 16:58:13.051689 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:13.051670 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb\" (UID: \"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" Apr 17 16:58:13.061938 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:13.061916 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7kjq\" (UniqueName: \"kubernetes.io/projected/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-kube-api-access-n7kjq\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb\" (UID: \"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" Apr 17 16:58:13.129790 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:13.129698 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" Apr 17 16:58:13.246130 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:13.246097 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb"] Apr 17 16:58:13.248692 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:58:13.248663 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c19e7b8_cfa2_4e9d_9cf4_53fb39024e6d.slice/crio-34ef09527bcaeca8a7c8b97c73942be2b5d886054465aa302d0ac44d95f1547b WatchSource:0}: Error finding container 34ef09527bcaeca8a7c8b97c73942be2b5d886054465aa302d0ac44d95f1547b: Status 404 returned error can't find the container with id 34ef09527bcaeca8a7c8b97c73942be2b5d886054465aa302d0ac44d95f1547b Apr 17 16:58:13.552829 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:13.552794 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" event={"ID":"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d","Type":"ContainerStarted","Data":"34ef09527bcaeca8a7c8b97c73942be2b5d886054465aa302d0ac44d95f1547b"} Apr 17 16:58:18.572629 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:18.572573 2575 generic.go:358] "Generic (PLEG): container finished" podID="2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d" containerID="1afaf17a2e3e66dfdc85ae9d5093913504ad769745e313bb011086a0c5ff0a63" exitCode=0 Apr 17 16:58:18.573046 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:18.572637 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" event={"ID":"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d","Type":"ContainerDied","Data":"1afaf17a2e3e66dfdc85ae9d5093913504ad769745e313bb011086a0c5ff0a63"} Apr 17 16:58:21.582305 ip-10-0-132-199 kubenswrapper[2575]: 
I0417 16:58:21.582272 2575 generic.go:358] "Generic (PLEG): container finished" podID="2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d" containerID="3e3635fe72c06c2b9d70b1ef9dc223f93327243a5c1efd7fa133c200d6a96f64" exitCode=0 Apr 17 16:58:21.582768 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:21.582353 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" event={"ID":"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d","Type":"ContainerDied","Data":"3e3635fe72c06c2b9d70b1ef9dc223f93327243a5c1efd7fa133c200d6a96f64"} Apr 17 16:58:28.601033 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:28.601000 2575 generic.go:358] "Generic (PLEG): container finished" podID="2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d" containerID="4d4c9358480b1f1393cb35705019c4a03b5549ac91f4d25b0e55190f01c01b7c" exitCode=0 Apr 17 16:58:28.601369 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:28.601067 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" event={"ID":"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d","Type":"ContainerDied","Data":"4d4c9358480b1f1393cb35705019c4a03b5549ac91f4d25b0e55190f01c01b7c"} Apr 17 16:58:29.719983 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:29.719025 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" Apr 17 16:58:29.774548 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:29.774523 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7kjq\" (UniqueName: \"kubernetes.io/projected/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-kube-api-access-n7kjq\") pod \"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d\" (UID: \"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d\") " Apr 17 16:58:29.774708 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:29.774588 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-bundle\") pod \"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d\" (UID: \"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d\") " Apr 17 16:58:29.774708 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:29.774630 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-util\") pod \"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d\" (UID: \"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d\") " Apr 17 16:58:29.774963 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:29.774940 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-bundle" (OuterVolumeSpecName: "bundle") pod "2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d" (UID: "2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:29.776650 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:29.776633 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-kube-api-access-n7kjq" (OuterVolumeSpecName: "kube-api-access-n7kjq") pod "2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d" (UID: "2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d"). InnerVolumeSpecName "kube-api-access-n7kjq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:58:29.779267 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:29.779247 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-util" (OuterVolumeSpecName: "util") pod "2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d" (UID: "2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:29.875439 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:29.875384 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-bundle\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\"" Apr 17 16:58:29.875439 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:29.875405 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-util\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\"" Apr 17 16:58:29.875439 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:29.875415 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n7kjq\" (UniqueName: \"kubernetes.io/projected/2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d-kube-api-access-n7kjq\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\"" Apr 17 16:58:30.607396 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:30.607360 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" event={"ID":"2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d","Type":"ContainerDied","Data":"34ef09527bcaeca8a7c8b97c73942be2b5d886054465aa302d0ac44d95f1547b"} Apr 17 16:58:30.607396 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:30.607396 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34ef09527bcaeca8a7c8b97c73942be2b5d886054465aa302d0ac44d95f1547b" Apr 17 16:58:30.607617 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:30.607434 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgr7nb" Apr 17 16:58:32.331739 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.331707 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-95574"] Apr 17 16:58:32.332105 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.331924 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d" containerName="extract" Apr 17 16:58:32.332105 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.331936 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d" containerName="extract" Apr 17 16:58:32.332105 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.331951 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d" containerName="pull" Apr 17 16:58:32.332105 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.331956 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d" containerName="pull" Apr 17 16:58:32.332105 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.331963 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d" containerName="util" Apr 17 16:58:32.332105 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.331969 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d" containerName="util" Apr 17 16:58:32.332105 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.332009 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c19e7b8-cfa2-4e9d-9cf4-53fb39024e6d" containerName="extract" Apr 17 16:58:32.334105 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.334089 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-95574" Apr 17 16:58:32.337699 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.337677 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 16:58:32.338931 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.338913 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 16:58:32.339032 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.338911 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-sstdc\"" Apr 17 16:58:32.346451 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.346432 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-95574"] Apr 17 16:58:32.388631 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.388609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2jtk\" (UniqueName: \"kubernetes.io/projected/e6158389-7bb6-4eab-b762-5264f84b7add-kube-api-access-m2jtk\") pod \"cert-manager-759f64656b-95574\" (UID: \"e6158389-7bb6-4eab-b762-5264f84b7add\") " pod="cert-manager/cert-manager-759f64656b-95574" Apr 17 16:58:32.388723 ip-10-0-132-199 
kubenswrapper[2575]: I0417 16:58:32.388680 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6158389-7bb6-4eab-b762-5264f84b7add-bound-sa-token\") pod \"cert-manager-759f64656b-95574\" (UID: \"e6158389-7bb6-4eab-b762-5264f84b7add\") " pod="cert-manager/cert-manager-759f64656b-95574" Apr 17 16:58:32.489440 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.489418 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6158389-7bb6-4eab-b762-5264f84b7add-bound-sa-token\") pod \"cert-manager-759f64656b-95574\" (UID: \"e6158389-7bb6-4eab-b762-5264f84b7add\") " pod="cert-manager/cert-manager-759f64656b-95574" Apr 17 16:58:32.489527 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.489446 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2jtk\" (UniqueName: \"kubernetes.io/projected/e6158389-7bb6-4eab-b762-5264f84b7add-kube-api-access-m2jtk\") pod \"cert-manager-759f64656b-95574\" (UID: \"e6158389-7bb6-4eab-b762-5264f84b7add\") " pod="cert-manager/cert-manager-759f64656b-95574" Apr 17 16:58:32.498027 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.498002 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6158389-7bb6-4eab-b762-5264f84b7add-bound-sa-token\") pod \"cert-manager-759f64656b-95574\" (UID: \"e6158389-7bb6-4eab-b762-5264f84b7add\") " pod="cert-manager/cert-manager-759f64656b-95574" Apr 17 16:58:32.498131 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.498113 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2jtk\" (UniqueName: \"kubernetes.io/projected/e6158389-7bb6-4eab-b762-5264f84b7add-kube-api-access-m2jtk\") pod \"cert-manager-759f64656b-95574\" (UID: 
\"e6158389-7bb6-4eab-b762-5264f84b7add\") " pod="cert-manager/cert-manager-759f64656b-95574" Apr 17 16:58:32.642491 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.642428 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-95574" Apr 17 16:58:32.752942 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:32.752912 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-95574"] Apr 17 16:58:32.756969 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:58:32.756933 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6158389_7bb6_4eab_b762_5264f84b7add.slice/crio-83f404854a5410645b81b8fb0717d31d2f513cc9da76a34a6a8a98449f3e86fa WatchSource:0}: Error finding container 83f404854a5410645b81b8fb0717d31d2f513cc9da76a34a6a8a98449f3e86fa: Status 404 returned error can't find the container with id 83f404854a5410645b81b8fb0717d31d2f513cc9da76a34a6a8a98449f3e86fa Apr 17 16:58:33.616919 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:33.616883 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-95574" event={"ID":"e6158389-7bb6-4eab-b762-5264f84b7add","Type":"ContainerStarted","Data":"83f404854a5410645b81b8fb0717d31d2f513cc9da76a34a6a8a98449f3e86fa"} Apr 17 16:58:36.625976 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:36.625873 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-95574" event={"ID":"e6158389-7bb6-4eab-b762-5264f84b7add","Type":"ContainerStarted","Data":"21653857939c03eb0d41b6f5b1ea07c6f76cd8575cd29c030d959c2d26dbea4f"} Apr 17 16:58:36.641074 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:36.641028 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-95574" podStartSLOduration=1.529832443 podStartE2EDuration="4.641016768s" 
podCreationTimestamp="2026-04-17 16:58:32 +0000 UTC" firstStartedPulling="2026-04-17 16:58:32.759296248 +0000 UTC m=+363.668023320" lastFinishedPulling="2026-04-17 16:58:35.870480564 +0000 UTC m=+366.779207645" observedRunningTime="2026-04-17 16:58:36.640401241 +0000 UTC m=+367.549128336" watchObservedRunningTime="2026-04-17 16:58:36.641016768 +0000 UTC m=+367.549743861" Apr 17 16:58:41.695530 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:41.695494 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l"] Apr 17 16:58:41.717786 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:41.717762 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l"] Apr 17 16:58:41.717957 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:41.717886 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" Apr 17 16:58:41.720995 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:41.720971 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 16:58:41.722268 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:41.722251 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 16:58:41.722399 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:41.722263 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wj58v\"" Apr 17 16:58:41.759116 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:41.759094 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kbxx\" (UniqueName: 
\"kubernetes.io/projected/038d2685-4e0d-4c6c-b047-61a4853b3cb4-kube-api-access-7kbxx\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l\" (UID: \"038d2685-4e0d-4c6c-b047-61a4853b3cb4\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" Apr 17 16:58:41.759243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:41.759147 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/038d2685-4e0d-4c6c-b047-61a4853b3cb4-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l\" (UID: \"038d2685-4e0d-4c6c-b047-61a4853b3cb4\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" Apr 17 16:58:41.759243 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:41.759167 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/038d2685-4e0d-4c6c-b047-61a4853b3cb4-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l\" (UID: \"038d2685-4e0d-4c6c-b047-61a4853b3cb4\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" Apr 17 16:58:41.859522 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:41.859491 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kbxx\" (UniqueName: \"kubernetes.io/projected/038d2685-4e0d-4c6c-b047-61a4853b3cb4-kube-api-access-7kbxx\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l\" (UID: \"038d2685-4e0d-4c6c-b047-61a4853b3cb4\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" Apr 17 16:58:41.859713 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:41.859553 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/038d2685-4e0d-4c6c-b047-61a4853b3cb4-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l\" (UID: \"038d2685-4e0d-4c6c-b047-61a4853b3cb4\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" Apr 17 16:58:41.859713 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:41.859675 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/038d2685-4e0d-4c6c-b047-61a4853b3cb4-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l\" (UID: \"038d2685-4e0d-4c6c-b047-61a4853b3cb4\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" Apr 17 16:58:41.859880 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:41.859865 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/038d2685-4e0d-4c6c-b047-61a4853b3cb4-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l\" (UID: \"038d2685-4e0d-4c6c-b047-61a4853b3cb4\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" Apr 17 16:58:41.859991 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:41.859973 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/038d2685-4e0d-4c6c-b047-61a4853b3cb4-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l\" (UID: \"038d2685-4e0d-4c6c-b047-61a4853b3cb4\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" Apr 17 16:58:41.867562 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:41.867533 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kbxx\" (UniqueName: \"kubernetes.io/projected/038d2685-4e0d-4c6c-b047-61a4853b3cb4-kube-api-access-7kbxx\") pod 
\"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l\" (UID: \"038d2685-4e0d-4c6c-b047-61a4853b3cb4\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" Apr 17 16:58:42.027889 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:42.027792 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" Apr 17 16:58:42.142318 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:42.142286 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l"] Apr 17 16:58:42.145164 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:58:42.145133 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod038d2685_4e0d_4c6c_b047_61a4853b3cb4.slice/crio-de89bf8e509bdd4e23fcde4ab44879af286bd9812f4e616bc09465baa68747c6 WatchSource:0}: Error finding container de89bf8e509bdd4e23fcde4ab44879af286bd9812f4e616bc09465baa68747c6: Status 404 returned error can't find the container with id de89bf8e509bdd4e23fcde4ab44879af286bd9812f4e616bc09465baa68747c6 Apr 17 16:58:42.640719 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:42.640683 2575 generic.go:358] "Generic (PLEG): container finished" podID="038d2685-4e0d-4c6c-b047-61a4853b3cb4" containerID="08dc696f8a50bb7b76e5056555dfb4b35c2e2d296a6cb519077d04df2f9f56b7" exitCode=0 Apr 17 16:58:42.640901 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:42.640772 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" event={"ID":"038d2685-4e0d-4c6c-b047-61a4853b3cb4","Type":"ContainerDied","Data":"08dc696f8a50bb7b76e5056555dfb4b35c2e2d296a6cb519077d04df2f9f56b7"} Apr 17 16:58:42.640901 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:42.640811 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" event={"ID":"038d2685-4e0d-4c6c-b047-61a4853b3cb4","Type":"ContainerStarted","Data":"de89bf8e509bdd4e23fcde4ab44879af286bd9812f4e616bc09465baa68747c6"} Apr 17 16:58:43.645135 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:43.645106 2575 generic.go:358] "Generic (PLEG): container finished" podID="038d2685-4e0d-4c6c-b047-61a4853b3cb4" containerID="6954f1808cf66d64ec449a0ef5fd816a0994e0dbd20e2103ee64fb688f55c7af" exitCode=0 Apr 17 16:58:43.645495 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:43.645170 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" event={"ID":"038d2685-4e0d-4c6c-b047-61a4853b3cb4","Type":"ContainerDied","Data":"6954f1808cf66d64ec449a0ef5fd816a0994e0dbd20e2103ee64fb688f55c7af"} Apr 17 16:58:44.649887 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:44.649845 2575 generic.go:358] "Generic (PLEG): container finished" podID="038d2685-4e0d-4c6c-b047-61a4853b3cb4" containerID="c74e4b930b4eae8616e99172dec0d957b8bc46f298dea68141d5febea6bdeac4" exitCode=0 Apr 17 16:58:44.650304 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:44.649924 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" event={"ID":"038d2685-4e0d-4c6c-b047-61a4853b3cb4","Type":"ContainerDied","Data":"c74e4b930b4eae8616e99172dec0d957b8bc46f298dea68141d5febea6bdeac4"} Apr 17 16:58:45.768722 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:45.768699 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" Apr 17 16:58:45.888385 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:45.888362 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kbxx\" (UniqueName: \"kubernetes.io/projected/038d2685-4e0d-4c6c-b047-61a4853b3cb4-kube-api-access-7kbxx\") pod \"038d2685-4e0d-4c6c-b047-61a4853b3cb4\" (UID: \"038d2685-4e0d-4c6c-b047-61a4853b3cb4\") " Apr 17 16:58:45.888521 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:45.888391 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/038d2685-4e0d-4c6c-b047-61a4853b3cb4-util\") pod \"038d2685-4e0d-4c6c-b047-61a4853b3cb4\" (UID: \"038d2685-4e0d-4c6c-b047-61a4853b3cb4\") " Apr 17 16:58:45.888521 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:45.888415 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/038d2685-4e0d-4c6c-b047-61a4853b3cb4-bundle\") pod \"038d2685-4e0d-4c6c-b047-61a4853b3cb4\" (UID: \"038d2685-4e0d-4c6c-b047-61a4853b3cb4\") " Apr 17 16:58:45.889124 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:45.889098 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/038d2685-4e0d-4c6c-b047-61a4853b3cb4-bundle" (OuterVolumeSpecName: "bundle") pod "038d2685-4e0d-4c6c-b047-61a4853b3cb4" (UID: "038d2685-4e0d-4c6c-b047-61a4853b3cb4"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:45.890419 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:45.890396 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/038d2685-4e0d-4c6c-b047-61a4853b3cb4-kube-api-access-7kbxx" (OuterVolumeSpecName: "kube-api-access-7kbxx") pod "038d2685-4e0d-4c6c-b047-61a4853b3cb4" (UID: "038d2685-4e0d-4c6c-b047-61a4853b3cb4"). InnerVolumeSpecName "kube-api-access-7kbxx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:58:45.894204 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:45.894174 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/038d2685-4e0d-4c6c-b047-61a4853b3cb4-util" (OuterVolumeSpecName: "util") pod "038d2685-4e0d-4c6c-b047-61a4853b3cb4" (UID: "038d2685-4e0d-4c6c-b047-61a4853b3cb4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:45.989116 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:45.989032 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/038d2685-4e0d-4c6c-b047-61a4853b3cb4-bundle\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\"" Apr 17 16:58:45.989116 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:45.989073 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7kbxx\" (UniqueName: \"kubernetes.io/projected/038d2685-4e0d-4c6c-b047-61a4853b3cb4-kube-api-access-7kbxx\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\"" Apr 17 16:58:45.989116 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:45.989088 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/038d2685-4e0d-4c6c-b047-61a4853b3cb4-util\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\"" Apr 17 16:58:46.661245 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:46.661213 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" event={"ID":"038d2685-4e0d-4c6c-b047-61a4853b3cb4","Type":"ContainerDied","Data":"de89bf8e509bdd4e23fcde4ab44879af286bd9812f4e616bc09465baa68747c6"} Apr 17 16:58:46.661245 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:46.661246 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de89bf8e509bdd4e23fcde4ab44879af286bd9812f4e616bc09465baa68747c6" Apr 17 16:58:46.661429 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:46.661245 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r8d9l" Apr 17 16:58:52.306146 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.306115 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z"] Apr 17 16:58:52.306513 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.306332 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="038d2685-4e0d-4c6c-b047-61a4853b3cb4" containerName="extract" Apr 17 16:58:52.306513 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.306341 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="038d2685-4e0d-4c6c-b047-61a4853b3cb4" containerName="extract" Apr 17 16:58:52.306513 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.306351 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="038d2685-4e0d-4c6c-b047-61a4853b3cb4" containerName="util" Apr 17 16:58:52.306513 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.306357 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="038d2685-4e0d-4c6c-b047-61a4853b3cb4" containerName="util" Apr 17 16:58:52.306513 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.306369 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="038d2685-4e0d-4c6c-b047-61a4853b3cb4" containerName="pull" Apr 17 16:58:52.306513 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.306375 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="038d2685-4e0d-4c6c-b047-61a4853b3cb4" containerName="pull" Apr 17 16:58:52.306513 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.306419 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="038d2685-4e0d-4c6c-b047-61a4853b3cb4" containerName="extract" Apr 17 16:58:52.310396 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.310379 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" Apr 17 16:58:52.313417 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.313397 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 16:58:52.313516 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.313496 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 16:58:52.314684 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.314669 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wj58v\"" Apr 17 16:58:52.318068 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.318046 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z"] Apr 17 16:58:52.434718 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.434691 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z\" (UID: 
\"fdc892cd-8406-40be-9f1e-aa2c0cc31e15\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" Apr 17 16:58:52.434822 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.434732 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nbgw\" (UniqueName: \"kubernetes.io/projected/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-kube-api-access-9nbgw\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z\" (UID: \"fdc892cd-8406-40be-9f1e-aa2c0cc31e15\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" Apr 17 16:58:52.434822 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.434784 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z\" (UID: \"fdc892cd-8406-40be-9f1e-aa2c0cc31e15\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" Apr 17 16:58:52.535880 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.535856 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z\" (UID: \"fdc892cd-8406-40be-9f1e-aa2c0cc31e15\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" Apr 17 16:58:52.535973 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.535887 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z\" (UID: 
\"fdc892cd-8406-40be-9f1e-aa2c0cc31e15\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" Apr 17 16:58:52.535973 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.535915 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nbgw\" (UniqueName: \"kubernetes.io/projected/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-kube-api-access-9nbgw\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z\" (UID: \"fdc892cd-8406-40be-9f1e-aa2c0cc31e15\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" Apr 17 16:58:52.536209 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.536191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z\" (UID: \"fdc892cd-8406-40be-9f1e-aa2c0cc31e15\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" Apr 17 16:58:52.536268 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.536252 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z\" (UID: \"fdc892cd-8406-40be-9f1e-aa2c0cc31e15\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" Apr 17 16:58:52.545733 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.545706 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nbgw\" (UniqueName: \"kubernetes.io/projected/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-kube-api-access-9nbgw\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z\" (UID: \"fdc892cd-8406-40be-9f1e-aa2c0cc31e15\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" Apr 17 16:58:52.619852 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.619784 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" Apr 17 16:58:52.742415 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:52.742383 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z"] Apr 17 16:58:52.745456 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:58:52.745428 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdc892cd_8406_40be_9f1e_aa2c0cc31e15.slice/crio-e38965edcdff765415bebdda551d27f4554c49e3ab55fc22c5fcc47f24501235 WatchSource:0}: Error finding container e38965edcdff765415bebdda551d27f4554c49e3ab55fc22c5fcc47f24501235: Status 404 returned error can't find the container with id e38965edcdff765415bebdda551d27f4554c49e3ab55fc22c5fcc47f24501235 Apr 17 16:58:53.684403 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:53.684364 2575 generic.go:358] "Generic (PLEG): container finished" podID="fdc892cd-8406-40be-9f1e-aa2c0cc31e15" containerID="fa6cb80d2feace6ad7890bd2ec9caaf29e6fc2889ec6ff15115d5ee1e25a1998" exitCode=0 Apr 17 16:58:53.684775 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:53.684452 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" event={"ID":"fdc892cd-8406-40be-9f1e-aa2c0cc31e15","Type":"ContainerDied","Data":"fa6cb80d2feace6ad7890bd2ec9caaf29e6fc2889ec6ff15115d5ee1e25a1998"} Apr 17 16:58:53.684775 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:53.684528 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" event={"ID":"fdc892cd-8406-40be-9f1e-aa2c0cc31e15","Type":"ContainerStarted","Data":"e38965edcdff765415bebdda551d27f4554c49e3ab55fc22c5fcc47f24501235"} Apr 17 16:58:54.254062 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.254038 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg"] Apr 17 16:58:54.257068 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.257050 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg" Apr 17 16:58:54.260507 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.260484 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 16:58:54.260632 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.260523 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 16:58:54.260885 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.260858 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 16:58:54.260986 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.260954 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 16:58:54.261044 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.261002 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-28vvs\"" Apr 17 16:58:54.272889 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.272868 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg"] Apr 17 16:58:54.350343 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.350319 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9hs6\" (UniqueName: \"kubernetes.io/projected/4b261f84-2f33-4cba-bb82-2e8401d96c9c-kube-api-access-g9hs6\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-2c9lg\" (UID: \"4b261f84-2f33-4cba-bb82-2e8401d96c9c\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg" Apr 17 16:58:54.350471 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.350363 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b261f84-2f33-4cba-bb82-2e8401d96c9c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-2c9lg\" (UID: \"4b261f84-2f33-4cba-bb82-2e8401d96c9c\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg" Apr 17 16:58:54.350529 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.350476 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b261f84-2f33-4cba-bb82-2e8401d96c9c-webhook-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-2c9lg\" (UID: \"4b261f84-2f33-4cba-bb82-2e8401d96c9c\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg" Apr 17 16:58:54.451559 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.451532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9hs6\" (UniqueName: \"kubernetes.io/projected/4b261f84-2f33-4cba-bb82-2e8401d96c9c-kube-api-access-g9hs6\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-2c9lg\" (UID: \"4b261f84-2f33-4cba-bb82-2e8401d96c9c\") " 
pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg" Apr 17 16:58:54.451681 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.451565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b261f84-2f33-4cba-bb82-2e8401d96c9c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-2c9lg\" (UID: \"4b261f84-2f33-4cba-bb82-2e8401d96c9c\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg" Apr 17 16:58:54.451681 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.451612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b261f84-2f33-4cba-bb82-2e8401d96c9c-webhook-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-2c9lg\" (UID: \"4b261f84-2f33-4cba-bb82-2e8401d96c9c\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg" Apr 17 16:58:54.453790 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.453768 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b261f84-2f33-4cba-bb82-2e8401d96c9c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-2c9lg\" (UID: \"4b261f84-2f33-4cba-bb82-2e8401d96c9c\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg" Apr 17 16:58:54.453906 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.453772 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b261f84-2f33-4cba-bb82-2e8401d96c9c-webhook-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-2c9lg\" (UID: \"4b261f84-2f33-4cba-bb82-2e8401d96c9c\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg" Apr 17 16:58:54.459622 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.459577 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9hs6\" (UniqueName: \"kubernetes.io/projected/4b261f84-2f33-4cba-bb82-2e8401d96c9c-kube-api-access-g9hs6\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-2c9lg\" (UID: \"4b261f84-2f33-4cba-bb82-2e8401d96c9c\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg" Apr 17 16:58:54.567027 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.567002 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg" Apr 17 16:58:54.688096 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.688052 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg"] Apr 17 16:58:54.689391 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.689366 2575 generic.go:358] "Generic (PLEG): container finished" podID="fdc892cd-8406-40be-9f1e-aa2c0cc31e15" containerID="5236085ac95a2449d475418eca480718e5dc10ea2a7798a23d541cb25c4f750c" exitCode=0 Apr 17 16:58:54.689501 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:54.689421 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" event={"ID":"fdc892cd-8406-40be-9f1e-aa2c0cc31e15","Type":"ContainerDied","Data":"5236085ac95a2449d475418eca480718e5dc10ea2a7798a23d541cb25c4f750c"} Apr 17 16:58:54.691048 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:58:54.691005 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b261f84_2f33_4cba_bb82_2e8401d96c9c.slice/crio-be22bd78a94ba8e0850e6dd779fe8f35f1a6f53536eb090191d02c91491a3d52 WatchSource:0}: Error finding container be22bd78a94ba8e0850e6dd779fe8f35f1a6f53536eb090191d02c91491a3d52: Status 404 returned error can't find the container with id 
be22bd78a94ba8e0850e6dd779fe8f35f1a6f53536eb090191d02c91491a3d52 Apr 17 16:58:55.695641 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:55.695510 2575 generic.go:358] "Generic (PLEG): container finished" podID="fdc892cd-8406-40be-9f1e-aa2c0cc31e15" containerID="312dc53d47354e98d66bf5e0b095de14caeae8bb9d7528433a54fdde8566f293" exitCode=0 Apr 17 16:58:55.695641 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:55.695619 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" event={"ID":"fdc892cd-8406-40be-9f1e-aa2c0cc31e15","Type":"ContainerDied","Data":"312dc53d47354e98d66bf5e0b095de14caeae8bb9d7528433a54fdde8566f293"} Apr 17 16:58:55.697023 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:55.696989 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg" event={"ID":"4b261f84-2f33-4cba-bb82-2e8401d96c9c","Type":"ContainerStarted","Data":"be22bd78a94ba8e0850e6dd779fe8f35f1a6f53536eb090191d02c91491a3d52"} Apr 17 16:58:57.065863 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:57.065840 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" Apr 17 16:58:57.173463 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:57.173441 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nbgw\" (UniqueName: \"kubernetes.io/projected/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-kube-api-access-9nbgw\") pod \"fdc892cd-8406-40be-9f1e-aa2c0cc31e15\" (UID: \"fdc892cd-8406-40be-9f1e-aa2c0cc31e15\") " Apr 17 16:58:57.173544 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:57.173492 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-util\") pod \"fdc892cd-8406-40be-9f1e-aa2c0cc31e15\" (UID: \"fdc892cd-8406-40be-9f1e-aa2c0cc31e15\") " Apr 17 16:58:57.173544 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:57.173538 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-bundle\") pod \"fdc892cd-8406-40be-9f1e-aa2c0cc31e15\" (UID: \"fdc892cd-8406-40be-9f1e-aa2c0cc31e15\") " Apr 17 16:58:57.174223 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:57.174186 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-bundle" (OuterVolumeSpecName: "bundle") pod "fdc892cd-8406-40be-9f1e-aa2c0cc31e15" (UID: "fdc892cd-8406-40be-9f1e-aa2c0cc31e15"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:57.175384 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:57.175363 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-kube-api-access-9nbgw" (OuterVolumeSpecName: "kube-api-access-9nbgw") pod "fdc892cd-8406-40be-9f1e-aa2c0cc31e15" (UID: "fdc892cd-8406-40be-9f1e-aa2c0cc31e15"). InnerVolumeSpecName "kube-api-access-9nbgw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:58:57.178873 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:57.178852 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-util" (OuterVolumeSpecName: "util") pod "fdc892cd-8406-40be-9f1e-aa2c0cc31e15" (UID: "fdc892cd-8406-40be-9f1e-aa2c0cc31e15"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:57.274776 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:57.274744 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-bundle\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\"" Apr 17 16:58:57.274776 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:57.274771 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9nbgw\" (UniqueName: \"kubernetes.io/projected/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-kube-api-access-9nbgw\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\"" Apr 17 16:58:57.274776 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:57.274780 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fdc892cd-8406-40be-9f1e-aa2c0cc31e15-util\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\"" Apr 17 16:58:57.708901 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:57.708863 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg" event={"ID":"4b261f84-2f33-4cba-bb82-2e8401d96c9c","Type":"ContainerStarted","Data":"ad1a5ec723ead6cff001728ee33bc434adc3a95217a5b0321e28f52802f45cd3"} Apr 17 16:58:57.709074 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:57.709041 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg" Apr 17 16:58:57.710451 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:57.710431 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" event={"ID":"fdc892cd-8406-40be-9f1e-aa2c0cc31e15","Type":"ContainerDied","Data":"e38965edcdff765415bebdda551d27f4554c49e3ab55fc22c5fcc47f24501235"} Apr 17 16:58:57.710451 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:57.710452 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e38965edcdff765415bebdda551d27f4554c49e3ab55fc22c5fcc47f24501235" Apr 17 16:58:57.710628 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:57.710501 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9jb67z" Apr 17 16:58:57.735232 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:58:57.735185 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg" podStartSLOduration=1.32228252 podStartE2EDuration="3.735171468s" podCreationTimestamp="2026-04-17 16:58:54 +0000 UTC" firstStartedPulling="2026-04-17 16:58:54.692892195 +0000 UTC m=+385.601619266" lastFinishedPulling="2026-04-17 16:58:57.105781136 +0000 UTC m=+388.014508214" observedRunningTime="2026-04-17 16:58:57.733501132 +0000 UTC m=+388.642228225" watchObservedRunningTime="2026-04-17 16:58:57.735171468 +0000 UTC m=+388.643898563" Apr 17 16:59:08.715459 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:08.715433 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-2c9lg" Apr 17 16:59:11.166031 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.165998 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx"] Apr 17 16:59:11.166373 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.166324 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdc892cd-8406-40be-9f1e-aa2c0cc31e15" containerName="pull" Apr 17 16:59:11.166373 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.166338 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc892cd-8406-40be-9f1e-aa2c0cc31e15" containerName="pull" Apr 17 16:59:11.166373 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.166348 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdc892cd-8406-40be-9f1e-aa2c0cc31e15" containerName="util" Apr 17 16:59:11.166373 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.166353 2575 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc892cd-8406-40be-9f1e-aa2c0cc31e15" containerName="util" Apr 17 16:59:11.166373 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.166363 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdc892cd-8406-40be-9f1e-aa2c0cc31e15" containerName="extract" Apr 17 16:59:11.166373 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.166369 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc892cd-8406-40be-9f1e-aa2c0cc31e15" containerName="extract" Apr 17 16:59:11.166551 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.166409 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="fdc892cd-8406-40be-9f1e-aa2c0cc31e15" containerName="extract" Apr 17 16:59:11.169832 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.169806 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" Apr 17 16:59:11.173208 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.173188 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wj58v\"" Apr 17 16:59:11.173335 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.173213 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 16:59:11.174397 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.174364 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 16:59:11.178180 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.178157 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx"] Apr 17 16:59:11.267988 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.267962 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx\" (UID: \"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" Apr 17 16:59:11.268108 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.267995 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx\" (UID: \"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" Apr 17 16:59:11.268108 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.268020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sm2g\" (UniqueName: \"kubernetes.io/projected/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-kube-api-access-7sm2g\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx\" (UID: \"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" Apr 17 16:59:11.369244 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.369218 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx\" (UID: \"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" Apr 17 16:59:11.369357 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.369250 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx\" (UID: \"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" Apr 17 16:59:11.369357 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.369270 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sm2g\" (UniqueName: \"kubernetes.io/projected/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-kube-api-access-7sm2g\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx\" (UID: \"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" Apr 17 16:59:11.369618 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.369581 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx\" (UID: \"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" Apr 17 16:59:11.369661 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.369633 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx\" (UID: \"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" Apr 17 16:59:11.391225 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.391206 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sm2g\" (UniqueName: 
\"kubernetes.io/projected/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-kube-api-access-7sm2g\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx\" (UID: \"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" Apr 17 16:59:11.480401 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.480339 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" Apr 17 16:59:11.595123 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.595094 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx"] Apr 17 16:59:11.598269 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:59:11.598231 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d7e99be_15ee_4f86_81d7_cf2fcfa5d4c1.slice/crio-96811fb3475342410b3e4204f5d2fd161d87631b794b1088397bf3f16a4a416e WatchSource:0}: Error finding container 96811fb3475342410b3e4204f5d2fd161d87631b794b1088397bf3f16a4a416e: Status 404 returned error can't find the container with id 96811fb3475342410b3e4204f5d2fd161d87631b794b1088397bf3f16a4a416e Apr 17 16:59:11.626002 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.625981 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-764cf74f-b48g4"] Apr 17 16:59:11.629172 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.629155 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-764cf74f-b48g4" Apr 17 16:59:11.631933 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.631911 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 17 16:59:11.632059 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.631931 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 16:59:11.632059 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.631956 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 16:59:11.632059 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.631977 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 17 16:59:11.632059 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.631917 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-zsb4c\"" Apr 17 16:59:11.640050 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.640032 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-764cf74f-b48g4"] Apr 17 16:59:11.752471 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.752408 2575 generic.go:358] "Generic (PLEG): container finished" podID="0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1" containerID="6212ce04d9a3ac09372082f763a38c61b7ed3984e289c97ac4118f0f5b7de1e5" exitCode=0 Apr 17 16:59:11.752471 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.752443 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" event={"ID":"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1","Type":"ContainerDied","Data":"6212ce04d9a3ac09372082f763a38c61b7ed3984e289c97ac4118f0f5b7de1e5"} Apr 17 
16:59:11.752471 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.752465 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" event={"ID":"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1","Type":"ContainerStarted","Data":"96811fb3475342410b3e4204f5d2fd161d87631b794b1088397bf3f16a4a416e"} Apr 17 16:59:11.772909 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.772886 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr2hv\" (UniqueName: \"kubernetes.io/projected/66a8ef8d-fd7c-4622-8ba7-c0b43472f01d-kube-api-access-pr2hv\") pod \"kube-auth-proxy-764cf74f-b48g4\" (UID: \"66a8ef8d-fd7c-4622-8ba7-c0b43472f01d\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-b48g4" Apr 17 16:59:11.773013 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.772925 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/66a8ef8d-fd7c-4622-8ba7-c0b43472f01d-tls-certs\") pod \"kube-auth-proxy-764cf74f-b48g4\" (UID: \"66a8ef8d-fd7c-4622-8ba7-c0b43472f01d\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-b48g4" Apr 17 16:59:11.773013 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.772958 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/66a8ef8d-fd7c-4622-8ba7-c0b43472f01d-tmp\") pod \"kube-auth-proxy-764cf74f-b48g4\" (UID: \"66a8ef8d-fd7c-4622-8ba7-c0b43472f01d\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-b48g4" Apr 17 16:59:11.874195 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.874167 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/66a8ef8d-fd7c-4622-8ba7-c0b43472f01d-tls-certs\") pod \"kube-auth-proxy-764cf74f-b48g4\" (UID: 
\"66a8ef8d-fd7c-4622-8ba7-c0b43472f01d\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-b48g4" Apr 17 16:59:11.874304 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.874201 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/66a8ef8d-fd7c-4622-8ba7-c0b43472f01d-tmp\") pod \"kube-auth-proxy-764cf74f-b48g4\" (UID: \"66a8ef8d-fd7c-4622-8ba7-c0b43472f01d\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-b48g4" Apr 17 16:59:11.874304 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.874247 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pr2hv\" (UniqueName: \"kubernetes.io/projected/66a8ef8d-fd7c-4622-8ba7-c0b43472f01d-kube-api-access-pr2hv\") pod \"kube-auth-proxy-764cf74f-b48g4\" (UID: \"66a8ef8d-fd7c-4622-8ba7-c0b43472f01d\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-b48g4" Apr 17 16:59:11.876363 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.876341 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/66a8ef8d-fd7c-4622-8ba7-c0b43472f01d-tmp\") pod \"kube-auth-proxy-764cf74f-b48g4\" (UID: \"66a8ef8d-fd7c-4622-8ba7-c0b43472f01d\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-b48g4" Apr 17 16:59:11.876587 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.876569 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/66a8ef8d-fd7c-4622-8ba7-c0b43472f01d-tls-certs\") pod \"kube-auth-proxy-764cf74f-b48g4\" (UID: \"66a8ef8d-fd7c-4622-8ba7-c0b43472f01d\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-b48g4" Apr 17 16:59:11.881866 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.881843 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr2hv\" (UniqueName: 
\"kubernetes.io/projected/66a8ef8d-fd7c-4622-8ba7-c0b43472f01d-kube-api-access-pr2hv\") pod \"kube-auth-proxy-764cf74f-b48g4\" (UID: \"66a8ef8d-fd7c-4622-8ba7-c0b43472f01d\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-b48g4" Apr 17 16:59:11.942687 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:11.942666 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-764cf74f-b48g4" Apr 17 16:59:12.052425 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:12.052402 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-764cf74f-b48g4"] Apr 17 16:59:12.054246 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:59:12.054221 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66a8ef8d_fd7c_4622_8ba7_c0b43472f01d.slice/crio-2e160acd2d2709aced80fadaf45f479a89b0d456ba304f8e749a648e48779b09 WatchSource:0}: Error finding container 2e160acd2d2709aced80fadaf45f479a89b0d456ba304f8e749a648e48779b09: Status 404 returned error can't find the container with id 2e160acd2d2709aced80fadaf45f479a89b0d456ba304f8e749a648e48779b09 Apr 17 16:59:12.758077 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:12.757998 2575 generic.go:358] "Generic (PLEG): container finished" podID="0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1" containerID="47b54111a837fdf75fc7f332ef093ccfc38af32f47faaa4e0a835ee34c74c8c7" exitCode=0 Apr 17 16:59:12.758569 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:12.758112 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" event={"ID":"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1","Type":"ContainerDied","Data":"47b54111a837fdf75fc7f332ef093ccfc38af32f47faaa4e0a835ee34c74c8c7"} Apr 17 16:59:12.759351 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:12.759324 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/kube-auth-proxy-764cf74f-b48g4" event={"ID":"66a8ef8d-fd7c-4622-8ba7-c0b43472f01d","Type":"ContainerStarted","Data":"2e160acd2d2709aced80fadaf45f479a89b0d456ba304f8e749a648e48779b09"} Apr 17 16:59:13.767400 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:13.767359 2575 generic.go:358] "Generic (PLEG): container finished" podID="0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1" containerID="ef0b8e990ae810bf4c0bbb7faac2705536f99aefa4d6820b5c15491ffd3b3cc2" exitCode=0 Apr 17 16:59:13.767845 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:13.767444 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" event={"ID":"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1","Type":"ContainerDied","Data":"ef0b8e990ae810bf4c0bbb7faac2705536f99aefa4d6820b5c15491ffd3b3cc2"} Apr 17 16:59:14.443511 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:14.443478 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-t275s"] Apr 17 16:59:14.446995 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:14.446973 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-t275s" Apr 17 16:59:14.450501 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:14.450477 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 17 16:59:14.450626 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:14.450482 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-99ck6\"" Apr 17 16:59:14.458630 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:14.458609 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-t275s"] Apr 17 16:59:14.598306 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:14.598274 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fcf4e20-1636-4dcb-8347-e21b93185bba-cert\") pod \"odh-model-controller-858dbf95b8-t275s\" (UID: \"8fcf4e20-1636-4dcb-8347-e21b93185bba\") " pod="opendatahub/odh-model-controller-858dbf95b8-t275s" Apr 17 16:59:14.598476 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:14.598333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8v9s\" (UniqueName: \"kubernetes.io/projected/8fcf4e20-1636-4dcb-8347-e21b93185bba-kube-api-access-l8v9s\") pod \"odh-model-controller-858dbf95b8-t275s\" (UID: \"8fcf4e20-1636-4dcb-8347-e21b93185bba\") " pod="opendatahub/odh-model-controller-858dbf95b8-t275s" Apr 17 16:59:14.699623 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:14.699526 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8v9s\" (UniqueName: \"kubernetes.io/projected/8fcf4e20-1636-4dcb-8347-e21b93185bba-kube-api-access-l8v9s\") pod \"odh-model-controller-858dbf95b8-t275s\" (UID: \"8fcf4e20-1636-4dcb-8347-e21b93185bba\") " 
pod="opendatahub/odh-model-controller-858dbf95b8-t275s" Apr 17 16:59:14.699623 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:14.699604 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fcf4e20-1636-4dcb-8347-e21b93185bba-cert\") pod \"odh-model-controller-858dbf95b8-t275s\" (UID: \"8fcf4e20-1636-4dcb-8347-e21b93185bba\") " pod="opendatahub/odh-model-controller-858dbf95b8-t275s" Apr 17 16:59:14.699869 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:59:14.699698 2575 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 16:59:14.699869 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:59:14.699761 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fcf4e20-1636-4dcb-8347-e21b93185bba-cert podName:8fcf4e20-1636-4dcb-8347-e21b93185bba nodeName:}" failed. No retries permitted until 2026-04-17 16:59:15.19974013 +0000 UTC m=+406.108467208 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fcf4e20-1636-4dcb-8347-e21b93185bba-cert") pod "odh-model-controller-858dbf95b8-t275s" (UID: "8fcf4e20-1636-4dcb-8347-e21b93185bba") : secret "odh-model-controller-webhook-cert" not found Apr 17 16:59:14.708951 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:14.708922 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8v9s\" (UniqueName: \"kubernetes.io/projected/8fcf4e20-1636-4dcb-8347-e21b93185bba-kube-api-access-l8v9s\") pod \"odh-model-controller-858dbf95b8-t275s\" (UID: \"8fcf4e20-1636-4dcb-8347-e21b93185bba\") " pod="opendatahub/odh-model-controller-858dbf95b8-t275s" Apr 17 16:59:14.988252 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:14.988231 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" Apr 17 16:59:15.102518 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.102496 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sm2g\" (UniqueName: \"kubernetes.io/projected/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-kube-api-access-7sm2g\") pod \"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1\" (UID: \"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1\") " Apr 17 16:59:15.102647 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.102539 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-util\") pod \"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1\" (UID: \"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1\") " Apr 17 16:59:15.102647 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.102577 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-bundle\") pod \"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1\" (UID: \"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1\") " Apr 17 16:59:15.103695 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.103671 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-bundle" (OuterVolumeSpecName: "bundle") pod "0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1" (UID: "0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:59:15.104411 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.104390 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-kube-api-access-7sm2g" (OuterVolumeSpecName: "kube-api-access-7sm2g") pod "0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1" (UID: "0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1"). InnerVolumeSpecName "kube-api-access-7sm2g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:59:15.111037 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.111015 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-util" (OuterVolumeSpecName: "util") pod "0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1" (UID: "0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:59:15.203388 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.203352 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fcf4e20-1636-4dcb-8347-e21b93185bba-cert\") pod \"odh-model-controller-858dbf95b8-t275s\" (UID: \"8fcf4e20-1636-4dcb-8347-e21b93185bba\") " pod="opendatahub/odh-model-controller-858dbf95b8-t275s" Apr 17 16:59:15.203494 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.203439 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7sm2g\" (UniqueName: \"kubernetes.io/projected/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-kube-api-access-7sm2g\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\"" Apr 17 16:59:15.203494 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.203453 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-util\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\"" Apr 17 
16:59:15.203494 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.203462 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1-bundle\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\"" Apr 17 16:59:15.205489 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.205472 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fcf4e20-1636-4dcb-8347-e21b93185bba-cert\") pod \"odh-model-controller-858dbf95b8-t275s\" (UID: \"8fcf4e20-1636-4dcb-8347-e21b93185bba\") " pod="opendatahub/odh-model-controller-858dbf95b8-t275s" Apr 17 16:59:15.359401 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.359375 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-t275s" Apr 17 16:59:15.471744 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.471720 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-t275s"] Apr 17 16:59:15.473760 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:59:15.473735 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fcf4e20_1636_4dcb_8347_e21b93185bba.slice/crio-e8748e8ecc23acce02a4bfa2f5d340d090e4c3c8a2ffa8793811eb3c1bddab77 WatchSource:0}: Error finding container e8748e8ecc23acce02a4bfa2f5d340d090e4c3c8a2ffa8793811eb3c1bddab77: Status 404 returned error can't find the container with id e8748e8ecc23acce02a4bfa2f5d340d090e4c3c8a2ffa8793811eb3c1bddab77 Apr 17 16:59:15.775827 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.775756 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" Apr 17 16:59:15.775981 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.775755 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wcsjx" event={"ID":"0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1","Type":"ContainerDied","Data":"96811fb3475342410b3e4204f5d2fd161d87631b794b1088397bf3f16a4a416e"} Apr 17 16:59:15.775981 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.775861 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96811fb3475342410b3e4204f5d2fd161d87631b794b1088397bf3f16a4a416e" Apr 17 16:59:15.777190 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.777162 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-764cf74f-b48g4" event={"ID":"66a8ef8d-fd7c-4622-8ba7-c0b43472f01d","Type":"ContainerStarted","Data":"0bca5208a82533ff1e1f7692a78ac457ed7a33d6688db326899241d8c090f14b"} Apr 17 16:59:15.778402 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.778381 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-t275s" event={"ID":"8fcf4e20-1636-4dcb-8347-e21b93185bba","Type":"ContainerStarted","Data":"e8748e8ecc23acce02a4bfa2f5d340d090e4c3c8a2ffa8793811eb3c1bddab77"} Apr 17 16:59:15.800335 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:15.800296 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-764cf74f-b48g4" podStartSLOduration=1.821981363 podStartE2EDuration="4.800283581s" podCreationTimestamp="2026-04-17 16:59:11 +0000 UTC" firstStartedPulling="2026-04-17 16:59:12.055972361 +0000 UTC m=+402.964699431" lastFinishedPulling="2026-04-17 16:59:15.034274575 +0000 UTC m=+405.943001649" observedRunningTime="2026-04-17 16:59:15.799479347 +0000 UTC m=+406.708206441" 
watchObservedRunningTime="2026-04-17 16:59:15.800283581 +0000 UTC m=+406.709010697" Apr 17 16:59:18.790239 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:18.790208 2575 generic.go:358] "Generic (PLEG): container finished" podID="8fcf4e20-1636-4dcb-8347-e21b93185bba" containerID="a1d633a4b549fbd3c9f5e498e76096cc116581ae3751fed9a340071c1a75b804" exitCode=1 Apr 17 16:59:18.790572 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:18.790270 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-t275s" event={"ID":"8fcf4e20-1636-4dcb-8347-e21b93185bba","Type":"ContainerDied","Data":"a1d633a4b549fbd3c9f5e498e76096cc116581ae3751fed9a340071c1a75b804"} Apr 17 16:59:18.790572 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:18.790450 2575 scope.go:117] "RemoveContainer" containerID="a1d633a4b549fbd3c9f5e498e76096cc116581ae3751fed9a340071c1a75b804" Apr 17 16:59:19.794605 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:19.794550 2575 generic.go:358] "Generic (PLEG): container finished" podID="8fcf4e20-1636-4dcb-8347-e21b93185bba" containerID="9f3c2dfd70acdc32a59612d8a1f091ffe4f0d316b32964b29125a53cbc871d51" exitCode=1 Apr 17 16:59:19.794605 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:19.794607 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-t275s" event={"ID":"8fcf4e20-1636-4dcb-8347-e21b93185bba","Type":"ContainerDied","Data":"9f3c2dfd70acdc32a59612d8a1f091ffe4f0d316b32964b29125a53cbc871d51"} Apr 17 16:59:19.795070 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:19.794641 2575 scope.go:117] "RemoveContainer" containerID="a1d633a4b549fbd3c9f5e498e76096cc116581ae3751fed9a340071c1a75b804" Apr 17 16:59:19.795070 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:19.794823 2575 scope.go:117] "RemoveContainer" containerID="9f3c2dfd70acdc32a59612d8a1f091ffe4f0d316b32964b29125a53cbc871d51" Apr 17 16:59:19.795070 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:59:19.794992 2575 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-t275s_opendatahub(8fcf4e20-1636-4dcb-8347-e21b93185bba)\"" pod="opendatahub/odh-model-controller-858dbf95b8-t275s" podUID="8fcf4e20-1636-4dcb-8347-e21b93185bba" Apr 17 16:59:20.320418 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.320384 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk"] Apr 17 16:59:20.320676 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.320664 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1" containerName="extract" Apr 17 16:59:20.320722 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.320678 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1" containerName="extract" Apr 17 16:59:20.320722 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.320688 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1" containerName="util" Apr 17 16:59:20.320722 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.320693 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1" containerName="util" Apr 17 16:59:20.320722 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.320703 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1" containerName="pull" Apr 17 16:59:20.320722 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.320709 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1" containerName="pull" Apr 17 16:59:20.320866 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.320755 2575 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="0d7e99be-15ee-4f86-81d7-cf2fcfa5d4c1" containerName="extract" Apr 17 16:59:20.324957 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.324935 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk" Apr 17 16:59:20.328340 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.328323 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wj58v\"" Apr 17 16:59:20.329485 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.329470 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 16:59:20.329563 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.329549 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 16:59:20.343251 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.343229 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk"] Apr 17 16:59:20.442479 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.442447 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk\" (UID: \"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk" Apr 17 16:59:20.442663 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.442488 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-util\") 
pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk\" (UID: \"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk" Apr 17 16:59:20.442663 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.442561 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqd92\" (UniqueName: \"kubernetes.io/projected/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-kube-api-access-mqd92\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk\" (UID: \"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk" Apr 17 16:59:20.543789 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.543753 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk\" (UID: \"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk" Apr 17 16:59:20.543928 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.543809 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk\" (UID: \"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk" Apr 17 16:59:20.543928 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.543873 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqd92\" (UniqueName: \"kubernetes.io/projected/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-kube-api-access-mqd92\") pod 
\"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk\" (UID: \"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk" Apr 17 16:59:20.544164 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.544142 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk\" (UID: \"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk" Apr 17 16:59:20.544164 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.544158 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk\" (UID: \"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk" Apr 17 16:59:20.581696 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.581635 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqd92\" (UniqueName: \"kubernetes.io/projected/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-kube-api-access-mqd92\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk\" (UID: \"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk" Apr 17 16:59:20.633479 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.633456 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk" Apr 17 16:59:20.766974 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.766950 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk"] Apr 17 16:59:20.768929 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:59:20.768906 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f3fc9c_8f5b_43a6_a37e_596034e5fabb.slice/crio-2d4584dfe9d20f0c42445aeed7b2cb9a659e454562902d8cd069a946a775b8dd WatchSource:0}: Error finding container 2d4584dfe9d20f0c42445aeed7b2cb9a659e454562902d8cd069a946a775b8dd: Status 404 returned error can't find the container with id 2d4584dfe9d20f0c42445aeed7b2cb9a659e454562902d8cd069a946a775b8dd Apr 17 16:59:20.798454 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.798428 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk" event={"ID":"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb","Type":"ContainerStarted","Data":"2d4584dfe9d20f0c42445aeed7b2cb9a659e454562902d8cd069a946a775b8dd"} Apr 17 16:59:20.800219 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:20.800201 2575 scope.go:117] "RemoveContainer" containerID="9f3c2dfd70acdc32a59612d8a1f091ffe4f0d316b32964b29125a53cbc871d51" Apr 17 16:59:20.800419 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:59:20.800401 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-t275s_opendatahub(8fcf4e20-1636-4dcb-8347-e21b93185bba)\"" pod="opendatahub/odh-model-controller-858dbf95b8-t275s" podUID="8fcf4e20-1636-4dcb-8347-e21b93185bba" Apr 17 16:59:21.803919 ip-10-0-132-199 
kubenswrapper[2575]: I0417 16:59:21.803883 2575 generic.go:358] "Generic (PLEG): container finished" podID="e1f3fc9c-8f5b-43a6-a37e-596034e5fabb" containerID="75d1e3c8a2c6525d0021dc5254701abd7cacf72f3b7745c311b2d86c4db17e5d" exitCode=0 Apr 17 16:59:21.803919 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:21.803919 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk" event={"ID":"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb","Type":"ContainerDied","Data":"75d1e3c8a2c6525d0021dc5254701abd7cacf72f3b7745c311b2d86c4db17e5d"} Apr 17 16:59:22.022423 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:22.022353 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-7d55q"] Apr 17 16:59:22.025403 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:22.025388 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-7d55q" Apr 17 16:59:22.028152 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:22.028133 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 17 16:59:22.028236 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:22.028218 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-7kgsz\"" Apr 17 16:59:22.034802 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:22.034782 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-7d55q"] Apr 17 16:59:22.155905 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:22.155875 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b919bc69-0ed5-4ad4-a4d1-817ce3f80916-cert\") pod \"kserve-controller-manager-856948b99f-7d55q\" (UID: 
\"b919bc69-0ed5-4ad4-a4d1-817ce3f80916\") " pod="opendatahub/kserve-controller-manager-856948b99f-7d55q" Apr 17 16:59:22.155905 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:22.155908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cm2t\" (UniqueName: \"kubernetes.io/projected/b919bc69-0ed5-4ad4-a4d1-817ce3f80916-kube-api-access-7cm2t\") pod \"kserve-controller-manager-856948b99f-7d55q\" (UID: \"b919bc69-0ed5-4ad4-a4d1-817ce3f80916\") " pod="opendatahub/kserve-controller-manager-856948b99f-7d55q" Apr 17 16:59:22.256241 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:22.256207 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b919bc69-0ed5-4ad4-a4d1-817ce3f80916-cert\") pod \"kserve-controller-manager-856948b99f-7d55q\" (UID: \"b919bc69-0ed5-4ad4-a4d1-817ce3f80916\") " pod="opendatahub/kserve-controller-manager-856948b99f-7d55q" Apr 17 16:59:22.256241 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:22.256240 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cm2t\" (UniqueName: \"kubernetes.io/projected/b919bc69-0ed5-4ad4-a4d1-817ce3f80916-kube-api-access-7cm2t\") pod \"kserve-controller-manager-856948b99f-7d55q\" (UID: \"b919bc69-0ed5-4ad4-a4d1-817ce3f80916\") " pod="opendatahub/kserve-controller-manager-856948b99f-7d55q" Apr 17 16:59:22.256408 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:59:22.256354 2575 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 17 16:59:22.256462 ip-10-0-132-199 kubenswrapper[2575]: E0417 16:59:22.256415 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b919bc69-0ed5-4ad4-a4d1-817ce3f80916-cert podName:b919bc69-0ed5-4ad4-a4d1-817ce3f80916 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:59:22.7563981 +0000 UTC m=+413.665125172 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b919bc69-0ed5-4ad4-a4d1-817ce3f80916-cert") pod "kserve-controller-manager-856948b99f-7d55q" (UID: "b919bc69-0ed5-4ad4-a4d1-817ce3f80916") : secret "kserve-webhook-server-cert" not found Apr 17 16:59:22.277403 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:22.277345 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cm2t\" (UniqueName: \"kubernetes.io/projected/b919bc69-0ed5-4ad4-a4d1-817ce3f80916-kube-api-access-7cm2t\") pod \"kserve-controller-manager-856948b99f-7d55q\" (UID: \"b919bc69-0ed5-4ad4-a4d1-817ce3f80916\") " pod="opendatahub/kserve-controller-manager-856948b99f-7d55q" Apr 17 16:59:22.761638 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:22.761585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b919bc69-0ed5-4ad4-a4d1-817ce3f80916-cert\") pod \"kserve-controller-manager-856948b99f-7d55q\" (UID: \"b919bc69-0ed5-4ad4-a4d1-817ce3f80916\") " pod="opendatahub/kserve-controller-manager-856948b99f-7d55q" Apr 17 16:59:22.763895 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:22.763877 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b919bc69-0ed5-4ad4-a4d1-817ce3f80916-cert\") pod \"kserve-controller-manager-856948b99f-7d55q\" (UID: \"b919bc69-0ed5-4ad4-a4d1-817ce3f80916\") " pod="opendatahub/kserve-controller-manager-856948b99f-7d55q" Apr 17 16:59:22.935490 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:22.935456 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-7d55q" Apr 17 16:59:23.060548 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:23.060515 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-7d55q"] Apr 17 16:59:23.061650 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:59:23.061622 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb919bc69_0ed5_4ad4_a4d1_817ce3f80916.slice/crio-1f05271316734563253c8fb2b0819a02612b61d5922da4a56aee4a3b9bb48d3f WatchSource:0}: Error finding container 1f05271316734563253c8fb2b0819a02612b61d5922da4a56aee4a3b9bb48d3f: Status 404 returned error can't find the container with id 1f05271316734563253c8fb2b0819a02612b61d5922da4a56aee4a3b9bb48d3f Apr 17 16:59:23.146527 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:23.146503 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-sp7kl"] Apr 17 16:59:23.150817 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:23.150802 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sp7kl" Apr 17 16:59:23.153368 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:23.153346 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-z55n9\"" Apr 17 16:59:23.153457 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:23.153371 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 17 16:59:23.153457 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:23.153444 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 17 16:59:23.163330 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:23.163302 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-sp7kl"] Apr 17 16:59:23.266309 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:23.266277 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/db35f413-97a7-42e4-9a49-0dc171e024b1-operator-config\") pod \"servicemesh-operator3-55f49c5f94-sp7kl\" (UID: \"db35f413-97a7-42e4-9a49-0dc171e024b1\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sp7kl" Apr 17 16:59:23.266464 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:23.266323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvls8\" (UniqueName: \"kubernetes.io/projected/db35f413-97a7-42e4-9a49-0dc171e024b1-kube-api-access-wvls8\") pod \"servicemesh-operator3-55f49c5f94-sp7kl\" (UID: \"db35f413-97a7-42e4-9a49-0dc171e024b1\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sp7kl" Apr 17 16:59:23.367564 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:23.367491 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/db35f413-97a7-42e4-9a49-0dc171e024b1-operator-config\") pod \"servicemesh-operator3-55f49c5f94-sp7kl\" (UID: \"db35f413-97a7-42e4-9a49-0dc171e024b1\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sp7kl" Apr 17 16:59:23.367564 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:23.367538 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvls8\" (UniqueName: \"kubernetes.io/projected/db35f413-97a7-42e4-9a49-0dc171e024b1-kube-api-access-wvls8\") pod \"servicemesh-operator3-55f49c5f94-sp7kl\" (UID: \"db35f413-97a7-42e4-9a49-0dc171e024b1\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sp7kl" Apr 17 16:59:23.370159 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:23.370134 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/db35f413-97a7-42e4-9a49-0dc171e024b1-operator-config\") pod \"servicemesh-operator3-55f49c5f94-sp7kl\" (UID: \"db35f413-97a7-42e4-9a49-0dc171e024b1\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sp7kl" Apr 17 16:59:23.378249 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:23.378225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvls8\" (UniqueName: \"kubernetes.io/projected/db35f413-97a7-42e4-9a49-0dc171e024b1-kube-api-access-wvls8\") pod \"servicemesh-operator3-55f49c5f94-sp7kl\" (UID: \"db35f413-97a7-42e4-9a49-0dc171e024b1\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sp7kl" Apr 17 16:59:23.465996 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:23.465974 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sp7kl" Apr 17 16:59:23.584582 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:23.584561 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-sp7kl"] Apr 17 16:59:23.586256 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:59:23.586223 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb35f413_97a7_42e4_9a49_0dc171e024b1.slice/crio-9c106d24b776deaa902cfcbe14d6ff688e755a3431ce9742096777ae0e0ae823 WatchSource:0}: Error finding container 9c106d24b776deaa902cfcbe14d6ff688e755a3431ce9742096777ae0e0ae823: Status 404 returned error can't find the container with id 9c106d24b776deaa902cfcbe14d6ff688e755a3431ce9742096777ae0e0ae823 Apr 17 16:59:23.812253 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:23.812211 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-7d55q" event={"ID":"b919bc69-0ed5-4ad4-a4d1-817ce3f80916","Type":"ContainerStarted","Data":"1f05271316734563253c8fb2b0819a02612b61d5922da4a56aee4a3b9bb48d3f"} Apr 17 16:59:23.813442 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:23.813415 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sp7kl" event={"ID":"db35f413-97a7-42e4-9a49-0dc171e024b1","Type":"ContainerStarted","Data":"9c106d24b776deaa902cfcbe14d6ff688e755a3431ce9742096777ae0e0ae823"} Apr 17 16:59:25.359828 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:25.359793 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-t275s" Apr 17 16:59:25.360256 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:25.360176 2575 scope.go:117] "RemoveContainer" containerID="9f3c2dfd70acdc32a59612d8a1f091ffe4f0d316b32964b29125a53cbc871d51" Apr 17 16:59:25.360358 
ip-10-0-132-199 kubenswrapper[2575]: E0417 16:59:25.360338 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-t275s_opendatahub(8fcf4e20-1636-4dcb-8347-e21b93185bba)\"" pod="opendatahub/odh-model-controller-858dbf95b8-t275s" podUID="8fcf4e20-1636-4dcb-8347-e21b93185bba"
Apr 17 16:59:26.825214 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:26.825179 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sp7kl" event={"ID":"db35f413-97a7-42e4-9a49-0dc171e024b1","Type":"ContainerStarted","Data":"79993d0480a8ee662a97c5ca8f017fedc69362e5a7b0dbbd5295568443a07f82"}
Apr 17 16:59:26.825650 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:26.825226 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sp7kl"
Apr 17 16:59:26.826526 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:26.826508 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-7d55q" event={"ID":"b919bc69-0ed5-4ad4-a4d1-817ce3f80916","Type":"ContainerStarted","Data":"87339f57030f74f03fa7988dd16b959bf79970f5698f1cd2f24820880fe00bf7"}
Apr 17 16:59:26.826631 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:26.826619 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-7d55q"
Apr 17 16:59:26.845873 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:26.845827 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sp7kl" podStartSLOduration=1.1419362180000001 podStartE2EDuration="3.845812998s" podCreationTimestamp="2026-04-17 16:59:23 +0000 UTC" firstStartedPulling="2026-04-17 16:59:23.589354646 +0000 UTC m=+414.498081717" lastFinishedPulling="2026-04-17 16:59:26.293231426 +0000 UTC m=+417.201958497" observedRunningTime="2026-04-17 16:59:26.843773831 +0000 UTC m=+417.752500923" watchObservedRunningTime="2026-04-17 16:59:26.845812998 +0000 UTC m=+417.754540091"
Apr 17 16:59:26.860174 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:26.860136 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-7d55q" podStartSLOduration=2.633588769 podStartE2EDuration="5.860124156s" podCreationTimestamp="2026-04-17 16:59:21 +0000 UTC" firstStartedPulling="2026-04-17 16:59:23.06296727 +0000 UTC m=+413.971694341" lastFinishedPulling="2026-04-17 16:59:26.289502639 +0000 UTC m=+417.198229728" observedRunningTime="2026-04-17 16:59:26.858491703 +0000 UTC m=+417.767218798" watchObservedRunningTime="2026-04-17 16:59:26.860124156 +0000 UTC m=+417.768851248"
Apr 17 16:59:35.359849 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:35.359817 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-t275s"
Apr 17 16:59:35.360313 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:35.360173 2575 scope.go:117] "RemoveContainer" containerID="9f3c2dfd70acdc32a59612d8a1f091ffe4f0d316b32964b29125a53cbc871d51"
Apr 17 16:59:35.857049 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:35.857013 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-t275s" event={"ID":"8fcf4e20-1636-4dcb-8347-e21b93185bba","Type":"ContainerStarted","Data":"2e83f105c6683ac1d02fc338741c0cc355ef2adb7c82a5fbf481c5372266c1c0"}
Apr 17 16:59:35.857225 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:35.857205 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-t275s"
Apr 17 16:59:35.879622 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:35.879562 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-t275s" podStartSLOduration=1.716087554 podStartE2EDuration="21.87955004s" podCreationTimestamp="2026-04-17 16:59:14 +0000 UTC" firstStartedPulling="2026-04-17 16:59:15.474972282 +0000 UTC m=+406.383699354" lastFinishedPulling="2026-04-17 16:59:35.638434766 +0000 UTC m=+426.547161840" observedRunningTime="2026-04-17 16:59:35.877140831 +0000 UTC m=+426.785867926" watchObservedRunningTime="2026-04-17 16:59:35.87955004 +0000 UTC m=+426.788277133"
Apr 17 16:59:36.862166 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:36.862129 2575 generic.go:358] "Generic (PLEG): container finished" podID="e1f3fc9c-8f5b-43a6-a37e-596034e5fabb" containerID="9a8970ea5fe92bdbf69e7dc3aafb6d769dce116ee022b9600c8ff1f46b4d7bd5" exitCode=0
Apr 17 16:59:36.862578 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:36.862213 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk" event={"ID":"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb","Type":"ContainerDied","Data":"9a8970ea5fe92bdbf69e7dc3aafb6d769dce116ee022b9600c8ff1f46b4d7bd5"}
Apr 17 16:59:37.832244 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:37.832214 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sp7kl"
Apr 17 16:59:37.868483 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:37.868451 2575 generic.go:358] "Generic (PLEG): container finished" podID="e1f3fc9c-8f5b-43a6-a37e-596034e5fabb" containerID="b195590d63eb519e21aabc73d17e14b6c804bb1fffd09dc44d3ac4fa1d98933b" exitCode=0
Apr 17 16:59:37.868918 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:37.868546 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk" event={"ID":"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb","Type":"ContainerDied","Data":"b195590d63eb519e21aabc73d17e14b6c804bb1fffd09dc44d3ac4fa1d98933b"}
Apr 17 16:59:38.994005 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:38.993982 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk"
Apr 17 16:59:39.192575 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:39.192496 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-bundle\") pod \"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb\" (UID: \"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb\") "
Apr 17 16:59:39.192575 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:39.192535 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-util\") pod \"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb\" (UID: \"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb\") "
Apr 17 16:59:39.192773 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:39.192587 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqd92\" (UniqueName: \"kubernetes.io/projected/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-kube-api-access-mqd92\") pod \"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb\" (UID: \"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb\") "
Apr 17 16:59:39.193473 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:39.193451 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-bundle" (OuterVolumeSpecName: "bundle") pod "e1f3fc9c-8f5b-43a6-a37e-596034e5fabb" (UID: "e1f3fc9c-8f5b-43a6-a37e-596034e5fabb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:59:39.194806 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:39.194776 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-kube-api-access-mqd92" (OuterVolumeSpecName: "kube-api-access-mqd92") pod "e1f3fc9c-8f5b-43a6-a37e-596034e5fabb" (UID: "e1f3fc9c-8f5b-43a6-a37e-596034e5fabb"). InnerVolumeSpecName "kube-api-access-mqd92". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:59:39.197452 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:39.197427 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-util" (OuterVolumeSpecName: "util") pod "e1f3fc9c-8f5b-43a6-a37e-596034e5fabb" (UID: "e1f3fc9c-8f5b-43a6-a37e-596034e5fabb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:59:39.293822 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:39.293782 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mqd92\" (UniqueName: \"kubernetes.io/projected/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-kube-api-access-mqd92\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\""
Apr 17 16:59:39.293822 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:39.293812 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-bundle\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\""
Apr 17 16:59:39.293822 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:39.293825 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1f3fc9c-8f5b-43a6-a37e-596034e5fabb-util\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\""
Apr 17 16:59:39.877378 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:39.877279 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk" event={"ID":"e1f3fc9c-8f5b-43a6-a37e-596034e5fabb","Type":"ContainerDied","Data":"2d4584dfe9d20f0c42445aeed7b2cb9a659e454562902d8cd069a946a775b8dd"}
Apr 17 16:59:39.877378 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:39.877323 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d4584dfe9d20f0c42445aeed7b2cb9a659e454562902d8cd069a946a775b8dd"
Apr 17 16:59:39.877378 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:39.877285 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2c9xlk"
Apr 17 16:59:46.864542 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:46.864514 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-t275s"
Apr 17 16:59:47.735680 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.735641 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"]
Apr 17 16:59:47.736059 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.736042 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1f3fc9c-8f5b-43a6-a37e-596034e5fabb" containerName="util"
Apr 17 16:59:47.736125 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.736063 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f3fc9c-8f5b-43a6-a37e-596034e5fabb" containerName="util"
Apr 17 16:59:47.736125 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.736079 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1f3fc9c-8f5b-43a6-a37e-596034e5fabb" containerName="pull"
Apr 17 16:59:47.736125 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.736088 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f3fc9c-8f5b-43a6-a37e-596034e5fabb" containerName="pull"
Apr 17 16:59:47.736125 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.736103 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1f3fc9c-8f5b-43a6-a37e-596034e5fabb" containerName="extract"
Apr 17 16:59:47.736125 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.736112 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f3fc9c-8f5b-43a6-a37e-596034e5fabb" containerName="extract"
Apr 17 16:59:47.736356 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.736189 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1f3fc9c-8f5b-43a6-a37e-596034e5fabb" containerName="extract"
Apr 17 16:59:47.744289 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.744267 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.745405 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.745382 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"]
Apr 17 16:59:47.746992 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.746947 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 17 16:59:47.746992 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.746947 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 17 16:59:47.747176 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.747042 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-9b76v\""
Apr 17 16:59:47.747176 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.746958 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 17 16:59:47.747275 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.747196 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 17 16:59:47.754496 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.754476 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/149e5c65-51bb-4de9-8c6d-fa881c826d3d-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.754634 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.754524 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/149e5c65-51bb-4de9-8c6d-fa881c826d3d-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.754634 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.754549 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/149e5c65-51bb-4de9-8c6d-fa881c826d3d-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.754634 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.754578 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/149e5c65-51bb-4de9-8c6d-fa881c826d3d-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.754634 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.754618 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/149e5c65-51bb-4de9-8c6d-fa881c826d3d-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.754812 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.754645 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/149e5c65-51bb-4de9-8c6d-fa881c826d3d-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.754812 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.754668 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5cbb\" (UniqueName: \"kubernetes.io/projected/149e5c65-51bb-4de9-8c6d-fa881c826d3d-kube-api-access-f5cbb\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.854962 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.854927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/149e5c65-51bb-4de9-8c6d-fa881c826d3d-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.855128 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.854971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/149e5c65-51bb-4de9-8c6d-fa881c826d3d-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.855128 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.854998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/149e5c65-51bb-4de9-8c6d-fa881c826d3d-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.855128 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.855019 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/149e5c65-51bb-4de9-8c6d-fa881c826d3d-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.855128 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.855040 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/149e5c65-51bb-4de9-8c6d-fa881c826d3d-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.855128 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.855062 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5cbb\" (UniqueName: \"kubernetes.io/projected/149e5c65-51bb-4de9-8c6d-fa881c826d3d-kube-api-access-f5cbb\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.855128 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.855127 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/149e5c65-51bb-4de9-8c6d-fa881c826d3d-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.855700 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.855673 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/149e5c65-51bb-4de9-8c6d-fa881c826d3d-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.857081 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.857061 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/149e5c65-51bb-4de9-8c6d-fa881c826d3d-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.857389 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.857368 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/149e5c65-51bb-4de9-8c6d-fa881c826d3d-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.857467 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.857454 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/149e5c65-51bb-4de9-8c6d-fa881c826d3d-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.857612 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.857573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/149e5c65-51bb-4de9-8c6d-fa881c826d3d-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.863117 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.863094 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/149e5c65-51bb-4de9-8c6d-fa881c826d3d-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:47.863391 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:47.863372 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5cbb\" (UniqueName: \"kubernetes.io/projected/149e5c65-51bb-4de9-8c6d-fa881c826d3d-kube-api-access-f5cbb\") pod \"istiod-openshift-gateway-55ff986f96-bqjlq\" (UID: \"149e5c65-51bb-4de9-8c6d-fa881c826d3d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:48.055546 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:48.055518 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:48.185314 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:48.185285 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"]
Apr 17 16:59:48.186856 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:59:48.186830 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod149e5c65_51bb_4de9_8c6d_fa881c826d3d.slice/crio-1ec7d317e3e04e28cd2d0b6521f75d013f3984000466d9be054a1142c3cd2e4b WatchSource:0}: Error finding container 1ec7d317e3e04e28cd2d0b6521f75d013f3984000466d9be054a1142c3cd2e4b: Status 404 returned error can't find the container with id 1ec7d317e3e04e28cd2d0b6521f75d013f3984000466d9be054a1142c3cd2e4b
Apr 17 16:59:48.905517 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:48.905485 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq" event={"ID":"149e5c65-51bb-4de9-8c6d-fa881c826d3d","Type":"ContainerStarted","Data":"1ec7d317e3e04e28cd2d0b6521f75d013f3984000466d9be054a1142c3cd2e4b"}
Apr 17 16:59:50.753849 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:50.753800 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 16:59:50.754194 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:50.753874 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 16:59:50.913803 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:50.913762 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq" event={"ID":"149e5c65-51bb-4de9-8c6d-fa881c826d3d","Type":"ContainerStarted","Data":"3a8aa2bb7824c9ce5c8b3b074cdbdb3cd0dc53a73543120c238c5f8e1136a3c3"}
Apr 17 16:59:50.913993 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:50.913850 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq"
Apr 17 16:59:50.942885 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:50.942835 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq" podStartSLOduration=1.378559233 podStartE2EDuration="3.942819737s" podCreationTimestamp="2026-04-17 16:59:47 +0000 UTC" firstStartedPulling="2026-04-17 16:59:48.189310354 +0000 UTC m=+439.098037428" lastFinishedPulling="2026-04-17 16:59:50.75357085 +0000 UTC m=+441.662297932" observedRunningTime="2026-04-17 16:59:50.942758117 +0000 UTC m=+441.851485211" watchObservedRunningTime="2026-04-17 16:59:50.942819737 +0000 UTC m=+441.851546830"
Apr 17 16:59:51.920010 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:51.919974 2575 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-bqjlq container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 17 16:59:51.920438 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:51.920039 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq" podUID="149e5c65-51bb-4de9-8c6d-fa881c826d3d" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:59:52.217038 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.216301 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"]
Apr 17 16:59:52.224915 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.224873 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.227485 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.227440 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-jqc2c\""
Apr 17 16:59:52.232336 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.232309 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"]
Apr 17 16:59:52.289936 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.289903 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66lbl\" (UniqueName: \"kubernetes.io/projected/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-kube-api-access-66lbl\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.290122 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.289972 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.290122 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.289991 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.290122 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.290014 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.290122 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.290034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.290292 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.290125 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.290292 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.290175 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.290292 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.290259 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.290399 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.290301 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.390951 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.390920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.391115 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.390961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.391115 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.390996 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.391115 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.391041 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.391115 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.391071 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.391115 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.391101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66lbl\" (UniqueName: \"kubernetes.io/projected/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-kube-api-access-66lbl\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.391373 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.391151 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.391373 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.391175 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.391373 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.391227 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.391519 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.391379 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.391570 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.391533 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.391570 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.391561 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.391823 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.391804 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.391981 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.391959 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.393446 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.393424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.393634 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.393610 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.400180 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.400158 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID: \"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"
Apr 17 16:59:52.401144 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.401116 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66lbl\" (UniqueName: \"kubernetes.io/projected/2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e-kube-api-access-66lbl\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr\" (UID:
\"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr" Apr 17 16:59:52.537738 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.537708 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr" Apr 17 16:59:52.658688 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.658660 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr"] Apr 17 16:59:52.662479 ip-10-0-132-199 kubenswrapper[2575]: W0417 16:59:52.662447 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bf4061c_9fda_4fe4_bb1b_95f3a9d17f5e.slice/crio-c7038ec819771d1160d9a3f1b7185eedb5432c07d117df15439c77e83ad96f53 WatchSource:0}: Error finding container c7038ec819771d1160d9a3f1b7185eedb5432c07d117df15439c77e83ad96f53: Status 404 returned error can't find the container with id c7038ec819771d1160d9a3f1b7185eedb5432c07d117df15439c77e83ad96f53 Apr 17 16:59:52.925758 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:52.925669 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr" event={"ID":"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e","Type":"ContainerStarted","Data":"c7038ec819771d1160d9a3f1b7185eedb5432c07d117df15439c77e83ad96f53"} Apr 17 16:59:54.848673 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:54.848630 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 16:59:54.848911 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:54.848723 2575 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 16:59:54.848911 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:54.848757 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 16:59:54.919326 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:54.919306 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bqjlq" Apr 17 16:59:54.934363 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:54.934333 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr" event={"ID":"2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e","Type":"ContainerStarted","Data":"aac6911d30976ae4dac1739a3812944a1e9450e008e312746841ed7243e69867"} Apr 17 16:59:54.957567 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:54.957504 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr" podStartSLOduration=0.773590372 podStartE2EDuration="2.957484655s" podCreationTimestamp="2026-04-17 16:59:52 +0000 UTC" firstStartedPulling="2026-04-17 16:59:52.664462721 +0000 UTC m=+443.573189792" lastFinishedPulling="2026-04-17 16:59:54.848357 +0000 UTC m=+445.757084075" observedRunningTime="2026-04-17 16:59:54.956485707 +0000 UTC m=+445.865212800" watchObservedRunningTime="2026-04-17 16:59:54.957484655 +0000 UTC m=+445.866211749" Apr 17 16:59:55.538704 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:55.538667 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr" Apr 17 16:59:55.539926 ip-10-0-132-199 kubenswrapper[2575]: I0417 
16:59:55.539888 2575 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.132.0.26:15021/healthz/ready\": dial tcp 10.132.0.26:15021: connect: connection refused" start-of-body= Apr 17 16:59:55.540055 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:55.539961 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr" podUID="2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.132.0.26:15021/healthz/ready\": dial tcp 10.132.0.26:15021: connect: connection refused" Apr 17 16:59:56.538698 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:56.538665 2575 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.132.0.26:15021/healthz/ready\": dial tcp 10.132.0.26:15021: connect: connection refused" start-of-body= Apr 17 16:59:56.539033 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:56.538723 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr" podUID="2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.132.0.26:15021/healthz/ready\": dial tcp 10.132.0.26:15021: connect: connection refused" Apr 17 16:59:57.542357 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:57.542330 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr" Apr 17 16:59:57.834274 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:57.834197 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="opendatahub/kserve-controller-manager-856948b99f-7d55q" Apr 17 16:59:57.944114 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:57.944084 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr" Apr 17 16:59:57.945062 ip-10-0-132-199 kubenswrapper[2575]: I0417 16:59:57.945042 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr" Apr 17 17:00:44.967418 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:00:44.967315 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2"] Apr 17 17:00:44.970541 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:00:44.970526 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2" Apr 17 17:00:44.973985 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:00:44.973957 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 17:00:44.974109 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:00:44.974041 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 17:00:44.975222 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:00:44.975208 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-7kr24\"" Apr 17 17:00:44.979705 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:00:44.979680 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2"] Apr 17 17:00:45.015299 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:00:45.015268 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw985\" (UniqueName: \"kubernetes.io/projected/60f48352-4114-418a-af0c-718cade06231-kube-api-access-qw985\") pod \"limitador-operator-controller-manager-85c4996f8c-d88v2\" (UID: \"60f48352-4114-418a-af0c-718cade06231\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2" Apr 17 17:00:45.116335 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:00:45.116307 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw985\" (UniqueName: \"kubernetes.io/projected/60f48352-4114-418a-af0c-718cade06231-kube-api-access-qw985\") pod \"limitador-operator-controller-manager-85c4996f8c-d88v2\" (UID: \"60f48352-4114-418a-af0c-718cade06231\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2" Apr 17 17:00:45.125691 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:00:45.125660 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw985\" (UniqueName: \"kubernetes.io/projected/60f48352-4114-418a-af0c-718cade06231-kube-api-access-qw985\") pod \"limitador-operator-controller-manager-85c4996f8c-d88v2\" (UID: \"60f48352-4114-418a-af0c-718cade06231\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2" Apr 17 17:00:45.280892 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:00:45.280802 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2" Apr 17 17:00:45.404997 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:00:45.404974 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2"] Apr 17 17:00:45.407188 ip-10-0-132-199 kubenswrapper[2575]: W0417 17:00:45.407163 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60f48352_4114_418a_af0c_718cade06231.slice/crio-c63b38d7171653b44f3a76b1ab21bb9d281d5d9212cf1744def924fbb9a312ca WatchSource:0}: Error finding container c63b38d7171653b44f3a76b1ab21bb9d281d5d9212cf1744def924fbb9a312ca: Status 404 returned error can't find the container with id c63b38d7171653b44f3a76b1ab21bb9d281d5d9212cf1744def924fbb9a312ca Apr 17 17:00:46.101380 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:00:46.101346 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2" event={"ID":"60f48352-4114-418a-af0c-718cade06231","Type":"ContainerStarted","Data":"c63b38d7171653b44f3a76b1ab21bb9d281d5d9212cf1744def924fbb9a312ca"} Apr 17 17:00:48.110151 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:00:48.110118 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2" event={"ID":"60f48352-4114-418a-af0c-718cade06231","Type":"ContainerStarted","Data":"b9e127373094aa719174db332c15af6aa34897bf681f6c6e882b4457b5feac02"} Apr 17 17:00:48.110513 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:00:48.110224 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2" Apr 17 17:00:48.126645 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:00:48.126557 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2" podStartSLOduration=2.220420436 podStartE2EDuration="4.126540617s" podCreationTimestamp="2026-04-17 17:00:44 +0000 UTC" firstStartedPulling="2026-04-17 17:00:45.409050093 +0000 UTC m=+496.317777165" lastFinishedPulling="2026-04-17 17:00:47.315170254 +0000 UTC m=+498.223897346" observedRunningTime="2026-04-17 17:00:48.125908197 +0000 UTC m=+499.034635291" watchObservedRunningTime="2026-04-17 17:00:48.126540617 +0000 UTC m=+499.035267709" Apr 17 17:00:59.116198 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:00:59.116166 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2" Apr 17 17:01:00.898813 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:00.898779 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2"] Apr 17 17:01:00.899244 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:00.899069 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2" podUID="60f48352-4114-418a-af0c-718cade06231" containerName="manager" containerID="cri-o://b9e127373094aa719174db332c15af6aa34897bf681f6c6e882b4457b5feac02" gracePeriod=2 Apr 17 17:01:00.911376 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:00.911343 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2"] Apr 17 17:01:00.941733 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:00.941704 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-mvjnl"] Apr 17 17:01:00.942020 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:00.942007 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="60f48352-4114-418a-af0c-718cade06231" containerName="manager" Apr 17 17:01:00.942063 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:00.942023 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="60f48352-4114-418a-af0c-718cade06231" containerName="manager" Apr 17 17:01:00.942096 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:00.942071 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="60f48352-4114-418a-af0c-718cade06231" containerName="manager" Apr 17 17:01:00.945294 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:00.945276 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-mvjnl" Apr 17 17:01:00.955492 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:00.955459 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-mvjnl"] Apr 17 17:01:00.973025 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:00.972990 2575 status_manager.go:895] "Failed to get status for pod" podUID="60f48352-4114-418a-af0c-718cade06231" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2" err="pods \"limitador-operator-controller-manager-85c4996f8c-d88v2\" is forbidden: User \"system:node:ip-10-0-132-199.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-199.ec2.internal' and this object" Apr 17 17:01:01.038005 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.037961 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9cdv\" (UniqueName: \"kubernetes.io/projected/79dfc96d-036c-4f6a-b2fa-973d3a912d06-kube-api-access-p9cdv\") pod \"limitador-operator-controller-manager-85c4996f8c-mvjnl\" (UID: \"79dfc96d-036c-4f6a-b2fa-973d3a912d06\") " 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-mvjnl" Apr 17 17:01:01.132242 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.132221 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2" Apr 17 17:01:01.135461 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.135436 2575 status_manager.go:895] "Failed to get status for pod" podUID="60f48352-4114-418a-af0c-718cade06231" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2" err="pods \"limitador-operator-controller-manager-85c4996f8c-d88v2\" is forbidden: User \"system:node:ip-10-0-132-199.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-199.ec2.internal' and this object" Apr 17 17:01:01.138855 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.138833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9cdv\" (UniqueName: \"kubernetes.io/projected/79dfc96d-036c-4f6a-b2fa-973d3a912d06-kube-api-access-p9cdv\") pod \"limitador-operator-controller-manager-85c4996f8c-mvjnl\" (UID: \"79dfc96d-036c-4f6a-b2fa-973d3a912d06\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-mvjnl" Apr 17 17:01:01.153153 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.153080 2575 generic.go:358] "Generic (PLEG): container finished" podID="60f48352-4114-418a-af0c-718cade06231" containerID="b9e127373094aa719174db332c15af6aa34897bf681f6c6e882b4457b5feac02" exitCode=0 Apr 17 17:01:01.153153 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.153128 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2" Apr 17 17:01:01.153298 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.153174 2575 scope.go:117] "RemoveContainer" containerID="b9e127373094aa719174db332c15af6aa34897bf681f6c6e882b4457b5feac02" Apr 17 17:01:01.154141 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.154073 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9cdv\" (UniqueName: \"kubernetes.io/projected/79dfc96d-036c-4f6a-b2fa-973d3a912d06-kube-api-access-p9cdv\") pod \"limitador-operator-controller-manager-85c4996f8c-mvjnl\" (UID: \"79dfc96d-036c-4f6a-b2fa-973d3a912d06\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-mvjnl" Apr 17 17:01:01.155925 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.155896 2575 status_manager.go:895] "Failed to get status for pod" podUID="60f48352-4114-418a-af0c-718cade06231" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2" err="pods \"limitador-operator-controller-manager-85c4996f8c-d88v2\" is forbidden: User \"system:node:ip-10-0-132-199.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-199.ec2.internal' and this object" Apr 17 17:01:01.163584 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.163567 2575 scope.go:117] "RemoveContainer" containerID="b9e127373094aa719174db332c15af6aa34897bf681f6c6e882b4457b5feac02" Apr 17 17:01:01.163874 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:01:01.163856 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9e127373094aa719174db332c15af6aa34897bf681f6c6e882b4457b5feac02\": container with ID starting with b9e127373094aa719174db332c15af6aa34897bf681f6c6e882b4457b5feac02 not found: ID does not exist" 
containerID="b9e127373094aa719174db332c15af6aa34897bf681f6c6e882b4457b5feac02" Apr 17 17:01:01.163923 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.163884 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9e127373094aa719174db332c15af6aa34897bf681f6c6e882b4457b5feac02"} err="failed to get container status \"b9e127373094aa719174db332c15af6aa34897bf681f6c6e882b4457b5feac02\": rpc error: code = NotFound desc = could not find container \"b9e127373094aa719174db332c15af6aa34897bf681f6c6e882b4457b5feac02\": container with ID starting with b9e127373094aa719174db332c15af6aa34897bf681f6c6e882b4457b5feac02 not found: ID does not exist" Apr 17 17:01:01.239287 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.239254 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw985\" (UniqueName: \"kubernetes.io/projected/60f48352-4114-418a-af0c-718cade06231-kube-api-access-qw985\") pod \"60f48352-4114-418a-af0c-718cade06231\" (UID: \"60f48352-4114-418a-af0c-718cade06231\") " Apr 17 17:01:01.241406 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.241378 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60f48352-4114-418a-af0c-718cade06231-kube-api-access-qw985" (OuterVolumeSpecName: "kube-api-access-qw985") pod "60f48352-4114-418a-af0c-718cade06231" (UID: "60f48352-4114-418a-af0c-718cade06231"). InnerVolumeSpecName "kube-api-access-qw985". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:01:01.269222 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.269191 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-mvjnl" Apr 17 17:01:01.339962 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.339926 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qw985\" (UniqueName: \"kubernetes.io/projected/60f48352-4114-418a-af0c-718cade06231-kube-api-access-qw985\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\"" Apr 17 17:01:01.396845 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.396820 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-mvjnl"] Apr 17 17:01:01.398519 ip-10-0-132-199 kubenswrapper[2575]: W0417 17:01:01.398493 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79dfc96d_036c_4f6a_b2fa_973d3a912d06.slice/crio-c4b3b7ef7f9241387fc55dd4ba27b5f993b806708cdf6ab479e44b48264f69db WatchSource:0}: Error finding container c4b3b7ef7f9241387fc55dd4ba27b5f993b806708cdf6ab479e44b48264f69db: Status 404 returned error can't find the container with id c4b3b7ef7f9241387fc55dd4ba27b5f993b806708cdf6ab479e44b48264f69db Apr 17 17:01:01.466696 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.466668 2575 status_manager.go:895] "Failed to get status for pod" podUID="60f48352-4114-418a-af0c-718cade06231" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d88v2" err="pods \"limitador-operator-controller-manager-85c4996f8c-d88v2\" is forbidden: User \"system:node:ip-10-0-132-199.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-199.ec2.internal' and this object" Apr 17 17:01:01.589535 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:01.589501 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60f48352-4114-418a-af0c-718cade06231" 
path="/var/lib/kubelet/pods/60f48352-4114-418a-af0c-718cade06231/volumes" Apr 17 17:01:02.158859 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:02.158816 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-mvjnl" event={"ID":"79dfc96d-036c-4f6a-b2fa-973d3a912d06","Type":"ContainerStarted","Data":"2371d775045e3d287e80f897beb497bdc8b28b5769857e0f0ab191abdbec67a9"} Apr 17 17:01:02.159240 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:02.158864 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-mvjnl" event={"ID":"79dfc96d-036c-4f6a-b2fa-973d3a912d06","Type":"ContainerStarted","Data":"c4b3b7ef7f9241387fc55dd4ba27b5f993b806708cdf6ab479e44b48264f69db"} Apr 17 17:01:02.159240 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:02.158927 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-mvjnl" Apr 17 17:01:02.177798 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:02.177751 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-mvjnl" podStartSLOduration=2.177737313 podStartE2EDuration="2.177737313s" podCreationTimestamp="2026-04-17 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:01:02.17593189 +0000 UTC m=+513.084658984" watchObservedRunningTime="2026-04-17 17:01:02.177737313 +0000 UTC m=+513.086464406" Apr 17 17:01:13.163996 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:13.163962 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-mvjnl" Apr 17 17:01:29.385563 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.385529 2575 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr"] Apr 17 17:01:29.391911 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.391882 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.394767 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.394732 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-z75pl\"" Apr 17 17:01:29.403497 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.403404 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr"] Apr 17 17:01:29.461157 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.461124 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/45da7715-7e3e-4b70-b147-98ed5ee08f7e-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.461283 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.461166 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/45da7715-7e3e-4b70-b147-98ed5ee08f7e-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.461283 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.461211 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/45da7715-7e3e-4b70-b147-98ed5ee08f7e-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.461283 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.461244 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/45da7715-7e3e-4b70-b147-98ed5ee08f7e-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.461408 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.461283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmwss\" (UniqueName: \"kubernetes.io/projected/45da7715-7e3e-4b70-b147-98ed5ee08f7e-kube-api-access-wmwss\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.461408 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.461327 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/45da7715-7e3e-4b70-b147-98ed5ee08f7e-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.461408 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.461343 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/45da7715-7e3e-4b70-b147-98ed5ee08f7e-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.461408 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.461364 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/45da7715-7e3e-4b70-b147-98ed5ee08f7e-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.461408 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.461394 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/45da7715-7e3e-4b70-b147-98ed5ee08f7e-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.562360 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.562330 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/45da7715-7e3e-4b70-b147-98ed5ee08f7e-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.562360 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.562366 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/45da7715-7e3e-4b70-b147-98ed5ee08f7e-istio-data\") pod 
\"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.562585 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.562386 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/45da7715-7e3e-4b70-b147-98ed5ee08f7e-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.562585 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.562443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/45da7715-7e3e-4b70-b147-98ed5ee08f7e-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.562585 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.562528 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/45da7715-7e3e-4b70-b147-98ed5ee08f7e-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.562585 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.562574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/45da7715-7e3e-4b70-b147-98ed5ee08f7e-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.562814 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.562670 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmwss\" (UniqueName: \"kubernetes.io/projected/45da7715-7e3e-4b70-b147-98ed5ee08f7e-kube-api-access-wmwss\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.562814 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.562773 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/45da7715-7e3e-4b70-b147-98ed5ee08f7e-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.562814 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.562803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/45da7715-7e3e-4b70-b147-98ed5ee08f7e-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.562959 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.562826 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/45da7715-7e3e-4b70-b147-98ed5ee08f7e-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.563021 ip-10-0-132-199 
kubenswrapper[2575]: I0417 17:01:29.563001 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/45da7715-7e3e-4b70-b147-98ed5ee08f7e-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.563169 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.563146 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/45da7715-7e3e-4b70-b147-98ed5ee08f7e-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.563369 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.563344 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/45da7715-7e3e-4b70-b147-98ed5ee08f7e-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.563733 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.563708 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/45da7715-7e3e-4b70-b147-98ed5ee08f7e-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.565714 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.565687 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: 
\"kubernetes.io/downward-api/45da7715-7e3e-4b70-b147-98ed5ee08f7e-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.566038 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.566016 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/45da7715-7e3e-4b70-b147-98ed5ee08f7e-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.578724 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.578699 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/45da7715-7e3e-4b70-b147-98ed5ee08f7e-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.583606 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.583574 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmwss\" (UniqueName: \"kubernetes.io/projected/45da7715-7e3e-4b70-b147-98ed5ee08f7e-kube-api-access-wmwss\") pod \"maas-default-gateway-openshift-default-58b6f876-kvxxr\" (UID: \"45da7715-7e3e-4b70-b147-98ed5ee08f7e\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.710206 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.710120 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-z75pl\"" Apr 17 17:01:29.717942 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.717912 2575 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:29.847011 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.846987 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr"] Apr 17 17:01:29.849283 ip-10-0-132-199 kubenswrapper[2575]: W0417 17:01:29.849253 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45da7715_7e3e_4b70_b147_98ed5ee08f7e.slice/crio-7283339471c2ac575c238ef846a908696a7f372d7b067ef2139ea76fd682101f WatchSource:0}: Error finding container 7283339471c2ac575c238ef846a908696a7f372d7b067ef2139ea76fd682101f: Status 404 returned error can't find the container with id 7283339471c2ac575c238ef846a908696a7f372d7b067ef2139ea76fd682101f Apr 17 17:01:29.851496 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.851463 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 17:01:29.851670 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.851537 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 17:01:29.851670 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:29.851578 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 17:01:30.251406 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:30.251371 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" 
event={"ID":"45da7715-7e3e-4b70-b147-98ed5ee08f7e","Type":"ContainerStarted","Data":"237ff8c26b4c8e3e2d840873a6a7b32248966ec9a7acaf43a70a16d89f5e2d30"} Apr 17 17:01:30.251406 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:30.251408 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" event={"ID":"45da7715-7e3e-4b70-b147-98ed5ee08f7e","Type":"ContainerStarted","Data":"7283339471c2ac575c238ef846a908696a7f372d7b067ef2139ea76fd682101f"} Apr 17 17:01:30.273527 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:30.273482 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" podStartSLOduration=1.273468021 podStartE2EDuration="1.273468021s" podCreationTimestamp="2026-04-17 17:01:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:01:30.27087301 +0000 UTC m=+541.179600114" watchObservedRunningTime="2026-04-17 17:01:30.273468021 +0000 UTC m=+541.182195113" Apr 17 17:01:30.718788 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:30.718751 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:30.723507 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:30.723485 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:31.254609 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:31.254559 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:01:31.255434 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:01:31.255415 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-kvxxr" Apr 17 17:02:19.283257 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:19.283221 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-67d9545685-j4xpz"] Apr 17 17:02:19.286869 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:19.286852 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-67d9545685-j4xpz" Apr 17 17:02:19.289562 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:19.289539 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 17 17:02:19.289562 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:19.289552 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-7npr8\"" Apr 17 17:02:19.296176 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:19.296154 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-67d9545685-j4xpz"] Apr 17 17:02:19.350941 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:19.350905 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgzxk\" (UniqueName: \"kubernetes.io/projected/da5eb050-e9cf-4425-85ed-6686459a5571-kube-api-access-xgzxk\") pod \"maas-controller-67d9545685-j4xpz\" (UID: \"da5eb050-e9cf-4425-85ed-6686459a5571\") " pod="opendatahub/maas-controller-67d9545685-j4xpz" Apr 17 17:02:19.451874 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:19.451843 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgzxk\" (UniqueName: \"kubernetes.io/projected/da5eb050-e9cf-4425-85ed-6686459a5571-kube-api-access-xgzxk\") pod \"maas-controller-67d9545685-j4xpz\" (UID: \"da5eb050-e9cf-4425-85ed-6686459a5571\") " pod="opendatahub/maas-controller-67d9545685-j4xpz" Apr 17 17:02:19.460371 
ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:19.460348 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgzxk\" (UniqueName: \"kubernetes.io/projected/da5eb050-e9cf-4425-85ed-6686459a5571-kube-api-access-xgzxk\") pod \"maas-controller-67d9545685-j4xpz\" (UID: \"da5eb050-e9cf-4425-85ed-6686459a5571\") " pod="opendatahub/maas-controller-67d9545685-j4xpz" Apr 17 17:02:19.598328 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:19.598303 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-67d9545685-j4xpz" Apr 17 17:02:19.720874 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:19.720797 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-67d9545685-j4xpz"] Apr 17 17:02:19.723490 ip-10-0-132-199 kubenswrapper[2575]: W0417 17:02:19.723459 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda5eb050_e9cf_4425_85ed_6686459a5571.slice/crio-0f77e526ccfe2f3fe42f5c3641a548d8af85bf3c6bd50d53aee861e2488d425f WatchSource:0}: Error finding container 0f77e526ccfe2f3fe42f5c3641a548d8af85bf3c6bd50d53aee861e2488d425f: Status 404 returned error can't find the container with id 0f77e526ccfe2f3fe42f5c3641a548d8af85bf3c6bd50d53aee861e2488d425f Apr 17 17:02:20.154148 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:20.154109 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-9658c6bf9-6fkr2"] Apr 17 17:02:20.160001 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:20.159977 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-9658c6bf9-6fkr2" Apr 17 17:02:20.162758 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:20.162739 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-gmxlg\"" Apr 17 17:02:20.162933 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:20.162891 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 17 17:02:20.166666 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:20.166646 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-9658c6bf9-6fkr2"] Apr 17 17:02:20.260290 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:20.260263 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgblb\" (UniqueName: \"kubernetes.io/projected/e814a4fd-ac96-48aa-b929-3e628e56307d-kube-api-access-sgblb\") pod \"maas-api-9658c6bf9-6fkr2\" (UID: \"e814a4fd-ac96-48aa-b929-3e628e56307d\") " pod="opendatahub/maas-api-9658c6bf9-6fkr2" Apr 17 17:02:20.260441 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:20.260298 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/e814a4fd-ac96-48aa-b929-3e628e56307d-maas-api-tls\") pod \"maas-api-9658c6bf9-6fkr2\" (UID: \"e814a4fd-ac96-48aa-b929-3e628e56307d\") " pod="opendatahub/maas-api-9658c6bf9-6fkr2" Apr 17 17:02:20.361302 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:20.361213 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgblb\" (UniqueName: \"kubernetes.io/projected/e814a4fd-ac96-48aa-b929-3e628e56307d-kube-api-access-sgblb\") pod \"maas-api-9658c6bf9-6fkr2\" (UID: \"e814a4fd-ac96-48aa-b929-3e628e56307d\") " pod="opendatahub/maas-api-9658c6bf9-6fkr2" Apr 17 17:02:20.361756 ip-10-0-132-199 kubenswrapper[2575]: I0417 
17:02:20.361337 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/e814a4fd-ac96-48aa-b929-3e628e56307d-maas-api-tls\") pod \"maas-api-9658c6bf9-6fkr2\" (UID: \"e814a4fd-ac96-48aa-b929-3e628e56307d\") " pod="opendatahub/maas-api-9658c6bf9-6fkr2" Apr 17 17:02:20.364353 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:20.364326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/e814a4fd-ac96-48aa-b929-3e628e56307d-maas-api-tls\") pod \"maas-api-9658c6bf9-6fkr2\" (UID: \"e814a4fd-ac96-48aa-b929-3e628e56307d\") " pod="opendatahub/maas-api-9658c6bf9-6fkr2" Apr 17 17:02:20.369496 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:20.369475 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgblb\" (UniqueName: \"kubernetes.io/projected/e814a4fd-ac96-48aa-b929-3e628e56307d-kube-api-access-sgblb\") pod \"maas-api-9658c6bf9-6fkr2\" (UID: \"e814a4fd-ac96-48aa-b929-3e628e56307d\") " pod="opendatahub/maas-api-9658c6bf9-6fkr2" Apr 17 17:02:20.422468 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:20.422388 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-67d9545685-j4xpz" event={"ID":"da5eb050-e9cf-4425-85ed-6686459a5571","Type":"ContainerStarted","Data":"0f77e526ccfe2f3fe42f5c3641a548d8af85bf3c6bd50d53aee861e2488d425f"} Apr 17 17:02:20.471622 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:20.471571 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-9658c6bf9-6fkr2" Apr 17 17:02:20.647413 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:20.647373 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-9658c6bf9-6fkr2"] Apr 17 17:02:21.430809 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:21.430751 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-9658c6bf9-6fkr2" event={"ID":"e814a4fd-ac96-48aa-b929-3e628e56307d","Type":"ContainerStarted","Data":"4ba4b679e1559a59db729f067ad3fff2c1872969f8a9af8db6ea5f563e95d979"} Apr 17 17:02:23.464435 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:23.464336 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-9658c6bf9-6fkr2" event={"ID":"e814a4fd-ac96-48aa-b929-3e628e56307d","Type":"ContainerStarted","Data":"0c638b7e3d63488facce6321e1e6a6649838f4f291db7aeb7ff1ee35ffd94db9"} Apr 17 17:02:23.464877 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:23.464446 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-9658c6bf9-6fkr2" Apr 17 17:02:23.465763 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:23.465740 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-67d9545685-j4xpz" event={"ID":"da5eb050-e9cf-4425-85ed-6686459a5571","Type":"ContainerStarted","Data":"b1e9594cddda435f8ed67bd4c53a6c4f16695781bf2e4649391b36b79fa01b0f"} Apr 17 17:02:23.465883 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:23.465837 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-67d9545685-j4xpz" Apr 17 17:02:23.486976 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:23.486938 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-9658c6bf9-6fkr2" podStartSLOduration=0.943203776 podStartE2EDuration="3.486926854s" podCreationTimestamp="2026-04-17 17:02:20 +0000 UTC" 
firstStartedPulling="2026-04-17 17:02:20.666738124 +0000 UTC m=+591.575465205" lastFinishedPulling="2026-04-17 17:02:23.210461209 +0000 UTC m=+594.119188283" observedRunningTime="2026-04-17 17:02:23.484943955 +0000 UTC m=+594.393671049" watchObservedRunningTime="2026-04-17 17:02:23.486926854 +0000 UTC m=+594.395653979" Apr 17 17:02:23.502022 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:23.501978 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-67d9545685-j4xpz" podStartSLOduration=1.018952786 podStartE2EDuration="4.501962843s" podCreationTimestamp="2026-04-17 17:02:19 +0000 UTC" firstStartedPulling="2026-04-17 17:02:19.724790251 +0000 UTC m=+590.633517322" lastFinishedPulling="2026-04-17 17:02:23.207800308 +0000 UTC m=+594.116527379" observedRunningTime="2026-04-17 17:02:23.501580833 +0000 UTC m=+594.410307937" watchObservedRunningTime="2026-04-17 17:02:23.501962843 +0000 UTC m=+594.410689936" Apr 17 17:02:29.475506 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:29.475468 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-9658c6bf9-6fkr2" Apr 17 17:02:29.511043 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:29.511014 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/ovn-acl-logging/0.log" Apr 17 17:02:29.511334 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:29.511317 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/ovn-acl-logging/0.log" Apr 17 17:02:34.475883 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:34.475854 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-67d9545685-j4xpz" Apr 17 17:02:34.771403 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:34.771331 2575 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["opendatahub/maas-controller-65c4f594b8-n5fjs"] Apr 17 17:02:34.775522 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:34.775504 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-65c4f594b8-n5fjs" Apr 17 17:02:34.780657 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:34.780633 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-65c4f594b8-n5fjs"] Apr 17 17:02:34.878232 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:34.878191 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsd7x\" (UniqueName: \"kubernetes.io/projected/45ed88fc-2554-4821-aadc-c8e1f435e4e3-kube-api-access-lsd7x\") pod \"maas-controller-65c4f594b8-n5fjs\" (UID: \"45ed88fc-2554-4821-aadc-c8e1f435e4e3\") " pod="opendatahub/maas-controller-65c4f594b8-n5fjs" Apr 17 17:02:34.979229 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:34.979195 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsd7x\" (UniqueName: \"kubernetes.io/projected/45ed88fc-2554-4821-aadc-c8e1f435e4e3-kube-api-access-lsd7x\") pod \"maas-controller-65c4f594b8-n5fjs\" (UID: \"45ed88fc-2554-4821-aadc-c8e1f435e4e3\") " pod="opendatahub/maas-controller-65c4f594b8-n5fjs" Apr 17 17:02:34.987699 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:34.987668 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsd7x\" (UniqueName: \"kubernetes.io/projected/45ed88fc-2554-4821-aadc-c8e1f435e4e3-kube-api-access-lsd7x\") pod \"maas-controller-65c4f594b8-n5fjs\" (UID: \"45ed88fc-2554-4821-aadc-c8e1f435e4e3\") " pod="opendatahub/maas-controller-65c4f594b8-n5fjs" Apr 17 17:02:35.087824 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:35.087794 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-65c4f594b8-n5fjs" Apr 17 17:02:35.203888 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:35.203863 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-65c4f594b8-n5fjs"] Apr 17 17:02:35.205653 ip-10-0-132-199 kubenswrapper[2575]: W0417 17:02:35.205629 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45ed88fc_2554_4821_aadc_c8e1f435e4e3.slice/crio-2d07a82a89d7c43fb4675cd2d45e97bb28afb46f57656bcccab495955428cf45 WatchSource:0}: Error finding container 2d07a82a89d7c43fb4675cd2d45e97bb28afb46f57656bcccab495955428cf45: Status 404 returned error can't find the container with id 2d07a82a89d7c43fb4675cd2d45e97bb28afb46f57656bcccab495955428cf45 Apr 17 17:02:35.506967 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:35.506925 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-65c4f594b8-n5fjs" event={"ID":"45ed88fc-2554-4821-aadc-c8e1f435e4e3","Type":"ContainerStarted","Data":"2d07a82a89d7c43fb4675cd2d45e97bb28afb46f57656bcccab495955428cf45"} Apr 17 17:02:36.512045 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:36.512011 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-65c4f594b8-n5fjs" event={"ID":"45ed88fc-2554-4821-aadc-c8e1f435e4e3","Type":"ContainerStarted","Data":"d725a4838dd7abdb2e456d4797c9be583046fb52fa4b5756d8f84ba186185724"} Apr 17 17:02:36.512421 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:36.512123 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-65c4f594b8-n5fjs" Apr 17 17:02:36.530764 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:36.530719 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-65c4f594b8-n5fjs" podStartSLOduration=2.248699757 podStartE2EDuration="2.530705955s" 
podCreationTimestamp="2026-04-17 17:02:34 +0000 UTC" firstStartedPulling="2026-04-17 17:02:35.206967645 +0000 UTC m=+606.115694716" lastFinishedPulling="2026-04-17 17:02:35.48897383 +0000 UTC m=+606.397700914" observedRunningTime="2026-04-17 17:02:36.529155598 +0000 UTC m=+607.437882686" watchObservedRunningTime="2026-04-17 17:02:36.530705955 +0000 UTC m=+607.439433049" Apr 17 17:02:47.520919 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:47.520842 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-65c4f594b8-n5fjs" Apr 17 17:02:47.558293 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:47.558266 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-67d9545685-j4xpz"] Apr 17 17:02:47.558518 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:47.558481 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-67d9545685-j4xpz" podUID="da5eb050-e9cf-4425-85ed-6686459a5571" containerName="manager" containerID="cri-o://b1e9594cddda435f8ed67bd4c53a6c4f16695781bf2e4649391b36b79fa01b0f" gracePeriod=10 Apr 17 17:02:47.802332 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:47.802307 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-67d9545685-j4xpz" Apr 17 17:02:47.893270 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:47.893239 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgzxk\" (UniqueName: \"kubernetes.io/projected/da5eb050-e9cf-4425-85ed-6686459a5571-kube-api-access-xgzxk\") pod \"da5eb050-e9cf-4425-85ed-6686459a5571\" (UID: \"da5eb050-e9cf-4425-85ed-6686459a5571\") " Apr 17 17:02:47.895367 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:47.895344 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da5eb050-e9cf-4425-85ed-6686459a5571-kube-api-access-xgzxk" (OuterVolumeSpecName: "kube-api-access-xgzxk") pod "da5eb050-e9cf-4425-85ed-6686459a5571" (UID: "da5eb050-e9cf-4425-85ed-6686459a5571"). InnerVolumeSpecName "kube-api-access-xgzxk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:02:47.993808 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:47.993769 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xgzxk\" (UniqueName: \"kubernetes.io/projected/da5eb050-e9cf-4425-85ed-6686459a5571-kube-api-access-xgzxk\") on node \"ip-10-0-132-199.ec2.internal\" DevicePath \"\"" Apr 17 17:02:48.553429 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:48.553393 2575 generic.go:358] "Generic (PLEG): container finished" podID="da5eb050-e9cf-4425-85ed-6686459a5571" containerID="b1e9594cddda435f8ed67bd4c53a6c4f16695781bf2e4649391b36b79fa01b0f" exitCode=0 Apr 17 17:02:48.553867 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:48.553459 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-67d9545685-j4xpz" Apr 17 17:02:48.553867 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:48.553456 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-67d9545685-j4xpz" event={"ID":"da5eb050-e9cf-4425-85ed-6686459a5571","Type":"ContainerDied","Data":"b1e9594cddda435f8ed67bd4c53a6c4f16695781bf2e4649391b36b79fa01b0f"} Apr 17 17:02:48.553867 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:48.553557 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-67d9545685-j4xpz" event={"ID":"da5eb050-e9cf-4425-85ed-6686459a5571","Type":"ContainerDied","Data":"0f77e526ccfe2f3fe42f5c3641a548d8af85bf3c6bd50d53aee861e2488d425f"} Apr 17 17:02:48.553867 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:48.553573 2575 scope.go:117] "RemoveContainer" containerID="b1e9594cddda435f8ed67bd4c53a6c4f16695781bf2e4649391b36b79fa01b0f" Apr 17 17:02:48.562104 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:48.561933 2575 scope.go:117] "RemoveContainer" containerID="b1e9594cddda435f8ed67bd4c53a6c4f16695781bf2e4649391b36b79fa01b0f" Apr 17 17:02:48.562208 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:02:48.562188 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1e9594cddda435f8ed67bd4c53a6c4f16695781bf2e4649391b36b79fa01b0f\": container with ID starting with b1e9594cddda435f8ed67bd4c53a6c4f16695781bf2e4649391b36b79fa01b0f not found: ID does not exist" containerID="b1e9594cddda435f8ed67bd4c53a6c4f16695781bf2e4649391b36b79fa01b0f" Apr 17 17:02:48.562249 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:48.562218 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1e9594cddda435f8ed67bd4c53a6c4f16695781bf2e4649391b36b79fa01b0f"} err="failed to get container status \"b1e9594cddda435f8ed67bd4c53a6c4f16695781bf2e4649391b36b79fa01b0f\": rpc error: 
code = NotFound desc = could not find container \"b1e9594cddda435f8ed67bd4c53a6c4f16695781bf2e4649391b36b79fa01b0f\": container with ID starting with b1e9594cddda435f8ed67bd4c53a6c4f16695781bf2e4649391b36b79fa01b0f not found: ID does not exist" Apr 17 17:02:48.574163 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:48.574139 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-67d9545685-j4xpz"] Apr 17 17:02:48.577475 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:48.577458 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-67d9545685-j4xpz"] Apr 17 17:02:49.589834 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:02:49.589803 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da5eb050-e9cf-4425-85ed-6686459a5571" path="/var/lib/kubelet/pods/da5eb050-e9cf-4425-85ed-6686459a5571/volumes" Apr 17 17:03:12.076990 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.076953 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4"] Apr 17 17:03:12.077422 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.077407 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da5eb050-e9cf-4425-85ed-6686459a5571" containerName="manager" Apr 17 17:03:12.077473 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.077425 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5eb050-e9cf-4425-85ed-6686459a5571" containerName="manager" Apr 17 17:03:12.077535 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.077525 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="da5eb050-e9cf-4425-85ed-6686459a5571" containerName="manager" Apr 17 17:03:12.080910 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.080891 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.083635 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.083611 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 17:03:12.083635 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.083630 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 17:03:12.084773 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.084752 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-9bqcd\"" Apr 17 17:03:12.084854 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.084833 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 17 17:03:12.089569 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.089548 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4"] Apr 17 17:03:12.182001 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.181972 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: \"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.182158 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.182012 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: \"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " 
pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.182158 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.182069 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26tbf\" (UniqueName: \"kubernetes.io/projected/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-kube-api-access-26tbf\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: \"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.182158 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.182101 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-dshm\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: \"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.182274 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.182187 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: \"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.182274 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.182217 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: \"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.283062 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.283036 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: \"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.283196 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.283069 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: \"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.283196 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.283101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26tbf\" (UniqueName: \"kubernetes.io/projected/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-kube-api-access-26tbf\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: \"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.283196 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.283154 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-dshm\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: \"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.283196 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.283192 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: 
\"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.283346 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.283216 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: \"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.283506 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.283484 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: \"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.283546 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.283507 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: \"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.283611 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.283574 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: \"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.285477 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.285459 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-dshm\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: \"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.285575 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.285559 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: \"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.290919 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.290896 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26tbf\" (UniqueName: \"kubernetes.io/projected/1ebec94c-48ff-4ee7-bf97-62f270b0ff2d-kube-api-access-26tbf\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-qd9n4\" (UID: \"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.391914 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.391831 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:12.521170 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.521148 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4"] Apr 17 17:03:12.522771 ip-10-0-132-199 kubenswrapper[2575]: W0417 17:03:12.522747 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ebec94c_48ff_4ee7_bf97_62f270b0ff2d.slice/crio-f84825fad9ed09df9fc6aea1100436e270a1838e3f7ee9e795648a6d153dc326 WatchSource:0}: Error finding container f84825fad9ed09df9fc6aea1100436e270a1838e3f7ee9e795648a6d153dc326: Status 404 returned error can't find the container with id f84825fad9ed09df9fc6aea1100436e270a1838e3f7ee9e795648a6d153dc326 Apr 17 17:03:12.524839 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.524822 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:03:12.636257 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:12.636164 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" event={"ID":"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d","Type":"ContainerStarted","Data":"f84825fad9ed09df9fc6aea1100436e270a1838e3f7ee9e795648a6d153dc326"} Apr 17 17:03:18.663551 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:18.663507 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" event={"ID":"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d","Type":"ContainerStarted","Data":"cbd469b6cf1ccdf91a67f2a7723f64a2bacc0e8efb148ae9aa6932071f924285"} Apr 17 17:03:23.681250 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:23.681214 2575 generic.go:358] "Generic (PLEG): container finished" podID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" containerID="cbd469b6cf1ccdf91a67f2a7723f64a2bacc0e8efb148ae9aa6932071f924285" exitCode=0 Apr 17 
17:03:23.681636 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:23.681290 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" event={"ID":"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d","Type":"ContainerDied","Data":"cbd469b6cf1ccdf91a67f2a7723f64a2bacc0e8efb148ae9aa6932071f924285"} Apr 17 17:03:25.690436 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:25.690408 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/0.log" Apr 17 17:03:25.690877 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:25.690767 2575 generic.go:358] "Generic (PLEG): container finished" podID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" containerID="874bfa0b745d1c30d357aba1d667426351930c3296566b336f920f849c39e734" exitCode=2 Apr 17 17:03:25.690877 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:25.690840 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" event={"ID":"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d","Type":"ContainerDied","Data":"874bfa0b745d1c30d357aba1d667426351930c3296566b336f920f849c39e734"} Apr 17 17:03:25.691200 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:25.691186 2575 scope.go:117] "RemoveContainer" containerID="874bfa0b745d1c30d357aba1d667426351930c3296566b336f920f849c39e734" Apr 17 17:03:26.695479 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:26.695452 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/1.log" Apr 17 17:03:26.695888 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:26.695797 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/0.log" Apr 17 17:03:26.696117 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:26.696095 2575 generic.go:358] 
"Generic (PLEG): container finished" podID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" containerID="e33ae07c65bda99694c37bf93d2ee2780819e469ea62b61133f18328368f88ec" exitCode=2 Apr 17 17:03:26.696177 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:26.696161 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" event={"ID":"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d","Type":"ContainerDied","Data":"e33ae07c65bda99694c37bf93d2ee2780819e469ea62b61133f18328368f88ec"} Apr 17 17:03:26.696215 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:26.696201 2575 scope.go:117] "RemoveContainer" containerID="874bfa0b745d1c30d357aba1d667426351930c3296566b336f920f849c39e734" Apr 17 17:03:26.696697 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:26.696673 2575 scope.go:117] "RemoveContainer" containerID="e33ae07c65bda99694c37bf93d2ee2780819e469ea62b61133f18328368f88ec" Apr 17 17:03:26.696895 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:03:26.696879 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:03:27.700791 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:27.700766 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/1.log" Apr 17 17:03:32.392651 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:32.392568 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:32.393119 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:32.392663 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:03:32.393199 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:32.393179 2575 scope.go:117] "RemoveContainer" containerID="e33ae07c65bda99694c37bf93d2ee2780819e469ea62b61133f18328368f88ec" Apr 17 17:03:32.393412 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:03:32.393389 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:03:40.182034 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.181999 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2"] Apr 17 17:03:40.502867 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.502789 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2"] Apr 17 17:03:40.503004 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.502907 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.505699 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.505676 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 17 17:03:40.621529 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.621494 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21d081e3-3d44-4028-a946-e566574f780b-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.621706 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.621560 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21d081e3-3d44-4028-a946-e566574f780b-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.621706 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.621621 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21d081e3-3d44-4028-a946-e566574f780b-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.621706 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.621676 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/21d081e3-3d44-4028-a946-e566574f780b-model-cache\") pod 
\"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.621706 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.621706 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21d081e3-3d44-4028-a946-e566574f780b-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.621840 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.621740 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6v88\" (UniqueName: \"kubernetes.io/projected/21d081e3-3d44-4028-a946-e566574f780b-kube-api-access-g6v88\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.722900 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.722874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6v88\" (UniqueName: \"kubernetes.io/projected/21d081e3-3d44-4028-a946-e566574f780b-kube-api-access-g6v88\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.723044 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.722916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21d081e3-3d44-4028-a946-e566574f780b-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.723044 
ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.722978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21d081e3-3d44-4028-a946-e566574f780b-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.723044 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.723018 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21d081e3-3d44-4028-a946-e566574f780b-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.723044 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.723041 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/21d081e3-3d44-4028-a946-e566574f780b-model-cache\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.723248 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.723182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21d081e3-3d44-4028-a946-e566574f780b-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.723433 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.723412 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/21d081e3-3d44-4028-a946-e566574f780b-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.723433 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.723424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21d081e3-3d44-4028-a946-e566574f780b-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.723586 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.723571 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/21d081e3-3d44-4028-a946-e566574f780b-model-cache\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.725211 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.725188 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21d081e3-3d44-4028-a946-e566574f780b-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.725414 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.725393 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21d081e3-3d44-4028-a946-e566574f780b-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.730802 ip-10-0-132-199 kubenswrapper[2575]: 
I0417 17:03:40.730781 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6v88\" (UniqueName: \"kubernetes.io/projected/21d081e3-3d44-4028-a946-e566574f780b-kube-api-access-g6v88\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2\" (UID: \"21d081e3-3d44-4028-a946-e566574f780b\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.813499 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.813473 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:03:40.941128 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:40.941096 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2"] Apr 17 17:03:40.943463 ip-10-0-132-199 kubenswrapper[2575]: W0417 17:03:40.943435 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21d081e3_3d44_4028_a946_e566574f780b.slice/crio-053705045dfe7665089f842d79481cd4d2af8a8074995be6d566ca37ffabeb60 WatchSource:0}: Error finding container 053705045dfe7665089f842d79481cd4d2af8a8074995be6d566ca37ffabeb60: Status 404 returned error can't find the container with id 053705045dfe7665089f842d79481cd4d2af8a8074995be6d566ca37ffabeb60 Apr 17 17:03:41.752605 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:41.752562 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" event={"ID":"21d081e3-3d44-4028-a946-e566574f780b","Type":"ContainerStarted","Data":"68c2e3ec8db2b62042b6f4baeb635c6220261410008b612f359a548cd8a5f5d0"} Apr 17 17:03:41.752944 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:41.752622 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" 
event={"ID":"21d081e3-3d44-4028-a946-e566574f780b","Type":"ContainerStarted","Data":"053705045dfe7665089f842d79481cd4d2af8a8074995be6d566ca37ffabeb60"} Apr 17 17:03:44.585612 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:44.585549 2575 scope.go:117] "RemoveContainer" containerID="e33ae07c65bda99694c37bf93d2ee2780819e469ea62b61133f18328368f88ec" Apr 17 17:03:45.078147 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.078113 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"] Apr 17 17:03:45.080404 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.080383 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" Apr 17 17:03:45.083769 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.083750 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 17 17:03:45.092124 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.092074 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"] Apr 17 17:03:45.164440 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.164409 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2c9bfeda-6173-4819-86ca-12721ab2af78-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" Apr 17 17:03:45.164440 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.164439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c9bfeda-6173-4819-86ca-12721ab2af78-model-cache\") pod 
\"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.164643 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.164541 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c9bfeda-6173-4819-86ca-12721ab2af78-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.164643 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.164569 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2c9bfeda-6173-4819-86ca-12721ab2af78-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.164643 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.164633 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9crkx\" (UniqueName: \"kubernetes.io/projected/2c9bfeda-6173-4819-86ca-12721ab2af78-kube-api-access-9crkx\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.164749 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.164655 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c9bfeda-6173-4819-86ca-12721ab2af78-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.266042 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.266009 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c9bfeda-6173-4819-86ca-12721ab2af78-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.266194 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.266057 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2c9bfeda-6173-4819-86ca-12721ab2af78-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.266800 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.266109 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9crkx\" (UniqueName: \"kubernetes.io/projected/2c9bfeda-6173-4819-86ca-12721ab2af78-kube-api-access-9crkx\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.266800 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.266717 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c9bfeda-6173-4819-86ca-12721ab2af78-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.266800 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.266790 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2c9bfeda-6173-4819-86ca-12721ab2af78-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.267017 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.266826 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c9bfeda-6173-4819-86ca-12721ab2af78-model-cache\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.267256 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.267182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c9bfeda-6173-4819-86ca-12721ab2af78-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.267256 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.267227 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c9bfeda-6173-4819-86ca-12721ab2af78-model-cache\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.267429 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.267346 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2c9bfeda-6173-4819-86ca-12721ab2af78-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.276024 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.275821 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c9bfeda-6173-4819-86ca-12721ab2af78-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.276129 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.276052 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2c9bfeda-6173-4819-86ca-12721ab2af78-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.278716 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.278664 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9crkx\" (UniqueName: \"kubernetes.io/projected/2c9bfeda-6173-4819-86ca-12721ab2af78-kube-api-access-9crkx\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-vmpjw\" (UID: \"2c9bfeda-6173-4819-86ca-12721ab2af78\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.389511 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.389427 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:45.516254 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.516224 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"]
Apr 17 17:03:45.516561 ip-10-0-132-199 kubenswrapper[2575]: W0417 17:03:45.516536 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c9bfeda_6173_4819_86ca_12721ab2af78.slice/crio-77cea84a1e65b282b1d358b8a25a950ac9f0703e57b49ba1226321708ad09756 WatchSource:0}: Error finding container 77cea84a1e65b282b1d358b8a25a950ac9f0703e57b49ba1226321708ad09756: Status 404 returned error can't find the container with id 77cea84a1e65b282b1d358b8a25a950ac9f0703e57b49ba1226321708ad09756
Apr 17 17:03:45.767207 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.767133 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/2.log"
Apr 17 17:03:45.767611 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.767575 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/1.log"
Apr 17 17:03:45.767962 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.767938 2575 generic.go:358] "Generic (PLEG): container finished" podID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" containerID="b0028b8236af04772fbcc74a7c68bc71d825fa8553883772c397b7c1debda3ec" exitCode=2
Apr 17 17:03:45.768023 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.768008 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" event={"ID":"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d","Type":"ContainerDied","Data":"b0028b8236af04772fbcc74a7c68bc71d825fa8553883772c397b7c1debda3ec"}
Apr 17 17:03:45.768097 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.768049 2575 scope.go:117] "RemoveContainer" containerID="e33ae07c65bda99694c37bf93d2ee2780819e469ea62b61133f18328368f88ec"
Apr 17 17:03:45.768563 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.768529 2575 scope.go:117] "RemoveContainer" containerID="b0028b8236af04772fbcc74a7c68bc71d825fa8553883772c397b7c1debda3ec"
Apr 17 17:03:45.768797 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:03:45.768773 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d"
Apr 17 17:03:45.770001 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.769982 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" event={"ID":"2c9bfeda-6173-4819-86ca-12721ab2af78","Type":"ContainerStarted","Data":"3fa8d035a849a810368bd236611ed2a7ad771ba529a526d74af63981ee43b51c"}
Apr 17 17:03:45.770110 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:45.770008 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" event={"ID":"2c9bfeda-6173-4819-86ca-12721ab2af78","Type":"ContainerStarted","Data":"77cea84a1e65b282b1d358b8a25a950ac9f0703e57b49ba1226321708ad09756"}
Apr 17 17:03:46.774554 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:46.774469 2575 generic.go:358] "Generic (PLEG): container finished" podID="21d081e3-3d44-4028-a946-e566574f780b" containerID="68c2e3ec8db2b62042b6f4baeb635c6220261410008b612f359a548cd8a5f5d0" exitCode=0
Apr 17 17:03:46.775026 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:46.774551 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" event={"ID":"21d081e3-3d44-4028-a946-e566574f780b","Type":"ContainerDied","Data":"68c2e3ec8db2b62042b6f4baeb635c6220261410008b612f359a548cd8a5f5d0"}
Apr 17 17:03:46.775954 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:46.775939 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/2.log"
Apr 17 17:03:47.784743 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:47.784713 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/0.log"
Apr 17 17:03:47.785222 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:47.785077 2575 generic.go:358] "Generic (PLEG): container finished" podID="21d081e3-3d44-4028-a946-e566574f780b" containerID="4b48a0723be98e4b036f926c9f01bcc369cb8b45448d368057a9e46e1f9fe731" exitCode=2
Apr 17 17:03:47.785222 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:47.785122 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" event={"ID":"21d081e3-3d44-4028-a946-e566574f780b","Type":"ContainerDied","Data":"4b48a0723be98e4b036f926c9f01bcc369cb8b45448d368057a9e46e1f9fe731"}
Apr 17 17:03:47.785495 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:47.785479 2575 scope.go:117] "RemoveContainer" containerID="4b48a0723be98e4b036f926c9f01bcc369cb8b45448d368057a9e46e1f9fe731"
Apr 17 17:03:48.790146 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:48.790122 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/1.log"
Apr 17 17:03:48.790560 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:48.790542 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/0.log"
Apr 17 17:03:48.790928 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:48.790902 2575 generic.go:358] "Generic (PLEG): container finished" podID="21d081e3-3d44-4028-a946-e566574f780b" containerID="6b85ee13c4306691c363fe05e58a42cbfbe9a92aebe92eeacfd4990819f619a9" exitCode=2
Apr 17 17:03:48.790999 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:48.790977 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" event={"ID":"21d081e3-3d44-4028-a946-e566574f780b","Type":"ContainerDied","Data":"6b85ee13c4306691c363fe05e58a42cbfbe9a92aebe92eeacfd4990819f619a9"}
Apr 17 17:03:48.791054 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:48.791029 2575 scope.go:117] "RemoveContainer" containerID="4b48a0723be98e4b036f926c9f01bcc369cb8b45448d368057a9e46e1f9fe731"
Apr 17 17:03:48.791546 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:48.791524 2575 scope.go:117] "RemoveContainer" containerID="6b85ee13c4306691c363fe05e58a42cbfbe9a92aebe92eeacfd4990819f619a9"
Apr 17 17:03:48.791826 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:03:48.791801 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b"
Apr 17 17:03:49.796076 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:49.796045 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/1.log"
Apr 17 17:03:50.813859 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:50.813821 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2"
Apr 17 17:03:50.813859 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:50.813864 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2"
Apr 17 17:03:50.814397 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:50.814303 2575 scope.go:117] "RemoveContainer" containerID="6b85ee13c4306691c363fe05e58a42cbfbe9a92aebe92eeacfd4990819f619a9"
Apr 17 17:03:50.814531 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:03:50.814512 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b"
Apr 17 17:03:52.392355 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:52.392326 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4"
Apr 17 17:03:52.392355 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:52.392355 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4"
Apr 17 17:03:52.392767 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:52.392754 2575 scope.go:117] "RemoveContainer" containerID="b0028b8236af04772fbcc74a7c68bc71d825fa8553883772c397b7c1debda3ec"
Apr 17 17:03:52.392948 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:03:52.392932 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d"
Apr 17 17:03:53.814938 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:53.814908 2575 generic.go:358] "Generic (PLEG): container finished" podID="2c9bfeda-6173-4819-86ca-12721ab2af78" containerID="3fa8d035a849a810368bd236611ed2a7ad771ba529a526d74af63981ee43b51c" exitCode=0
Apr 17 17:03:53.815286 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:53.814982 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" event={"ID":"2c9bfeda-6173-4819-86ca-12721ab2af78","Type":"ContainerDied","Data":"3fa8d035a849a810368bd236611ed2a7ad771ba529a526d74af63981ee43b51c"}
Apr 17 17:03:54.820298 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:54.820265 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/0.log"
Apr 17 17:03:54.820749 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:54.820624 2575 generic.go:358] "Generic (PLEG): container finished" podID="2c9bfeda-6173-4819-86ca-12721ab2af78" containerID="96cabec20a53efb3fbcef3e642ef482bbf20253627f835ad6300a6c9dbc4d344" exitCode=2
Apr 17 17:03:54.820749 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:54.820627 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" event={"ID":"2c9bfeda-6173-4819-86ca-12721ab2af78","Type":"ContainerDied","Data":"96cabec20a53efb3fbcef3e642ef482bbf20253627f835ad6300a6c9dbc4d344"}
Apr 17 17:03:54.821023 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:54.821009 2575 scope.go:117] "RemoveContainer" containerID="96cabec20a53efb3fbcef3e642ef482bbf20253627f835ad6300a6c9dbc4d344"
Apr 17 17:03:55.390461 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:55.390395 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:55.390461 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:55.390426 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:03:55.825194 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:55.825166 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/1.log"
Apr 17 17:03:55.825660 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:55.825517 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/0.log"
Apr 17 17:03:55.825879 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:55.825854 2575 generic.go:358] "Generic (PLEG): container finished" podID="2c9bfeda-6173-4819-86ca-12721ab2af78" containerID="6ac270ace58373d18871f0fbe01494b43745ed585a53c73a1e9f8808ba7b8b9f" exitCode=2
Apr 17 17:03:55.825930 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:55.825916 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" event={"ID":"2c9bfeda-6173-4819-86ca-12721ab2af78","Type":"ContainerDied","Data":"6ac270ace58373d18871f0fbe01494b43745ed585a53c73a1e9f8808ba7b8b9f"}
Apr 17 17:03:55.826042 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:55.825958 2575 scope.go:117] "RemoveContainer" containerID="96cabec20a53efb3fbcef3e642ef482bbf20253627f835ad6300a6c9dbc4d344"
Apr 17 17:03:55.826252 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:55.826230 2575 scope.go:117] "RemoveContainer" containerID="6ac270ace58373d18871f0fbe01494b43745ed585a53c73a1e9f8808ba7b8b9f"
Apr 17 17:03:55.826506 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:03:55.826486 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78"
Apr 17 17:03:56.830707 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:56.830686 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/1.log"
Apr 17 17:03:56.831368 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:03:56.831351 2575 scope.go:117] "RemoveContainer" containerID="6ac270ace58373d18871f0fbe01494b43745ed585a53c73a1e9f8808ba7b8b9f"
Apr 17 17:03:56.831543 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:03:56.831525 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78"
Apr 17 17:04:03.586347 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:03.586316 2575 scope.go:117] "RemoveContainer" containerID="6b85ee13c4306691c363fe05e58a42cbfbe9a92aebe92eeacfd4990819f619a9"
Apr 17 17:04:04.863031 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:04.863000 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/2.log"
Apr 17 17:04:04.863411 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:04.863379 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/1.log"
Apr 17 17:04:04.863684 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:04.863662 2575 generic.go:358] "Generic (PLEG): container finished" podID="21d081e3-3d44-4028-a946-e566574f780b" containerID="f956950786856a6ee72315f3ee5ee46819ef17c71d2a5953fafb2cf86ed3b2c6" exitCode=2
Apr 17 17:04:04.863755 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:04.863702 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" event={"ID":"21d081e3-3d44-4028-a946-e566574f780b","Type":"ContainerDied","Data":"f956950786856a6ee72315f3ee5ee46819ef17c71d2a5953fafb2cf86ed3b2c6"}
Apr 17 17:04:04.863755 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:04.863730 2575 scope.go:117] "RemoveContainer" containerID="6b85ee13c4306691c363fe05e58a42cbfbe9a92aebe92eeacfd4990819f619a9"
Apr 17 17:04:04.864196 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:04.864176 2575 scope.go:117] "RemoveContainer" containerID="f956950786856a6ee72315f3ee5ee46819ef17c71d2a5953fafb2cf86ed3b2c6"
Apr 17 17:04:04.864420 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:04:04.864401 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b"
Apr 17 17:04:05.389679 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:05.389647 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:04:05.389679 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:05.389677 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:04:05.390085 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:05.390072 2575 scope.go:117] "RemoveContainer" containerID="6ac270ace58373d18871f0fbe01494b43745ed585a53c73a1e9f8808ba7b8b9f"
Apr 17 17:04:05.585696 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:05.585670 2575 scope.go:117] "RemoveContainer" containerID="b0028b8236af04772fbcc74a7c68bc71d825fa8553883772c397b7c1debda3ec"
Apr 17 17:04:05.869108 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:05.869081 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/2.log"
Apr 17 17:04:06.875018 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:06.874992 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/3.log"
Apr 17 17:04:06.875507 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:06.875390 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/2.log"
Apr 17 17:04:06.875728 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:06.875708 2575 generic.go:358] "Generic (PLEG): container finished" podID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" containerID="47dcb5a0b393233334e80bbbc4f8efcd2f2c5710eceadeff42c2152167755d4d" exitCode=2
Apr 17 17:04:06.875804 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:06.875781 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" event={"ID":"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d","Type":"ContainerDied","Data":"47dcb5a0b393233334e80bbbc4f8efcd2f2c5710eceadeff42c2152167755d4d"}
Apr 17 17:04:06.875857 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:06.875834 2575 scope.go:117] "RemoveContainer" containerID="b0028b8236af04772fbcc74a7c68bc71d825fa8553883772c397b7c1debda3ec"
Apr 17 17:04:06.876345 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:06.876325 2575 scope.go:117] "RemoveContainer" containerID="47dcb5a0b393233334e80bbbc4f8efcd2f2c5710eceadeff42c2152167755d4d"
Apr 17 17:04:06.876568 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:04:06.876548 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d"
Apr 17 17:04:06.877314 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:06.877296 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/2.log"
Apr 17 17:04:06.877683 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:06.877670 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/1.log"
Apr 17 17:04:06.877988 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:06.877970 2575 generic.go:358] "Generic (PLEG): container finished" podID="2c9bfeda-6173-4819-86ca-12721ab2af78" containerID="e28ea42e94164ea657e23f3e6abb9c9b7983bd7b607d3ea3bb8a00f645e53838" exitCode=2
Apr 17 17:04:06.878090 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:06.878009 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" event={"ID":"2c9bfeda-6173-4819-86ca-12721ab2af78","Type":"ContainerDied","Data":"e28ea42e94164ea657e23f3e6abb9c9b7983bd7b607d3ea3bb8a00f645e53838"}
Apr 17 17:04:06.878344 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:06.878328 2575 scope.go:117] "RemoveContainer" containerID="e28ea42e94164ea657e23f3e6abb9c9b7983bd7b607d3ea3bb8a00f645e53838"
Apr 17 17:04:06.878511 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:04:06.878494 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78"
Apr 17 17:04:06.900806 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:06.900617 2575 scope.go:117] "RemoveContainer" containerID="6ac270ace58373d18871f0fbe01494b43745ed585a53c73a1e9f8808ba7b8b9f"
Apr 17 17:04:07.882778 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:07.882753 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/3.log"
Apr 17 17:04:07.884363 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:07.884344 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/2.log"
Apr 17 17:04:10.814343 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:10.814310 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2"
Apr 17 17:04:10.814343 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:10.814348 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2"
Apr 17 17:04:10.814767 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:10.814727 2575 scope.go:117] "RemoveContainer" containerID="f956950786856a6ee72315f3ee5ee46819ef17c71d2a5953fafb2cf86ed3b2c6"
Apr 17 17:04:10.814916 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:04:10.814899 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b"
Apr 17 17:04:12.392516 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:12.392479 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4"
Apr 17 17:04:12.392516 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:12.392518 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4"
Apr 17 17:04:12.393117 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:12.393093 2575 scope.go:117] "RemoveContainer" containerID="47dcb5a0b393233334e80bbbc4f8efcd2f2c5710eceadeff42c2152167755d4d"
Apr 17 17:04:12.393331 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:04:12.393310 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d"
Apr 17 17:04:15.390096 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:15.390066 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:04:15.390096 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:15.390096 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:04:15.390501 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:15.390487 2575 scope.go:117] "RemoveContainer" containerID="e28ea42e94164ea657e23f3e6abb9c9b7983bd7b607d3ea3bb8a00f645e53838"
Apr 17 17:04:15.390697 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:04:15.390679 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78"
Apr 17 17:04:22.586248 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:22.586217 2575 scope.go:117] "RemoveContainer" containerID="f956950786856a6ee72315f3ee5ee46819ef17c71d2a5953fafb2cf86ed3b2c6"
Apr 17 17:04:22.586745 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:04:22.586390 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b"
Apr 17 17:04:26.585976 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:26.585948 2575 scope.go:117] "RemoveContainer" containerID="47dcb5a0b393233334e80bbbc4f8efcd2f2c5710eceadeff42c2152167755d4d"
Apr 17 17:04:26.586331 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:04:26.586119 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d"
Apr 17 17:04:27.585481 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:27.585452 2575 scope.go:117] "RemoveContainer" containerID="e28ea42e94164ea657e23f3e6abb9c9b7983bd7b607d3ea3bb8a00f645e53838"
Apr 17 17:04:27.959034 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:27.959014 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/3.log"
Apr 17 17:04:27.959397 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:27.959383 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/2.log"
Apr 17 17:04:27.959709 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:27.959689 2575 generic.go:358] "Generic (PLEG): container finished" podID="2c9bfeda-6173-4819-86ca-12721ab2af78" containerID="b66d32f04fbd79196df1c8347f08523516cab2dc63a79650a0f9e267cbd4aeec" exitCode=2
Apr 17 17:04:27.959757 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:27.959721 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" event={"ID":"2c9bfeda-6173-4819-86ca-12721ab2af78","Type":"ContainerDied","Data":"b66d32f04fbd79196df1c8347f08523516cab2dc63a79650a0f9e267cbd4aeec"}
Apr 17 17:04:27.959757 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:27.959747 2575 scope.go:117] "RemoveContainer" containerID="e28ea42e94164ea657e23f3e6abb9c9b7983bd7b607d3ea3bb8a00f645e53838"
Apr 17 17:04:27.960118 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:27.960103 2575 scope.go:117] "RemoveContainer" containerID="b66d32f04fbd79196df1c8347f08523516cab2dc63a79650a0f9e267cbd4aeec"
Apr 17 17:04:27.960307 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:04:27.960289 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78"
Apr 17 17:04:28.964729 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:28.964705 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/3.log"
Apr 17 17:04:35.389744 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:35.389713 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:04:35.389744 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:35.389744 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw"
Apr 17 17:04:35.390146 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:35.390115 2575 scope.go:117] "RemoveContainer" containerID="b66d32f04fbd79196df1c8347f08523516cab2dc63a79650a0f9e267cbd4aeec"
Apr 17 17:04:35.390289 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:04:35.390271 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78"
Apr 17 17:04:37.586067 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:37.586036 2575 scope.go:117] "RemoveContainer" containerID="f956950786856a6ee72315f3ee5ee46819ef17c71d2a5953fafb2cf86ed3b2c6"
Apr 17 17:04:37.995985 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:37.995959 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/3.log"
Apr 17 17:04:37.996370 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:37.996353 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/2.log"
Apr 17 17:04:37.996705 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:37.996686 2575 generic.go:358] "Generic (PLEG): container finished" podID="21d081e3-3d44-4028-a946-e566574f780b" containerID="88c0d54ec0af857f2df4444569a450ceff925c3847611e9dd2c395cd0d750e53" exitCode=2
Apr 17 17:04:37.996779 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:37.996759 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" event={"ID":"21d081e3-3d44-4028-a946-e566574f780b","Type":"ContainerDied","Data":"88c0d54ec0af857f2df4444569a450ceff925c3847611e9dd2c395cd0d750e53"}
Apr 17 17:04:37.996821 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:37.996803 2575 scope.go:117] "RemoveContainer" containerID="f956950786856a6ee72315f3ee5ee46819ef17c71d2a5953fafb2cf86ed3b2c6"
Apr 17 17:04:37.997207 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:37.997191 2575 scope.go:117] "RemoveContainer" containerID="88c0d54ec0af857f2df4444569a450ceff925c3847611e9dd2c395cd0d750e53"
Apr 17 17:04:37.997402 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:04:37.997386 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b"
Apr 17 17:04:39.001754 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:39.001726 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/3.log"
Apr 17 17:04:39.587668 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:39.587639 2575 scope.go:117] "RemoveContainer"
containerID="47dcb5a0b393233334e80bbbc4f8efcd2f2c5710eceadeff42c2152167755d4d" Apr 17 17:04:39.587851 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:04:39.587832 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:04:40.814096 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:40.814068 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:04:40.814096 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:40.814102 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:04:40.814650 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:40.814588 2575 scope.go:117] "RemoveContainer" containerID="88c0d54ec0af857f2df4444569a450ceff925c3847611e9dd2c395cd0d750e53" Apr 17 17:04:40.814854 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:04:40.814834 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:04:46.585233 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:46.585205 2575 scope.go:117] "RemoveContainer" containerID="b66d32f04fbd79196df1c8347f08523516cab2dc63a79650a0f9e267cbd4aeec" Apr 17 17:04:46.585616 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:04:46.585373 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:04:53.586192 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:53.586152 2575 scope.go:117] "RemoveContainer" containerID="47dcb5a0b393233334e80bbbc4f8efcd2f2c5710eceadeff42c2152167755d4d" Apr 17 17:04:53.586763 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:53.586263 2575 scope.go:117] "RemoveContainer" containerID="88c0d54ec0af857f2df4444569a450ceff925c3847611e9dd2c395cd0d750e53" Apr 17 17:04:53.586763 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:04:53.586430 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:04:54.058585 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:54.058560 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/4.log" Apr 17 17:04:54.058953 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:54.058938 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/3.log" Apr 17 17:04:54.059266 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:54.059248 2575 generic.go:358] "Generic (PLEG): container finished" podID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" containerID="af1beb4d42b822b2d450f424e95cc79b05abaf7c80b49e9a6d1835ee499b865a" exitCode=2 Apr 17 17:04:54.059336 ip-10-0-132-199 kubenswrapper[2575]: I0417 
17:04:54.059318 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" event={"ID":"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d","Type":"ContainerDied","Data":"af1beb4d42b822b2d450f424e95cc79b05abaf7c80b49e9a6d1835ee499b865a"} Apr 17 17:04:54.059373 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:54.059359 2575 scope.go:117] "RemoveContainer" containerID="47dcb5a0b393233334e80bbbc4f8efcd2f2c5710eceadeff42c2152167755d4d" Apr 17 17:04:54.059756 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:54.059736 2575 scope.go:117] "RemoveContainer" containerID="af1beb4d42b822b2d450f424e95cc79b05abaf7c80b49e9a6d1835ee499b865a" Apr 17 17:04:54.059976 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:04:54.059960 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:04:55.063877 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:04:55.063851 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/4.log" Apr 17 17:05:01.586305 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:01.586275 2575 scope.go:117] "RemoveContainer" containerID="b66d32f04fbd79196df1c8347f08523516cab2dc63a79650a0f9e267cbd4aeec" Apr 17 17:05:01.586687 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:05:01.586448 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" 
pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:05:02.392381 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:02.392346 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:05:02.392381 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:02.392384 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:05:02.392807 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:02.392793 2575 scope.go:117] "RemoveContainer" containerID="af1beb4d42b822b2d450f424e95cc79b05abaf7c80b49e9a6d1835ee499b865a" Apr 17 17:05:02.392997 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:05:02.392980 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:05:04.585349 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:04.585320 2575 scope.go:117] "RemoveContainer" containerID="88c0d54ec0af857f2df4444569a450ceff925c3847611e9dd2c395cd0d750e53" Apr 17 17:05:04.585737 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:05:04.585510 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:05:12.585959 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:12.585929 2575 scope.go:117] "RemoveContainer" 
containerID="b66d32f04fbd79196df1c8347f08523516cab2dc63a79650a0f9e267cbd4aeec" Apr 17 17:05:16.585398 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:16.585367 2575 scope.go:117] "RemoveContainer" containerID="af1beb4d42b822b2d450f424e95cc79b05abaf7c80b49e9a6d1835ee499b865a" Apr 17 17:05:16.585777 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:16.585477 2575 scope.go:117] "RemoveContainer" containerID="88c0d54ec0af857f2df4444569a450ceff925c3847611e9dd2c395cd0d750e53" Apr 17 17:05:16.585777 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:05:16.585539 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:05:16.585777 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:05:16.585656 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:05:18.146831 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:18.146803 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/4.log" Apr 17 17:05:18.147208 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:18.147181 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/3.log" Apr 17 17:05:18.147501 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:18.147480 2575 
generic.go:358] "Generic (PLEG): container finished" podID="2c9bfeda-6173-4819-86ca-12721ab2af78" containerID="d73208ddac7fd90a0591f1bf4ae316d6c3fbb63d30f27589d4e48f7b6488468a" exitCode=2 Apr 17 17:05:18.147577 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:18.147541 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" event={"ID":"2c9bfeda-6173-4819-86ca-12721ab2af78","Type":"ContainerDied","Data":"d73208ddac7fd90a0591f1bf4ae316d6c3fbb63d30f27589d4e48f7b6488468a"} Apr 17 17:05:18.147633 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:18.147579 2575 scope.go:117] "RemoveContainer" containerID="b66d32f04fbd79196df1c8347f08523516cab2dc63a79650a0f9e267cbd4aeec" Apr 17 17:05:18.147948 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:18.147933 2575 scope.go:117] "RemoveContainer" containerID="d73208ddac7fd90a0591f1bf4ae316d6c3fbb63d30f27589d4e48f7b6488468a" Apr 17 17:05:18.148157 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:05:18.148128 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:05:19.151843 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:19.151817 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/4.log" Apr 17 17:05:25.390576 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:25.390538 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" Apr 17 17:05:25.390576 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:25.390581 2575 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" Apr 17 17:05:25.391132 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:25.390995 2575 scope.go:117] "RemoveContainer" containerID="d73208ddac7fd90a0591f1bf4ae316d6c3fbb63d30f27589d4e48f7b6488468a" Apr 17 17:05:25.391180 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:05:25.391162 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:05:29.592779 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:29.589465 2575 scope.go:117] "RemoveContainer" containerID="af1beb4d42b822b2d450f424e95cc79b05abaf7c80b49e9a6d1835ee499b865a" Apr 17 17:05:29.592779 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:05:29.589832 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:05:30.585696 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:30.585668 2575 scope.go:117] "RemoveContainer" containerID="88c0d54ec0af857f2df4444569a450ceff925c3847611e9dd2c395cd0d750e53" Apr 17 17:05:31.194698 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:31.194632 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/4.log" Apr 17 17:05:31.195065 ip-10-0-132-199 kubenswrapper[2575]: I0417 
17:05:31.194976 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/3.log" Apr 17 17:05:31.195284 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:31.195265 2575 generic.go:358] "Generic (PLEG): container finished" podID="21d081e3-3d44-4028-a946-e566574f780b" containerID="8303043098ac6ce1f386dd485612207d98851050ec5e72fdbc736bf3f5241918" exitCode=2 Apr 17 17:05:31.195337 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:31.195322 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" event={"ID":"21d081e3-3d44-4028-a946-e566574f780b","Type":"ContainerDied","Data":"8303043098ac6ce1f386dd485612207d98851050ec5e72fdbc736bf3f5241918"} Apr 17 17:05:31.195373 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:31.195352 2575 scope.go:117] "RemoveContainer" containerID="88c0d54ec0af857f2df4444569a450ceff925c3847611e9dd2c395cd0d750e53" Apr 17 17:05:31.195768 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:31.195744 2575 scope.go:117] "RemoveContainer" containerID="8303043098ac6ce1f386dd485612207d98851050ec5e72fdbc736bf3f5241918" Apr 17 17:05:31.195998 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:05:31.195979 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:05:32.199701 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:32.199676 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/4.log" Apr 17 17:05:38.585513 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:38.585443 2575 
scope.go:117] "RemoveContainer" containerID="d73208ddac7fd90a0591f1bf4ae316d6c3fbb63d30f27589d4e48f7b6488468a" Apr 17 17:05:38.585886 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:05:38.585647 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:05:40.814302 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:40.814270 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:05:40.814302 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:40.814308 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:05:40.814705 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:40.814687 2575 scope.go:117] "RemoveContainer" containerID="8303043098ac6ce1f386dd485612207d98851050ec5e72fdbc736bf3f5241918" Apr 17 17:05:40.814886 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:05:40.814868 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:05:43.586338 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:43.586298 2575 scope.go:117] "RemoveContainer" containerID="af1beb4d42b822b2d450f424e95cc79b05abaf7c80b49e9a6d1835ee499b865a" Apr 17 17:05:43.586819 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:05:43.586535 2575 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:05:52.585672 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:52.585641 2575 scope.go:117] "RemoveContainer" containerID="d73208ddac7fd90a0591f1bf4ae316d6c3fbb63d30f27589d4e48f7b6488468a" Apr 17 17:05:52.586027 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:52.585793 2575 scope.go:117] "RemoveContainer" containerID="8303043098ac6ce1f386dd485612207d98851050ec5e72fdbc736bf3f5241918" Apr 17 17:05:52.586027 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:05:52.585823 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:05:52.586027 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:05:52.585960 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:05:58.585755 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:05:58.585728 2575 scope.go:117] "RemoveContainer" containerID="af1beb4d42b822b2d450f424e95cc79b05abaf7c80b49e9a6d1835ee499b865a" Apr 17 17:05:58.586317 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:05:58.585958 2575 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:06:04.585748 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:04.585717 2575 scope.go:117] "RemoveContainer" containerID="8303043098ac6ce1f386dd485612207d98851050ec5e72fdbc736bf3f5241918" Apr 17 17:06:04.586192 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:06:04.585946 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:06:06.586065 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:06.586037 2575 scope.go:117] "RemoveContainer" containerID="d73208ddac7fd90a0591f1bf4ae316d6c3fbb63d30f27589d4e48f7b6488468a" Apr 17 17:06:06.586416 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:06:06.586217 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:06:10.586291 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:10.586261 2575 scope.go:117] "RemoveContainer" containerID="af1beb4d42b822b2d450f424e95cc79b05abaf7c80b49e9a6d1835ee499b865a" Apr 17 17:06:10.586674 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:06:10.586429 2575 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:06:18.586166 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:18.586135 2575 scope.go:117] "RemoveContainer" containerID="8303043098ac6ce1f386dd485612207d98851050ec5e72fdbc736bf3f5241918" Apr 17 17:06:18.586529 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:06:18.586313 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:06:21.591195 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:21.591163 2575 scope.go:117] "RemoveContainer" containerID="d73208ddac7fd90a0591f1bf4ae316d6c3fbb63d30f27589d4e48f7b6488468a" Apr 17 17:06:21.591585 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:06:21.591404 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:06:25.585670 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:25.585641 2575 scope.go:117] "RemoveContainer" containerID="af1beb4d42b822b2d450f424e95cc79b05abaf7c80b49e9a6d1835ee499b865a" Apr 17 17:06:26.394157 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:26.394130 2575 log.go:25] "Finished 
parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/5.log" Apr 17 17:06:26.394521 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:26.394506 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/4.log" Apr 17 17:06:26.394844 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:26.394823 2575 generic.go:358] "Generic (PLEG): container finished" podID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" containerID="3f7070e9dd53f70ba66e1068c2436b7e91fa1f0bd17cd27d499a6994cd7fd0e7" exitCode=2 Apr 17 17:06:26.394934 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:26.394897 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" event={"ID":"1ebec94c-48ff-4ee7-bf97-62f270b0ff2d","Type":"ContainerDied","Data":"3f7070e9dd53f70ba66e1068c2436b7e91fa1f0bd17cd27d499a6994cd7fd0e7"} Apr 17 17:06:26.394976 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:26.394936 2575 scope.go:117] "RemoveContainer" containerID="af1beb4d42b822b2d450f424e95cc79b05abaf7c80b49e9a6d1835ee499b865a" Apr 17 17:06:26.395305 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:26.395293 2575 scope.go:117] "RemoveContainer" containerID="3f7070e9dd53f70ba66e1068c2436b7e91fa1f0bd17cd27d499a6994cd7fd0e7" Apr 17 17:06:26.395532 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:06:26.395513 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:06:27.404285 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:27.404244 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/5.log" Apr 17 17:06:32.392191 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:32.392160 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:06:32.392191 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:32.392192 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" Apr 17 17:06:32.392733 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:32.392576 2575 scope.go:117] "RemoveContainer" containerID="3f7070e9dd53f70ba66e1068c2436b7e91fa1f0bd17cd27d499a6994cd7fd0e7" Apr 17 17:06:32.392802 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:06:32.392784 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:06:33.585641 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:33.585584 2575 scope.go:117] "RemoveContainer" containerID="8303043098ac6ce1f386dd485612207d98851050ec5e72fdbc736bf3f5241918" Apr 17 17:06:33.586104 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:06:33.585796 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:06:34.585697 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:34.585666 2575 scope.go:117] "RemoveContainer" 
containerID="d73208ddac7fd90a0591f1bf4ae316d6c3fbb63d30f27589d4e48f7b6488468a" Apr 17 17:06:34.586055 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:06:34.585838 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:06:44.586353 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:44.586319 2575 scope.go:117] "RemoveContainer" containerID="8303043098ac6ce1f386dd485612207d98851050ec5e72fdbc736bf3f5241918" Apr 17 17:06:44.586769 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:06:44.586493 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:06:45.585793 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:45.585758 2575 scope.go:117] "RemoveContainer" containerID="d73208ddac7fd90a0591f1bf4ae316d6c3fbb63d30f27589d4e48f7b6488468a" Apr 17 17:06:45.586006 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:45.585987 2575 scope.go:117] "RemoveContainer" containerID="3f7070e9dd53f70ba66e1068c2436b7e91fa1f0bd17cd27d499a6994cd7fd0e7" Apr 17 17:06:45.586154 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:06:45.586136 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" 
pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:06:46.478425 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:46.478400 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/5.log" Apr 17 17:06:46.478796 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:46.478767 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/4.log" Apr 17 17:06:46.479049 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:46.479030 2575 generic.go:358] "Generic (PLEG): container finished" podID="2c9bfeda-6173-4819-86ca-12721ab2af78" containerID="635056506d165df925fba48d728631a8289c8b468a352d4966ac5a44dd8e8cc3" exitCode=2 Apr 17 17:06:46.479115 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:46.479099 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" event={"ID":"2c9bfeda-6173-4819-86ca-12721ab2af78","Type":"ContainerDied","Data":"635056506d165df925fba48d728631a8289c8b468a352d4966ac5a44dd8e8cc3"} Apr 17 17:06:46.479160 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:46.479138 2575 scope.go:117] "RemoveContainer" containerID="d73208ddac7fd90a0591f1bf4ae316d6c3fbb63d30f27589d4e48f7b6488468a" Apr 17 17:06:46.479486 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:46.479472 2575 scope.go:117] "RemoveContainer" containerID="635056506d165df925fba48d728631a8289c8b468a352d4966ac5a44dd8e8cc3" Apr 17 17:06:46.479713 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:06:46.479696 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main 
pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:06:47.483681 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:47.483648 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/5.log" Apr 17 17:06:55.389763 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:55.389732 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" Apr 17 17:06:55.389763 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:55.389766 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" Apr 17 17:06:55.390308 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:55.390279 2575 scope.go:117] "RemoveContainer" containerID="635056506d165df925fba48d728631a8289c8b468a352d4966ac5a44dd8e8cc3" Apr 17 17:06:55.390513 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:06:55.390492 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:06:59.587487 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:06:59.587450 2575 scope.go:117] "RemoveContainer" containerID="8303043098ac6ce1f386dd485612207d98851050ec5e72fdbc736bf3f5241918" Apr 17 17:07:00.530234 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:00.530204 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/5.log" Apr 17 17:07:00.530648 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:00.530632 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/4.log" Apr 17 17:07:00.530945 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:00.530926 2575 generic.go:358] "Generic (PLEG): container finished" podID="21d081e3-3d44-4028-a946-e566574f780b" containerID="4d409f95a7f4ed3595fdc7ce46a4b44cc64da4ddcf49357c749761a0112ca44f" exitCode=2 Apr 17 17:07:00.530995 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:00.530984 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" event={"ID":"21d081e3-3d44-4028-a946-e566574f780b","Type":"ContainerDied","Data":"4d409f95a7f4ed3595fdc7ce46a4b44cc64da4ddcf49357c749761a0112ca44f"} Apr 17 17:07:00.531037 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:00.531015 2575 scope.go:117] "RemoveContainer" containerID="8303043098ac6ce1f386dd485612207d98851050ec5e72fdbc736bf3f5241918" Apr 17 17:07:00.531422 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:00.531407 2575 scope.go:117] "RemoveContainer" containerID="4d409f95a7f4ed3595fdc7ce46a4b44cc64da4ddcf49357c749761a0112ca44f" Apr 17 17:07:00.531663 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:07:00.531640 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:07:00.585481 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:00.585457 2575 scope.go:117] "RemoveContainer" 
containerID="3f7070e9dd53f70ba66e1068c2436b7e91fa1f0bd17cd27d499a6994cd7fd0e7" Apr 17 17:07:00.585690 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:07:00.585672 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:07:00.814288 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:00.814252 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:07:00.814288 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:00.814290 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" Apr 17 17:07:01.536417 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:01.536391 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/5.log" Apr 17 17:07:01.537103 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:01.537085 2575 scope.go:117] "RemoveContainer" containerID="4d409f95a7f4ed3595fdc7ce46a4b44cc64da4ddcf49357c749761a0112ca44f" Apr 17 17:07:01.537280 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:07:01.537263 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:07:06.586329 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:06.586300 2575 scope.go:117] "RemoveContainer" 
containerID="635056506d165df925fba48d728631a8289c8b468a352d4966ac5a44dd8e8cc3" Apr 17 17:07:06.586812 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:07:06.586465 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:07:11.586188 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:11.586096 2575 scope.go:117] "RemoveContainer" containerID="3f7070e9dd53f70ba66e1068c2436b7e91fa1f0bd17cd27d499a6994cd7fd0e7" Apr 17 17:07:11.586650 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:07:11.586343 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:07:14.585733 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:14.585705 2575 scope.go:117] "RemoveContainer" containerID="4d409f95a7f4ed3595fdc7ce46a4b44cc64da4ddcf49357c749761a0112ca44f" Apr 17 17:07:14.586083 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:07:14.585878 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:07:19.588790 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:19.588759 2575 scope.go:117] "RemoveContainer" 
containerID="635056506d165df925fba48d728631a8289c8b468a352d4966ac5a44dd8e8cc3" Apr 17 17:07:19.589164 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:07:19.588978 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:07:23.585532 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:23.585498 2575 scope.go:117] "RemoveContainer" containerID="3f7070e9dd53f70ba66e1068c2436b7e91fa1f0bd17cd27d499a6994cd7fd0e7" Apr 17 17:07:23.585908 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:07:23.585742 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:07:28.585729 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:28.585701 2575 scope.go:117] "RemoveContainer" containerID="4d409f95a7f4ed3595fdc7ce46a4b44cc64da4ddcf49357c749761a0112ca44f" Apr 17 17:07:28.586072 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:07:28.585878 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:07:29.521698 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:29.521667 2575 log.go:25] "Finished parsing log 
file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/5.log" Apr 17 17:07:29.522362 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:29.522339 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/5.log" Apr 17 17:07:29.522558 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:29.522544 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/5.log" Apr 17 17:07:29.523005 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:29.522984 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/5.log" Apr 17 17:07:29.523126 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:29.523111 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/5.log" Apr 17 17:07:29.523813 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:29.523797 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/5.log" Apr 17 17:07:29.538657 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:29.538641 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/ovn-acl-logging/0.log" Apr 17 17:07:29.539483 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:29.539459 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/ovn-acl-logging/0.log" Apr 17 17:07:33.585911 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:33.585879 2575 scope.go:117] 
"RemoveContainer" containerID="635056506d165df925fba48d728631a8289c8b468a352d4966ac5a44dd8e8cc3" Apr 17 17:07:33.586274 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:07:33.586069 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:07:37.586154 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:37.586117 2575 scope.go:117] "RemoveContainer" containerID="3f7070e9dd53f70ba66e1068c2436b7e91fa1f0bd17cd27d499a6994cd7fd0e7" Apr 17 17:07:37.586648 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:07:37.586366 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:07:41.585787 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:41.585757 2575 scope.go:117] "RemoveContainer" containerID="4d409f95a7f4ed3595fdc7ce46a4b44cc64da4ddcf49357c749761a0112ca44f" Apr 17 17:07:41.586146 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:07:41.585942 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:07:45.585554 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:45.585519 2575 scope.go:117] 
"RemoveContainer" containerID="635056506d165df925fba48d728631a8289c8b468a352d4966ac5a44dd8e8cc3" Apr 17 17:07:45.585995 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:07:45.585725 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:07:50.585785 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:50.585748 2575 scope.go:117] "RemoveContainer" containerID="3f7070e9dd53f70ba66e1068c2436b7e91fa1f0bd17cd27d499a6994cd7fd0e7" Apr 17 17:07:50.586237 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:07:50.585976 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:07:55.518077 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:55.518048 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-7d55q_b919bc69-0ed5-4ad4-a4d1-817ce3f80916/manager/0.log" Apr 17 17:07:55.585553 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:55.585526 2575 scope.go:117] "RemoveContainer" containerID="4d409f95a7f4ed3595fdc7ce46a4b44cc64da4ddcf49357c749761a0112ca44f" Apr 17 17:07:55.585756 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:07:55.585738 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main 
pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:07:55.632022 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:55.631985 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-9658c6bf9-6fkr2_e814a4fd-ac96-48aa-b929-3e628e56307d/maas-api/0.log" Apr 17 17:07:55.762414 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:55.762388 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-65c4f594b8-n5fjs_45ed88fc-2554-4821-aadc-c8e1f435e4e3/manager/0.log" Apr 17 17:07:55.886057 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:55.886031 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-t275s_8fcf4e20-1636-4dcb-8347-e21b93185bba/manager/2.log" Apr 17 17:07:56.007657 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:56.007634 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6b98d9f7df-2c9lg_4b261f84-2f33-4cba-bb82-2e8401d96c9c/manager/0.log" Apr 17 17:07:56.585211 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:56.585187 2575 scope.go:117] "RemoveContainer" containerID="635056506d165df925fba48d728631a8289c8b468a352d4966ac5a44dd8e8cc3" Apr 17 17:07:56.585559 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:07:56.585350 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:07:58.484232 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:58.484205 2575 log.go:25] "Finished parsing log 
file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-mvjnl_79dfc96d-036c-4f6a-b2fa-973d3a912d06/manager/0.log" Apr 17 17:07:58.835203 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:58.835177 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr_2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e/istio-proxy/0.log" Apr 17 17:07:58.961531 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:58.961505 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-bqjlq_149e5c65-51bb-4de9-8c6d-fa881c826d3d/discovery/0.log" Apr 17 17:07:59.070407 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:59.070385 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-764cf74f-b48g4_66a8ef8d-fd7c-4622-8ba7-c0b43472f01d/kube-auth-proxy/0.log" Apr 17 17:07:59.294735 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:59.294707 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-kvxxr_45da7715-7e3e-4b70-b147-98ed5ee08f7e/istio-proxy/0.log" Apr 17 17:07:59.765177 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:59.765109 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/storage-initializer/0.log" Apr 17 17:07:59.771104 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:59.771087 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_21d081e3-3d44-4028-a946-e566574f780b/main/5.log" Apr 17 17:07:59.882394 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:59.882374 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/storage-initializer/0.log" Apr 
17 17:07:59.888449 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:07:59.888429 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_1ebec94c-48ff-4ee7-bf97-62f270b0ff2d/main/5.log" Apr 17 17:08:00.372134 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:00.372103 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/storage-initializer/0.log" Apr 17 17:08:00.382958 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:00.382937 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_2c9bfeda-6173-4819-86ca-12721ab2af78/main/5.log" Apr 17 17:08:04.585933 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:04.585910 2575 scope.go:117] "RemoveContainer" containerID="3f7070e9dd53f70ba66e1068c2436b7e91fa1f0bd17cd27d499a6994cd7fd0e7" Apr 17 17:08:04.586277 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:08:04.586090 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:08:07.270534 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:07.270507 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-5pjdb_897a5f4a-9096-4195-a831-b7c123ac02f5/global-pull-secret-syncer/0.log" Apr 17 17:08:07.435423 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:07.435394 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-pckwb_3b961c80-2452-4eb9-ba7f-bd5743b6253f/konnectivity-agent/0.log" Apr 17 17:08:07.449399 ip-10-0-132-199 kubenswrapper[2575]: 
I0417 17:08:07.449374 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-199.ec2.internal_12719cc175836885556f66e7ce6f8019/haproxy/0.log" Apr 17 17:08:07.585446 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:07.585416 2575 scope.go:117] "RemoveContainer" containerID="635056506d165df925fba48d728631a8289c8b468a352d4966ac5a44dd8e8cc3" Apr 17 17:08:07.585646 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:08:07.585624 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:08:10.586362 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:10.586330 2575 scope.go:117] "RemoveContainer" containerID="4d409f95a7f4ed3595fdc7ce46a4b44cc64da4ddcf49357c749761a0112ca44f" Apr 17 17:08:10.588751 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:08:10.586506 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:08:11.507271 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:11.507241 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-mvjnl_79dfc96d-036c-4f6a-b2fa-973d3a912d06/manager/0.log" Apr 17 17:08:13.252796 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:13.252767 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-frx24_f23f7a47-f779-461b-a627-6aa06ced398c/node-exporter/0.log" Apr 17 17:08:13.270058 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:13.270035 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-frx24_f23f7a47-f779-461b-a627-6aa06ced398c/kube-rbac-proxy/0.log" Apr 17 17:08:13.298967 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:13.298949 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-frx24_f23f7a47-f779-461b-a627-6aa06ced398c/init-textfile/0.log" Apr 17 17:08:15.814172 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.814130 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv"] Apr 17 17:08:15.817807 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.817785 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:15.820545 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.820529 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gbgn6\"/\"openshift-service-ca.crt\"" Apr 17 17:08:15.821797 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.821775 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gbgn6\"/\"default-dockercfg-s7d7q\"" Apr 17 17:08:15.821881 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.821807 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gbgn6\"/\"kube-root-ca.crt\"" Apr 17 17:08:15.829499 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.829479 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv"] Apr 17 17:08:15.887806 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.887766 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f10f1fce-b84a-4e09-bfae-04929fa29743-proc\") pod \"perf-node-gather-daemonset-ptwqv\" (UID: \"f10f1fce-b84a-4e09-bfae-04929fa29743\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:15.887953 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.887834 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f10f1fce-b84a-4e09-bfae-04929fa29743-sys\") pod \"perf-node-gather-daemonset-ptwqv\" (UID: \"f10f1fce-b84a-4e09-bfae-04929fa29743\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:15.887953 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.887871 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f10f1fce-b84a-4e09-bfae-04929fa29743-lib-modules\") pod \"perf-node-gather-daemonset-ptwqv\" (UID: \"f10f1fce-b84a-4e09-bfae-04929fa29743\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:15.887953 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.887908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlnhr\" (UniqueName: \"kubernetes.io/projected/f10f1fce-b84a-4e09-bfae-04929fa29743-kube-api-access-wlnhr\") pod \"perf-node-gather-daemonset-ptwqv\" (UID: \"f10f1fce-b84a-4e09-bfae-04929fa29743\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:15.887953 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.887939 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f10f1fce-b84a-4e09-bfae-04929fa29743-podres\") pod 
\"perf-node-gather-daemonset-ptwqv\" (UID: \"f10f1fce-b84a-4e09-bfae-04929fa29743\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:15.989135 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.989093 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f10f1fce-b84a-4e09-bfae-04929fa29743-lib-modules\") pod \"perf-node-gather-daemonset-ptwqv\" (UID: \"f10f1fce-b84a-4e09-bfae-04929fa29743\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:15.989288 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.989160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlnhr\" (UniqueName: \"kubernetes.io/projected/f10f1fce-b84a-4e09-bfae-04929fa29743-kube-api-access-wlnhr\") pod \"perf-node-gather-daemonset-ptwqv\" (UID: \"f10f1fce-b84a-4e09-bfae-04929fa29743\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:15.989288 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.989193 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f10f1fce-b84a-4e09-bfae-04929fa29743-podres\") pod \"perf-node-gather-daemonset-ptwqv\" (UID: \"f10f1fce-b84a-4e09-bfae-04929fa29743\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:15.989288 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.989212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f10f1fce-b84a-4e09-bfae-04929fa29743-proc\") pod \"perf-node-gather-daemonset-ptwqv\" (UID: \"f10f1fce-b84a-4e09-bfae-04929fa29743\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:15.989288 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.989251 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f10f1fce-b84a-4e09-bfae-04929fa29743-sys\") pod \"perf-node-gather-daemonset-ptwqv\" (UID: \"f10f1fce-b84a-4e09-bfae-04929fa29743\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:15.989288 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.989270 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f10f1fce-b84a-4e09-bfae-04929fa29743-lib-modules\") pod \"perf-node-gather-daemonset-ptwqv\" (UID: \"f10f1fce-b84a-4e09-bfae-04929fa29743\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:15.989468 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.989326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f10f1fce-b84a-4e09-bfae-04929fa29743-proc\") pod \"perf-node-gather-daemonset-ptwqv\" (UID: \"f10f1fce-b84a-4e09-bfae-04929fa29743\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:15.989468 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.989339 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f10f1fce-b84a-4e09-bfae-04929fa29743-podres\") pod \"perf-node-gather-daemonset-ptwqv\" (UID: \"f10f1fce-b84a-4e09-bfae-04929fa29743\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:15.989468 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:15.989327 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f10f1fce-b84a-4e09-bfae-04929fa29743-sys\") pod \"perf-node-gather-daemonset-ptwqv\" (UID: \"f10f1fce-b84a-4e09-bfae-04929fa29743\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:15.997061 ip-10-0-132-199 kubenswrapper[2575]: 
I0417 17:08:15.997033 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlnhr\" (UniqueName: \"kubernetes.io/projected/f10f1fce-b84a-4e09-bfae-04929fa29743-kube-api-access-wlnhr\") pod \"perf-node-gather-daemonset-ptwqv\" (UID: \"f10f1fce-b84a-4e09-bfae-04929fa29743\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:16.127986 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:16.127910 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:16.252727 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:16.252703 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv"] Apr 17 17:08:16.254303 ip-10-0-132-199 kubenswrapper[2575]: W0417 17:08:16.254275 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf10f1fce_b84a_4e09_bfae_04929fa29743.slice/crio-8a4f53f1e64dd28c5fc2056bfa3020ad797ea4ecb3402fd629beb18ca9be5ec4 WatchSource:0}: Error finding container 8a4f53f1e64dd28c5fc2056bfa3020ad797ea4ecb3402fd629beb18ca9be5ec4: Status 404 returned error can't find the container with id 8a4f53f1e64dd28c5fc2056bfa3020ad797ea4ecb3402fd629beb18ca9be5ec4 Apr 17 17:08:16.256162 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:16.256143 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:08:16.808906 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:16.808870 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" event={"ID":"f10f1fce-b84a-4e09-bfae-04929fa29743","Type":"ContainerStarted","Data":"e9c851ae0a4240698b145008aa66dee01ee702e97fff42fb79455417652f0258"} Apr 17 17:08:16.808906 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:16.808909 2575 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" event={"ID":"f10f1fce-b84a-4e09-bfae-04929fa29743","Type":"ContainerStarted","Data":"8a4f53f1e64dd28c5fc2056bfa3020ad797ea4ecb3402fd629beb18ca9be5ec4"} Apr 17 17:08:16.809111 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:16.808965 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" Apr 17 17:08:16.830120 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:16.830082 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" podStartSLOduration=1.8300674030000001 podStartE2EDuration="1.830067403s" podCreationTimestamp="2026-04-17 17:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:08:16.829556426 +0000 UTC m=+947.738283519" watchObservedRunningTime="2026-04-17 17:08:16.830067403 +0000 UTC m=+947.738794498" Apr 17 17:08:17.456415 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:17.456389 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hhhcz_f0e1678f-0432-409a-8f70-dde3c3bb6e48/dns/0.log" Apr 17 17:08:17.471076 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:17.471051 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hhhcz_f0e1678f-0432-409a-8f70-dde3c3bb6e48/kube-rbac-proxy/0.log" Apr 17 17:08:17.508665 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:17.508645 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mbbjn_8e1eaec4-d649-4750-ab80-f763a4edde6a/dns-node-resolver/0.log" Apr 17 17:08:17.963629 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:17.963584 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-7br24_6e370d23-be74-44de-9ef0-318adb824e76/node-ca/0.log" Apr 17 17:08:18.708444 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:18.708405 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf5jzqr_2bf4061c-9fda-4fe4-bb1b-95f3a9d17f5e/istio-proxy/0.log" Apr 17 17:08:18.760300 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:18.760273 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-bqjlq_149e5c65-51bb-4de9-8c6d-fa881c826d3d/discovery/0.log" Apr 17 17:08:18.775637 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:18.775616 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-764cf74f-b48g4_66a8ef8d-fd7c-4622-8ba7-c0b43472f01d/kube-auth-proxy/0.log" Apr 17 17:08:18.817695 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:18.817675 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-kvxxr_45da7715-7e3e-4b70-b147-98ed5ee08f7e/istio-proxy/0.log" Apr 17 17:08:19.396862 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:19.396836 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-khd9q_e1156e13-0a5f-456f-9c4b-7491034e6fa0/serve-healthcheck-canary/0.log" Apr 17 17:08:19.587952 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:19.587931 2575 scope.go:117] "RemoveContainer" containerID="3f7070e9dd53f70ba66e1068c2436b7e91fa1f0bd17cd27d499a6994cd7fd0e7" Apr 17 17:08:19.588166 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:08:19.588145 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-qd9n4_llm(1ebec94c-48ff-4ee7-bf97-62f270b0ff2d)\"" 
pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-qd9n4" podUID="1ebec94c-48ff-4ee7-bf97-62f270b0ff2d" Apr 17 17:08:19.816306 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:19.816275 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7c82q_0ccf217d-8ec0-486b-806c-87d885bd71dd/kube-rbac-proxy/0.log" Apr 17 17:08:19.830870 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:19.830848 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7c82q_0ccf217d-8ec0-486b-806c-87d885bd71dd/exporter/0.log" Apr 17 17:08:19.848420 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:19.848401 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7c82q_0ccf217d-8ec0-486b-806c-87d885bd71dd/extractor/0.log" Apr 17 17:08:21.586145 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:21.586118 2575 scope.go:117] "RemoveContainer" containerID="635056506d165df925fba48d728631a8289c8b468a352d4966ac5a44dd8e8cc3" Apr 17 17:08:21.586494 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:08:21.586293 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-vmpjw_llm(2c9bfeda-6173-4819-86ca-12721ab2af78)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-vmpjw" podUID="2c9bfeda-6173-4819-86ca-12721ab2af78" Apr 17 17:08:21.681099 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:21.681073 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-7d55q_b919bc69-0ed5-4ad4-a4d1-817ce3f80916/manager/0.log" Apr 17 17:08:21.696704 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:21.696677 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_maas-api-9658c6bf9-6fkr2_e814a4fd-ac96-48aa-b929-3e628e56307d/maas-api/0.log" Apr 17 17:08:21.714358 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:21.714332 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-65c4f594b8-n5fjs_45ed88fc-2554-4821-aadc-c8e1f435e4e3/manager/0.log" Apr 17 17:08:21.729716 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:21.729696 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-t275s_8fcf4e20-1636-4dcb-8347-e21b93185bba/manager/1.log" Apr 17 17:08:21.738259 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:21.738239 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-t275s_8fcf4e20-1636-4dcb-8347-e21b93185bba/manager/2.log" Apr 17 17:08:21.761143 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:21.761123 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6b98d9f7df-2c9lg_4b261f84-2f33-4cba-bb82-2e8401d96c9c/manager/0.log" Apr 17 17:08:22.585648 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:22.585618 2575 scope.go:117] "RemoveContainer" containerID="4d409f95a7f4ed3595fdc7ce46a4b44cc64da4ddcf49357c749761a0112ca44f" Apr 17 17:08:22.585815 ip-10-0-132-199 kubenswrapper[2575]: E0417 17:08:22.585794 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2_llm(21d081e3-3d44-4028-a946-e566574f780b)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-tptx2" podUID="21d081e3-3d44-4028-a946-e566574f780b" Apr 17 17:08:22.822084 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:22.822049 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-ptwqv" 
Apr 17 17:08:28.319360 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:28.319325 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4wnr7_842dbf17-4840-4948-b464-a2890415d77a/kube-multus/0.log" Apr 17 17:08:28.342151 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:28.342128 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-78nlz_89263846-e4e9-4dd4-bdb0-dfa9ae5e8995/kube-multus-additional-cni-plugins/0.log" Apr 17 17:08:28.364243 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:28.364220 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-78nlz_89263846-e4e9-4dd4-bdb0-dfa9ae5e8995/egress-router-binary-copy/0.log" Apr 17 17:08:28.388185 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:28.388158 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-78nlz_89263846-e4e9-4dd4-bdb0-dfa9ae5e8995/cni-plugins/0.log" Apr 17 17:08:28.407259 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:28.407243 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-78nlz_89263846-e4e9-4dd4-bdb0-dfa9ae5e8995/bond-cni-plugin/0.log" Apr 17 17:08:28.426733 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:28.426717 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-78nlz_89263846-e4e9-4dd4-bdb0-dfa9ae5e8995/routeoverride-cni/0.log" Apr 17 17:08:28.445236 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:28.445219 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-78nlz_89263846-e4e9-4dd4-bdb0-dfa9ae5e8995/whereabouts-cni-bincopy/0.log" Apr 17 17:08:28.462981 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:28.462962 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-78nlz_89263846-e4e9-4dd4-bdb0-dfa9ae5e8995/whereabouts-cni/0.log" Apr 17 17:08:28.868903 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:28.868876 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-pm56t_6064206c-e379-4668-9aa8-a2165341d497/network-metrics-daemon/0.log" Apr 17 17:08:28.888282 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:28.888259 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-pm56t_6064206c-e379-4668-9aa8-a2165341d497/kube-rbac-proxy/0.log" Apr 17 17:08:30.350181 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:30.350155 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/ovn-controller/0.log" Apr 17 17:08:30.363334 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:30.363307 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/ovn-acl-logging/0.log" Apr 17 17:08:30.367353 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:30.367337 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/ovn-acl-logging/1.log" Apr 17 17:08:30.382130 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:30.382104 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/kube-rbac-proxy-node/0.log" Apr 17 17:08:30.398814 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:30.398795 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 17:08:30.413283 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:30.413263 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/northd/0.log" Apr 17 17:08:30.428741 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:30.428705 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/nbdb/0.log" Apr 17 17:08:30.445229 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:30.445215 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/sbdb/0.log" Apr 17 17:08:30.547173 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:30.547129 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8dsh_b2fc59c3-8d03-4d73-94bf-91312f60a7c5/ovnkube-controller/0.log" Apr 17 17:08:31.628535 ip-10-0-132-199 kubenswrapper[2575]: I0417 17:08:31.628505 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-fhjz7_134235c0-0964-4070-b83c-8e7e912a6f98/network-check-target-container/0.log"