Apr 16 20:27:24.916201 ip-10-0-137-53 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 20:27:25.351714 ip-10-0-137-53 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:27:25.351714 ip-10-0-137-53 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 20:27:25.351714 ip-10-0-137-53 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:27:25.351714 ip-10-0-137-53 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 20:27:25.351714 ip-10-0-137-53 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
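The deprecation warnings above all point at the same remedy: move these flags into the kubelet config file named by --config. A minimal sketch of the equivalent KubeletConfiguration fields (field names per the upstream kubelet config API; the values shown here are illustrative, not taken from this node):

```yaml
# Illustrative fragment of /etc/kubernetes/kubelet.conf (assumed path, per the
# --config flag logged below) -- not the actual config from this node.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"   # replaces --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec" # replaces --volume-plugin-dir
systemReserved:                                               # replaces --system-reserved
  cpu: "500m"
  memory: "1Gi"
evictionHard:                                                 # --minimum-container-ttl-duration's warning
  memory.available: "100Mi"                                   # says to use eviction settings instead
```

Note that --pod-infra-container-image has no config-file replacement; per the warning, the sandbox image is taken from the CRI runtime's own configuration.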
Apr 16 20:27:25.352845 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.352627 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 20:27:25.355659 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355644 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:27:25.355659 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355658 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355662 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355665 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355669 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355672 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355675 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355678 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355681 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355684 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355687 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355690 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355692 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355696 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355704 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355707 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355709 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355712 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355715 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355717 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355720 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:27:25.355725 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355723 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355726 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355728 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355731 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355734 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355737 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355740 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355743 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355745 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355748 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355750 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355753 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355756 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355758 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355761 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355765 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355768 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355771 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355774 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:27:25.356385 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355777 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355779 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355782 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355784 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355787 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355791 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355794 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355797 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355800 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355802 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355806 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355809 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355812 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355814 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355816 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355820 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355823 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355825 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355828 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:27:25.357028 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355830 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355833 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355835 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355838 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355841 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355843 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355846 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355848 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355851 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355853 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355855 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355858 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355861 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355863 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355866 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355868 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355871 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355873 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355876 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355880 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:27:25.357510 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355882 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355885 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355888 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355891 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355893 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355896 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.355898 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357792 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357803 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357806 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357810 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357813 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357816 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357819 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357821 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357824 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357827 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357830 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357832 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357835 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:27:25.358020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357838 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357840 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357843 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357846 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357848 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357851 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357857 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357860 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357862 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357865 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357868 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357870 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357873 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357876 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357879 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357881 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357884 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357887 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357889 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357892 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:27:25.358495 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357895 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357898 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357901 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357903 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357906 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357908 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357911 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357914 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357916 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357919 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357922 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357924 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357927 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357929 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357932 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357934 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357938 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357956 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357960 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:27:25.359038 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357965 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357967 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357970 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357973 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357977 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357981 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357984 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357988 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357991 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357994 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.357997 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358000 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358003 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358007 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358010 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358012 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358015 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358018 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358021 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358024 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:27:25.359508 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358027 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358029 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358032 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358035 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358038 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358047 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358050 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358052 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358055 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358058 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358060 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358063 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358066 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358068 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358139 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358146 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358153 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358157 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358166 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358170 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358175 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358179 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 20:27:25.360020 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358182 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358185 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358189 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358192 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358196 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358199 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358202 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358204 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358207 2578 flags.go:64] FLAG: --cloud-config=""
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358210 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358213 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358220 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358223 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358227 2578 flags.go:64] FLAG: --config-dir=""
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358230 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358233 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358242 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358245 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358249 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358252 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358256 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358259 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358262 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358265 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358268 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 20:27:25.360551 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358272 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358275 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358278 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358280 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358284 2578 flags.go:64] FLAG: --enable-server="true"
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358287 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358294 2578 flags.go:64] FLAG: --event-burst="100"
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358297 2578 flags.go:64] FLAG: --event-qps="50"
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358300 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358303 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358306 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358310 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358313 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358317 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358320 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358323 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358325 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358328 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358331 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358334 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358337 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416
20:27:25.358339 2578 flags.go:64] FLAG: --feature-gates="" Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358343 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358346 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358355 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 20:27:25.361180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358358 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358361 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358364 2578 flags.go:64] FLAG: --help="false" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358367 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-137-53.ec2.internal" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358370 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358373 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358376 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358379 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358382 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358385 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 20:27:25.361781 
ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358388 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358391 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358394 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358397 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358400 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358403 2578 flags.go:64] FLAG: --kube-reserved="" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358406 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358409 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358412 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358415 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358418 2578 flags.go:64] FLAG: --lock-file="" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358421 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358424 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358426 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 20:27:25.361781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358432 2578 flags.go:64] FLAG: 
--log-json-split-stream="false" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358434 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358437 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358440 2578 flags.go:64] FLAG: --logging-format="text" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358443 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358446 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358449 2578 flags.go:64] FLAG: --manifest-url="" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358452 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358464 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358467 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358471 2578 flags.go:64] FLAG: --max-pods="110" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358474 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358477 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358480 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358483 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 20:27:25.362396 ip-10-0-137-53 
kubenswrapper[2578]: I0416 20:27:25.358486 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358489 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358492 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358499 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358503 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358506 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358509 2578 flags.go:64] FLAG: --pod-cidr="" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358512 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 20:27:25.362396 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358518 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358521 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358524 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358527 2578 flags.go:64] FLAG: --port="10250" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358530 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358533 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0bd57462387f509a4" Apr 16 
20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358537 2578 flags.go:64] FLAG: --qos-reserved="" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358540 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358542 2578 flags.go:64] FLAG: --register-node="true" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358545 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358548 2578 flags.go:64] FLAG: --register-with-taints="" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358552 2578 flags.go:64] FLAG: --registry-burst="10" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358555 2578 flags.go:64] FLAG: --registry-qps="5" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358558 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358560 2578 flags.go:64] FLAG: --reserved-memory="" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358564 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358567 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358570 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358578 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358581 2578 flags.go:64] FLAG: --runonce="false" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358584 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 20:27:25.363056 
ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358587 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358590 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358592 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358595 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358599 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 20:27:25.363056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358602 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358605 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358608 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358611 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358614 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358617 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358620 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358623 2578 flags.go:64] FLAG: --system-cgroups="" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358625 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 
20:27:25.358631 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358634 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358636 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358643 2578 flags.go:64] FLAG: --tls-min-version="" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358646 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358648 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358651 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358654 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358657 2578 flags.go:64] FLAG: --v="2" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358661 2578 flags.go:64] FLAG: --version="false" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358665 2578 flags.go:64] FLAG: --vmodule="" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358670 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.358673 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358759 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:27:25.363684 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358762 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:27:25.363684 
ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358765 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358774 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358776 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358779 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358781 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358784 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358786 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358789 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358792 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358795 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358797 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358800 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358802 2578 feature_gate.go:328] 
unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358805 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358807 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358810 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358812 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358815 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358817 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358821 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 20:27:25.364284 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358824 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358827 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358829 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358832 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358834 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358836 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358839 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358846 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358849 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358851 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358854 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358856 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358859 
2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358861 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358869 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358872 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358875 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358877 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358880 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:27:25.364831 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358882 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358885 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358887 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358890 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358893 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358895 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:27:25.365369 ip-10-0-137-53 
kubenswrapper[2578]: W0416 20:27:25.358898 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358900 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358903 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358905 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358908 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358910 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358913 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358916 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358918 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358921 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358924 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358927 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358929 2578 
feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:27:25.365369 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358932 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358936 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358951 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358954 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358957 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358960 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358963 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358965 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358968 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358976 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358979 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358981 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358984 2578 
feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358987 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358989 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358992 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.358996 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.359000 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.359003 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.359005 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:27:25.365856 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.359008 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:27:25.366358 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.359011 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:27:25.366358 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.359014 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:27:25.366358 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.359018 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:27:25.366358 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.359020 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:27:25.366358 
ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.359023 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:27:25.366358 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.360075 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:27:25.366800 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.366783 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 20:27:25.366836 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.366801 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 20:27:25.366866 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366856 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:27:25.366866 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366861 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:27:25.366866 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366865 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366869 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366872 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366875 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 
20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366878 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366881 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366883 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366886 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366889 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366892 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366899 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366902 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366905 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366908 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366911 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366913 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366916 2578 feature_gate.go:328] unrecognized feature 
gate: GCPCustomAPIEndpoints Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366919 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366921 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366924 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:27:25.366958 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366927 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366929 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366932 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366934 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366937 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366954 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366958 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366962 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366966 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366969 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366972 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366974 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366978 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366981 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366984 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366987 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366990 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366993 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.366995 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:27:25.367456 ip-10-0-137-53 kubenswrapper[2578]: W0416 
20:27:25.366998 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367001 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367003 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367006 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367010 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367012 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367015 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367018 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367020 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367023 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367025 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367028 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367031 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 
20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367033 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367035 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367039 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367042 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367045 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367047 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367050 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:27:25.367918 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367053 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367055 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367058 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367060 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367063 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367065 2578 
feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367068 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367070 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367073 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367075 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367078 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367080 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367083 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367085 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367088 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367090 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367093 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367096 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367098 2578 
feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367101 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:27:25.368457 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367103 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:27:25.368933 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367106 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:27:25.368933 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367108 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:27:25.368933 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367111 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:27:25.368933 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367113 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:27:25.368933 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.367118 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:27:25.368933 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367217 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:27:25.368933 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367222 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:27:25.368933 ip-10-0-137-53 kubenswrapper[2578]: W0416 
20:27:25.367226 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:27:25.368933 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367229 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:27:25.368933 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367232 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:27:25.368933 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367234 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:27:25.368933 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367237 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:27:25.368933 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367240 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:27:25.368933 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367242 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:27:25.368933 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367245 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:27:25.368933 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367247 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367250 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367252 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367255 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367257 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:27:25.369401 ip-10-0-137-53 
kubenswrapper[2578]: W0416 20:27:25.367260 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367263 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367265 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367268 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367270 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367273 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367276 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367279 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367282 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367285 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367287 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367290 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367292 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367295 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367297 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:27:25.369401 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367300 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367302 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367305 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367308 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367311 2578 feature_gate.go:328] unrecognized 
feature gate: GCPCustomAPIEndpoints Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367314 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367316 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367319 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367321 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367324 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367327 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367329 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367332 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367334 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367337 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367339 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367342 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: 
W0416 20:27:25.367344 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367347 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367349 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:27:25.369883 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367352 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367354 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367357 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367359 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367362 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367366 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367369 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367372 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367375 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367378 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367381 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367383 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367386 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367389 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367391 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367394 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367398 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367401 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367404 2578 feature_gate.go:328] unrecognized 
feature gate: NoRegistryClusterOperations Apr 16 20:27:25.370386 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367406 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:27:25.370842 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367409 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:27:25.370842 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367411 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:27:25.370842 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367414 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:27:25.370842 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367416 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:27:25.370842 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367419 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:27:25.370842 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367421 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:27:25.370842 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367424 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:27:25.370842 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367426 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:27:25.370842 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367429 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:27:25.370842 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367431 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:27:25.370842 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367434 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:27:25.370842 
ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367436 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:27:25.370842 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367439 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:27:25.370842 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367441 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:27:25.370842 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367444 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:27:25.370842 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:25.367447 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:27:25.371287 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.367451 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:27:25.371287 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.367550 2578 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 20:27:25.371287 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.370775 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 20:27:25.371783 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.371772 2578 server.go:1019] "Starting client certificate rotation" Apr 16 20:27:25.371898 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.371879 2578 certificate_manager.go:422] "Certificate rotation is 
enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:27:25.371967 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.371932 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:27:25.399501 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.399481 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:27:25.401973 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.401958 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:27:25.416438 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.416421 2578 log.go:25] "Validated CRI v1 runtime API" Apr 16 20:27:25.422634 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.422618 2578 log.go:25] "Validated CRI v1 image API" Apr 16 20:27:25.423852 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.423832 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 20:27:25.425481 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.425465 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:27:25.428611 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.428594 2578 fs.go:135] Filesystem UUIDs: map[3af47bc0-75cc-4abb-b1ec-62eb3c4bb18c:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 c1417844-36da-4f39-9091-b249a6f1098e:/dev/nvme0n1p3] Apr 16 20:27:25.428669 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.428612 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run 
major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 20:27:25.433406 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.433307 2578 manager.go:217] Machine: {Timestamp:2026-04-16 20:27:25.432263679 +0000 UTC m=+0.403848660 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3123034 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21a0dfa85e591808bbf973303241c6 SystemUUID:ec21a0df-a85e-5918-08bb-f973303241c6 BootID:1ecebc56-049e-4620-a752-0a87ef8f0673 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7f:12:11:3c:bb Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7f:12:11:3c:bb Speed:0 Mtu:9001} {Name:ovs-system MacAddress:96:ed:52:e3:ab:ad Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified 
Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 20:27:25.433406 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.433401 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 20:27:25.433509 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.433482 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 20:27:25.434361 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.434336 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 20:27:25.434497 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.434363 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-137-53.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 20:27:25.434579 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.434508 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 20:27:25.434579 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.434516 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 20:27:25.434579 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.434528 2578 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:27:25.435164 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.435154 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:27:25.436595 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.436584 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:27:25.436703 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.436694 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 20:27:25.439124 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.439115 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 16 20:27:25.439155 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.439133 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 20:27:25.439155 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.439145 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 20:27:25.439155 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.439154 2578 kubelet.go:397] "Adding apiserver pod source" Apr 16 20:27:25.439240 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.439162 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 20:27:25.440119 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.440109 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:27:25.440172 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.440125 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:27:25.442960 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.442931 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 20:27:25.444790 ip-10-0-137-53 
kubenswrapper[2578]: I0416 20:27:25.444778 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 20:27:25.446148 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.446135 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 20:27:25.446195 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.446153 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 20:27:25.446195 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.446159 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 20:27:25.446195 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.446164 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 20:27:25.446195 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.446170 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 20:27:25.446195 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.446176 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 20:27:25.446195 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.446184 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 20:27:25.446195 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.446191 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 20:27:25.446381 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.446203 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 20:27:25.446381 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.446213 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 20:27:25.446381 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.446225 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 20:27:25.446381 
ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.446239 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 20:27:25.447213 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.447200 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 20:27:25.447250 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.447215 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 20:27:25.451145 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.451131 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 20:27:25.451226 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.451168 2578 server.go:1295] "Started kubelet" Apr 16 20:27:25.451311 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.451279 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 20:27:25.451401 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.451269 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 20:27:25.451444 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.451433 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 20:27:25.452664 ip-10-0-137-53 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 20:27:25.453338 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.453319 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 20:27:25.453999 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.453981 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-53.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 20:27:25.454080 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.453977 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 20:27:25.454080 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.454042 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-53.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 20:27:25.455072 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.455054 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 16 20:27:25.459262 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.459244 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 20:27:25.459550 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.459324 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 20:27:25.460145 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.460114 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 20:27:25.460223 ip-10-0-137-53 
kubenswrapper[2578]: I0416 20:27:25.460175 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 20:27:25.460291 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.460251 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 20:27:25.460370 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.460357 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 16 20:27:25.460534 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.460522 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 16 20:27:25.460967 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.460935 2578 factory.go:153] Registering CRI-O factory Apr 16 20:27:25.461031 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.460983 2578 factory.go:223] Registration of the crio container factory successfully Apr 16 20:27:25.461031 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.461026 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 20:27:25.461117 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.461035 2578 factory.go:55] Registering systemd factory Apr 16 20:27:25.461117 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.461043 2578 factory.go:223] Registration of the systemd container factory successfully Apr 16 20:27:25.461117 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.461060 2578 factory.go:103] Registering Raw factory Apr 16 20:27:25.461117 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.461074 2578 manager.go:1196] Started watching for new ooms in manager Apr 16 20:27:25.461748 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.460596 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-137-53.ec2.internal.18a6f03c7255655d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-53.ec2.internal,UID:ip-10-0-137-53.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-53.ec2.internal,},FirstTimestamp:2026-04-16 20:27:25.451142493 +0000 UTC m=+0.422727474,LastTimestamp:2026-04-16 20:27:25.451142493 +0000 UTC m=+0.422727474,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-53.ec2.internal,}" Apr 16 20:27:25.461748 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.461741 2578 manager.go:319] Starting recovery of all containers Apr 16 20:27:25.461892 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.461811 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-53.ec2.internal\" not found" Apr 16 20:27:25.464049 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.464012 2578 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 20:27:25.470871 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.470855 2578 manager.go:324] Recovery completed Apr 16 20:27:25.472709 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.472569 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-53.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 20:27:25.472776 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.472574 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 20:27:25.473193 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.473172 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-26g6m" Apr 16 20:27:25.475482 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.475470 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:27:25.478599 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.478584 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-26g6m" Apr 16 20:27:25.480498 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.480483 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:27:25.480575 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.480508 2578 kubelet_node_status.go:736] "Recording event 
message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:27:25.480575 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.480518 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:27:25.481051 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.481035 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 20:27:25.481051 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.481048 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 20:27:25.481186 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.481063 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:27:25.483347 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.483335 2578 policy_none.go:49] "None policy: Start" Apr 16 20:27:25.483385 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.483351 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 20:27:25.483385 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.483360 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 16 20:27:25.535158 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.535145 2578 manager.go:341] "Starting Device Plugin manager" Apr 16 20:27:25.535255 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.535172 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 20:27:25.535255 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.535181 2578 server.go:85] "Starting device plugin registration server" Apr 16 20:27:25.535423 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.535411 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 20:27:25.535485 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.535425 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 20:27:25.535546 
ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.535527 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 20:27:25.535625 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.535609 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 20:27:25.535625 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.535623 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 20:27:25.536159 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.536140 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 20:27:25.536227 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.536173 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-53.ec2.internal\" not found" Apr 16 20:27:25.599468 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.599439 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 20:27:25.600619 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.600598 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 20:27:25.600619 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.600621 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 20:27:25.600761 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.600636 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 20:27:25.600761 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.600647 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 20:27:25.600761 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.600683 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 20:27:25.603414 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.603373 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:27:25.636140 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.636126 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:27:25.636849 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.636835 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:27:25.636897 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.636862 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:27:25.636897 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.636875 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:27:25.636897 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.636894 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-53.ec2.internal" Apr 16 20:27:25.643538 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.643526 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-53.ec2.internal" Apr 16 20:27:25.643579 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.643544 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-53.ec2.internal\": node \"ip-10-0-137-53.ec2.internal\" not found" Apr 16 20:27:25.658745 
ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.658727 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-53.ec2.internal\" not found" Apr 16 20:27:25.700923 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.700898 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-53.ec2.internal"] Apr 16 20:27:25.700992 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.700972 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:27:25.703135 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.703114 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:27:25.703218 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.703141 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:27:25.703218 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.703152 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:27:25.704207 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.704195 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:27:25.704368 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.704352 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal" Apr 16 20:27:25.704427 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.704386 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:27:25.706052 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.706038 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:27:25.706128 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.706060 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:27:25.706128 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.706069 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:27:25.706128 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.706083 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:27:25.706128 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.706103 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:27:25.706128 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.706115 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:27:25.707247 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.707233 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-53.ec2.internal" Apr 16 20:27:25.707302 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.707260 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:27:25.707928 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.707915 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:27:25.708022 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.707935 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:27:25.708022 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.707961 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:27:25.722452 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.722435 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-53.ec2.internal\" not found" node="ip-10-0-137-53.ec2.internal" Apr 16 20:27:25.726743 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.726728 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-53.ec2.internal\" not found" node="ip-10-0-137-53.ec2.internal" Apr 16 20:27:25.759804 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.759785 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-53.ec2.internal\" not found" Apr 16 20:27:25.860665 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.860618 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-53.ec2.internal\" not found" Apr 16 20:27:25.861762 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.861747 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3d2a2667ea6d6ce4fbc4ca25ae7466fd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal\" (UID: \"3d2a2667ea6d6ce4fbc4ca25ae7466fd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal" Apr 16 20:27:25.861816 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.861773 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d2a2667ea6d6ce4fbc4ca25ae7466fd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal\" (UID: \"3d2a2667ea6d6ce4fbc4ca25ae7466fd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal" Apr 16 20:27:25.861816 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.861789 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dbf50ce40dd2f1fe9119f12e04feaaf5-config\") pod \"kube-apiserver-proxy-ip-10-0-137-53.ec2.internal\" (UID: \"dbf50ce40dd2f1fe9119f12e04feaaf5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-53.ec2.internal" Apr 16 20:27:25.961235 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:25.961215 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-53.ec2.internal\" not found" Apr 16 20:27:25.962351 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.962334 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3d2a2667ea6d6ce4fbc4ca25ae7466fd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal\" (UID: \"3d2a2667ea6d6ce4fbc4ca25ae7466fd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal" Apr 16 20:27:25.962420 ip-10-0-137-53 
kubenswrapper[2578]: I0416 20:27:25.962366 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d2a2667ea6d6ce4fbc4ca25ae7466fd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal\" (UID: \"3d2a2667ea6d6ce4fbc4ca25ae7466fd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal" Apr 16 20:27:25.962420 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.962393 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dbf50ce40dd2f1fe9119f12e04feaaf5-config\") pod \"kube-apiserver-proxy-ip-10-0-137-53.ec2.internal\" (UID: \"dbf50ce40dd2f1fe9119f12e04feaaf5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-53.ec2.internal" Apr 16 20:27:25.962504 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.962428 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d2a2667ea6d6ce4fbc4ca25ae7466fd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal\" (UID: \"3d2a2667ea6d6ce4fbc4ca25ae7466fd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal" Apr 16 20:27:25.962504 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.962435 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dbf50ce40dd2f1fe9119f12e04feaaf5-config\") pod \"kube-apiserver-proxy-ip-10-0-137-53.ec2.internal\" (UID: \"dbf50ce40dd2f1fe9119f12e04feaaf5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-53.ec2.internal" Apr 16 20:27:25.962504 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:25.962430 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3d2a2667ea6d6ce4fbc4ca25ae7466fd-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal\" (UID: \"3d2a2667ea6d6ce4fbc4ca25ae7466fd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal" Apr 16 20:27:26.024474 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.024457 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal" Apr 16 20:27:26.028843 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.028828 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-53.ec2.internal" Apr 16 20:27:26.061551 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:26.061531 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-53.ec2.internal\" not found" Apr 16 20:27:26.161955 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:26.161901 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-53.ec2.internal\" not found" Apr 16 20:27:26.262296 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:26.262280 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-53.ec2.internal\" not found" Apr 16 20:27:26.362714 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:26.362692 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-53.ec2.internal\" not found" Apr 16 20:27:26.371908 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.371897 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 20:27:26.372028 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.372014 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected 
watch close - watch lasted less than a second and no items received" Apr 16 20:27:26.388428 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.388410 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:27:26.388484 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.388457 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:27:26.440342 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.440293 2578 apiserver.go:52] "Watching apiserver" Apr 16 20:27:26.448752 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.448730 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 20:27:26.450659 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.450640 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-ks4qt","openshift-cluster-node-tuning-operator/tuned-9lrtn","openshift-network-operator/iptables-alerter-tlddj","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h","openshift-dns/node-resolver-ltdrj","openshift-image-registry/node-ca-l6t5h","openshift-multus/multus-8v2qr","openshift-multus/multus-additional-cni-plugins-v5lrp","openshift-multus/network-metrics-daemon-tjgd4","openshift-network-diagnostics/network-check-target-p2hmj","openshift-ovn-kubernetes/ovnkube-node-wsr25"] Apr 16 20:27:26.452084 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.452068 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ks4qt" Apr 16 20:27:26.453570 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.453336 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.454900 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.454876 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 20:27:26.454992 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.454924 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 20:27:26.455612 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.455592 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-4hfkv\"" Apr 16 20:27:26.455707 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.455634 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fm5nh\"" Apr 16 20:27:26.456165 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.456143 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:27:26.456165 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.456158 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 20:27:26.456288 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.456238 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" Apr 16 20:27:26.456418 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.456398 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ltdrj" Apr 16 20:27:26.458188 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.458170 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-l6t5h" Apr 16 20:27:26.458515 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.458500 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 20:27:26.458569 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.458555 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 20:27:26.458724 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.458706 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 20:27:26.458724 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.458722 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7ncss\"" Apr 16 20:27:26.458847 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.458763 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 20:27:26.458847 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.458763 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mxqm9\"" Apr 16 20:27:26.459307 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.459291 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 20:27:26.459432 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.459418 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.459901 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.459883 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-53.ec2.internal" Apr 16 20:27:26.459901 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.459894 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 20:27:26.460248 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.460231 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 20:27:26.460311 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.460250 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 20:27:26.460311 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.460298 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6lc9h\"" Apr 16 20:27:26.460414 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.460249 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 20:27:26.460711 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.460698 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v5lrp" Apr 16 20:27:26.461305 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.461290 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-54nkv\"" Apr 16 20:27:26.461379 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.461298 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 20:27:26.461379 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.461331 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 20:27:26.461499 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.461465 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 20:27:26.461579 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.461566 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 20:27:26.461832 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.461821 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:26.461889 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:26.461874 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:27:26.462692 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.462676 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 20:27:26.462849 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.462831 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 20:27:26.462915 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.462889 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:26.462995 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.462893 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-t4m24\"" Apr 16 20:27:26.462995 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:26.462978 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p2hmj" podUID="47eb8a0b-d7e2-4e86-a199-e21ebcaeb743" Apr 16 20:27:26.464061 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464035 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" Apr 16 20:27:26.464061 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464062 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-multus-cni-dir\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.464204 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464077 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-cnibin\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.464204 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464091 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-sys\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.464204 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464106 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-lib-modules\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.464204 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464124 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k42bp\" (UniqueName: \"kubernetes.io/projected/8ab3b913-84db-4ba1-b661-7f31e898d4c1-kube-api-access-k42bp\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.464204 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464142 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/151f9e87-8756-4668-8ff7-d2417f6d4658-os-release\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp" Apr 16 20:27:26.464204 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464157 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.464204 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464187 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-modprobe-d\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.464537 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464241 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzn8t\" (UniqueName: \"kubernetes.io/projected/4294b5bf-8386-4471-acff-325cd2af8bfb-kube-api-access-nzn8t\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.464537 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464264 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ae461ccf-29e2-4d21-9a23-605ab7aac065-serviceca\") pod \"node-ca-l6t5h\" (UID: \"ae461ccf-29e2-4d21-9a23-605ab7aac065\") " pod="openshift-image-registry/node-ca-l6t5h" Apr 16 20:27:26.464537 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464288 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/10df2057-e291-4486-a0e1-89f26c4dbee9-hosts-file\") pod \"node-resolver-ltdrj\" (UID: \"10df2057-e291-4486-a0e1-89f26c4dbee9\") " pod="openshift-dns/node-resolver-ltdrj" Apr 16 20:27:26.464537 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464327 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-run-multus-certs\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.464537 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464352 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-sysconfig\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.464537 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464379 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-multus-socket-dir-parent\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.464537 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464417 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-var-lib-cni-bin\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.464537 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464445 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-os-release\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.464537 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464470 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-run-k8s-cni-cncf-io\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.464537 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464488 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/151f9e87-8756-4668-8ff7-d2417f6d4658-cni-binary-copy\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp" Apr 16 20:27:26.464537 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464501 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-run\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.464537 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464514 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-run-netns\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.464537 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464533 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/151f9e87-8756-4668-8ff7-d2417f6d4658-cnibin\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp" Apr 16 
20:27:26.464537 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464551 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-sysctl-d\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.464537 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464565 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg7cx\" (UniqueName: \"kubernetes.io/projected/facc1f83-664e-47d0-a625-cb11e5043a9e-kube-api-access-hg7cx\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" Apr 16 20:27:26.465190 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ab3b913-84db-4ba1-b661-7f31e898d4c1-cni-binary-copy\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.465190 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464609 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/151f9e87-8756-4668-8ff7-d2417f6d4658-system-cni-dir\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp" Apr 16 20:27:26.465190 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464629 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-sysctl-conf\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.465190 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464650 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-registration-dir\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" Apr 16 20:27:26.465190 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464665 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-etc-selinux\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" Apr 16 20:27:26.465190 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464680 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-sys-fs\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" Apr 16 20:27:26.465190 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464696 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-var-lib-kubelet\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.465190 ip-10-0-137-53 
kubenswrapper[2578]: I0416 20:27:26.464715 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/151f9e87-8756-4668-8ff7-d2417f6d4658-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp" Apr 16 20:27:26.465190 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464730 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/151f9e87-8756-4668-8ff7-d2417f6d4658-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp" Apr 16 20:27:26.465190 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464745 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/42e24607-0fa8-4092-8d9d-a523e82990ff-agent-certs\") pod \"konnectivity-agent-ks4qt\" (UID: \"42e24607-0fa8-4092-8d9d-a523e82990ff\") " pod="kube-system/konnectivity-agent-ks4qt" Apr 16 20:27:26.465190 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464765 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-systemd\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.465190 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464789 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-host\") 
pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.465190 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464834 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-tuned\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.465190 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464865 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae461ccf-29e2-4d21-9a23-605ab7aac065-host\") pod \"node-ca-l6t5h\" (UID: \"ae461ccf-29e2-4d21-9a23-605ab7aac065\") " pod="openshift-image-registry/node-ca-l6t5h" Apr 16 20:27:26.465190 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464907 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-socket-dir\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" Apr 16 20:27:26.465190 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464932 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-hostroot\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.465781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.464980 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-etc-kubernetes\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.465781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.465004 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-system-cni-dir\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.465781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.465026 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr5pg\" (UniqueName: \"kubernetes.io/projected/10df2057-e291-4486-a0e1-89f26c4dbee9-kube-api-access-jr5pg\") pod \"node-resolver-ltdrj\" (UID: \"10df2057-e291-4486-a0e1-89f26c4dbee9\") " pod="openshift-dns/node-resolver-ltdrj" Apr 16 20:27:26.465781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.465055 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-var-lib-cni-multus\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.465781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.465071 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-kubernetes\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.465781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.465112 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-var-lib-kubelet\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.465781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.465145 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-device-dir\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h"
Apr 16 20:27:26.465781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.465193 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8ab3b913-84db-4ba1-b661-7f31e898d4c1-multus-daemon-config\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.465781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.465231 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twjrf\" (UniqueName: \"kubernetes.io/projected/151f9e87-8756-4668-8ff7-d2417f6d4658-kube-api-access-twjrf\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp"
Apr 16 20:27:26.465781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.465266 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/10df2057-e291-4486-a0e1-89f26c4dbee9-tmp-dir\") pod \"node-resolver-ltdrj\" (UID: \"10df2057-e291-4486-a0e1-89f26c4dbee9\") " pod="openshift-dns/node-resolver-ltdrj"
Apr 16 20:27:26.465781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.465279 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tlddj"
Apr 16 20:27:26.465781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.465294 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-multus-conf-dir\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.465781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.465317 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/151f9e87-8756-4668-8ff7-d2417f6d4658-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp"
Apr 16 20:27:26.465781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.465340 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/42e24607-0fa8-4092-8d9d-a523e82990ff-konnectivity-ca\") pod \"konnectivity-agent-ks4qt\" (UID: \"42e24607-0fa8-4092-8d9d-a523e82990ff\") " pod="kube-system/konnectivity-agent-ks4qt"
Apr 16 20:27:26.465781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.465374 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4294b5bf-8386-4471-acff-325cd2af8bfb-tmp\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.465781 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.465397 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scrfb\" (UniqueName: \"kubernetes.io/projected/ae461ccf-29e2-4d21-9a23-605ab7aac065-kube-api-access-scrfb\") pod \"node-ca-l6t5h\" (UID: \"ae461ccf-29e2-4d21-9a23-605ab7aac065\") " pod="openshift-image-registry/node-ca-l6t5h"
Apr 16 20:27:26.466653 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.466140 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 20:27:26.466653 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.466318 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 20:27:26.466653 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.466377 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 20:27:26.466653 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.466403 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-q49nx\""
Apr 16 20:27:26.466653 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.466481 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 20:27:26.466870 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.466679 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 20:27:26.466870 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.466680 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 20:27:26.467338 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.467324 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:27:26.467809 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.467793 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 20:27:26.467901 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.467832 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 20:27:26.468111 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.468081 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-wqkxr\""
Apr 16 20:27:26.470292 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.470276 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 20:27:26.470365 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.470334 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal"
Apr 16 20:27:26.470471 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.470449 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-137-53.ec2.internal"]
Apr 16 20:27:26.473206 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.473153 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 20:27:26.480882 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.480854 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 20:22:25 +0000 UTC" deadline="2027-10-22 01:02:08.94506277 +0000 UTC"
Apr 16 20:27:26.480882 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.480881 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13276h34m42.464183845s"
Apr 16 20:27:26.485689 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.484033 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 20:27:26.486258 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.486239 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal"]
Apr 16 20:27:26.493767 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.493751 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jz897"
Apr 16 20:27:26.501802 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.501784 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jz897"
Apr 16 20:27:26.561406 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.561384 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 20:27:26.565639 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.565609 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-multus-cni-dir\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.565721 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.565645 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-cnibin\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.565721 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.565669 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-sys\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.565721 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.565692 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-lib-modules\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.565831 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.565721 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-ovn-node-metrics-cert\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25"
Apr 16 20:27:26.565831 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.565736 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-cnibin\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.565831 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.565748 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k42bp\" (UniqueName: \"kubernetes.io/projected/8ab3b913-84db-4ba1-b661-7f31e898d4c1-kube-api-access-k42bp\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.565831 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.565736 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-multus-cni-dir\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.565831 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.565773 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/151f9e87-8756-4668-8ff7-d2417f6d4658-os-release\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp"
Apr 16 20:27:26.566002 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.565830 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-lib-modules\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.566002 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.565830 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-sys\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.566002 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.565857 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/151f9e87-8756-4668-8ff7-d2417f6d4658-os-release\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp"
Apr 16 20:27:26.566002 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.565885 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-modprobe-d\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.566002 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.565911 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzn8t\" (UniqueName: \"kubernetes.io/projected/4294b5bf-8386-4471-acff-325cd2af8bfb-kube-api-access-nzn8t\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.566002 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.565938 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-run-netns\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25"
Apr 16 20:27:26.566002 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.565978 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-env-overrides\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25"
Apr 16 20:27:26.566268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566001 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hp8m\" (UniqueName: \"kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m\") pod \"network-check-target-p2hmj\" (UID: \"47eb8a0b-d7e2-4e86-a199-e21ebcaeb743\") " pod="openshift-network-diagnostics/network-check-target-p2hmj"
Apr 16 20:27:26.566268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566028 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ae461ccf-29e2-4d21-9a23-605ab7aac065-serviceca\") pod \"node-ca-l6t5h\" (UID: \"ae461ccf-29e2-4d21-9a23-605ab7aac065\") " pod="openshift-image-registry/node-ca-l6t5h"
Apr 16 20:27:26.566268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566029 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-modprobe-d\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.566268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566051 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/10df2057-e291-4486-a0e1-89f26c4dbee9-hosts-file\") pod \"node-resolver-ltdrj\" (UID: \"10df2057-e291-4486-a0e1-89f26c4dbee9\") " pod="openshift-dns/node-resolver-ltdrj"
Apr 16 20:27:26.566268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566077 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-run-multus-certs\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.566268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566100 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-sysconfig\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.566268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566103 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/10df2057-e291-4486-a0e1-89f26c4dbee9-hosts-file\") pod \"node-resolver-ltdrj\" (UID: \"10df2057-e291-4486-a0e1-89f26c4dbee9\") " pod="openshift-dns/node-resolver-ltdrj"
Apr 16 20:27:26.566268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566138 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc88c\" (UniqueName: \"kubernetes.io/projected/931ff401-6150-4e87-828a-2e3a9242e1bc-kube-api-access-fc88c\") pod \"network-metrics-daemon-tjgd4\" (UID: \"931ff401-6150-4e87-828a-2e3a9242e1bc\") " pod="openshift-multus/network-metrics-daemon-tjgd4"
Apr 16 20:27:26.566268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566153 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-run-multus-certs\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.566268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566164 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-ovnkube-config\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25"
Apr 16 20:27:26.566268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566190 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-multus-socket-dir-parent\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.566268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-sysconfig\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.566268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566215 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-var-lib-cni-bin\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.566268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566262 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-var-lib-cni-bin\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.566268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566250 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25"
Apr 16 20:27:26.566928 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566304 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-os-release\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.566928 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566316 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-multus-socket-dir-parent\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.566928 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566330 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-run-k8s-cni-cncf-io\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.566928 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566594 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ae461ccf-29e2-4d21-9a23-605ab7aac065-serviceca\") pod \"node-ca-l6t5h\" (UID: \"ae461ccf-29e2-4d21-9a23-605ab7aac065\") " pod="openshift-image-registry/node-ca-l6t5h"
Apr 16 20:27:26.567444 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566761 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-run-k8s-cni-cncf-io\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.567493 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567449 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/151f9e87-8756-4668-8ff7-d2417f6d4658-cni-binary-copy\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp"
Apr 16 20:27:26.567493 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566832 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-os-release\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.567493 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.566852 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/151f9e87-8756-4668-8ff7-d2417f6d4658-cni-binary-copy\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp"
Apr 16 20:27:26.567591 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567505 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-run\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.567591 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567537 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-node-log\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25"
Apr 16 20:27:26.567591 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567559 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-cni-bin\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25"
Apr 16 20:27:26.567591 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-run-netns\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.567721 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567559 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-run\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.567721 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567610 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/151f9e87-8756-4668-8ff7-d2417f6d4658-cnibin\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp"
Apr 16 20:27:26.567721 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567632 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-sysctl-d\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.567721 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567647 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hg7cx\" (UniqueName: \"kubernetes.io/projected/facc1f83-664e-47d0-a625-cb11e5043a9e-kube-api-access-hg7cx\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h"
Apr 16 20:27:26.567721 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567651 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-run-netns\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.567721 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567689 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ab3b913-84db-4ba1-b661-7f31e898d4c1-cni-binary-copy\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.567721 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567711 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/151f9e87-8756-4668-8ff7-d2417f6d4658-cnibin\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp"
Apr 16 20:27:26.567964 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567720 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/151f9e87-8756-4668-8ff7-d2417f6d4658-system-cni-dir\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp"
Apr 16 20:27:26.567964 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567737 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-sysctl-d\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.567964 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567753 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-sysctl-conf\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.567964 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567787 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-run-openvswitch\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25"
Apr 16 20:27:26.567964 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567812 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-log-socket\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25"
Apr 16 20:27:26.567964 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567836 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bmwj\" (UniqueName: \"kubernetes.io/projected/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-kube-api-access-6bmwj\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25"
Apr 16 20:27:26.567964 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567854 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-sysctl-conf\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.567964 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567864 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-registration-dir\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h"
Apr 16 20:27:26.567964 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567755 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/151f9e87-8756-4668-8ff7-d2417f6d4658-system-cni-dir\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp"
Apr 16 20:27:26.567964 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567888 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-etc-selinux\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h"
Apr 16 20:27:26.567964 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567911 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-sys-fs\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h"
Apr 16 20:27:26.567964 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567956 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-registration-dir\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h"
Apr 16 20:27:26.567964 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.567964 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-var-lib-kubelet\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568015 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-sys-fs\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568023 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-var-lib-kubelet\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568042 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-run-ovn\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568042 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-etc-selinux\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/151f9e87-8756-4668-8ff7-d2417f6d4658-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568128 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/151f9e87-8756-4668-8ff7-d2417f6d4658-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568209 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ab3b913-84db-4ba1-b661-7f31e898d4c1-cni-binary-copy\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568227 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/42e24607-0fa8-4092-8d9d-a523e82990ff-agent-certs\") pod \"konnectivity-agent-ks4qt\" (UID: \"42e24607-0fa8-4092-8d9d-a523e82990ff\") " pod="kube-system/konnectivity-agent-ks4qt"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568256 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-systemd\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568275 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-host\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568293 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-tuned\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568316 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae461ccf-29e2-4d21-9a23-605ab7aac065-host\") pod \"node-ca-l6t5h\" (UID: \"ae461ccf-29e2-4d21-9a23-605ab7aac065\") " pod="openshift-image-registry/node-ca-l6t5h"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568338 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-systemd\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568343 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-kubelet\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568369 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-socket-dir\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568392 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-hostroot\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568413 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-etc-kubernetes\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr"
Apr 16 20:27:26.568472 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568434 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName:
\"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-host\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.569176 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568439 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs\") pod \"network-metrics-daemon-tjgd4\" (UID: \"931ff401-6150-4e87-828a-2e3a9242e1bc\") " pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:26.569176 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568500 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-hostroot\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.569176 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568541 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-etc-kubernetes\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.569176 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568549 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-socket-dir\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" Apr 16 20:27:26.569176 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568588 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-var-lib-openvswitch\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.569176 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568599 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/151f9e87-8756-4668-8ff7-d2417f6d4658-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp" Apr 16 20:27:26.569176 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568621 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvh6p\" (UniqueName: \"kubernetes.io/projected/2f6150f6-900c-430c-8252-6f51cf51afc9-kube-api-access-vvh6p\") pod \"iptables-alerter-tlddj\" (UID: \"2f6150f6-900c-430c-8252-6f51cf51afc9\") " pod="openshift-network-operator/iptables-alerter-tlddj" Apr 16 20:27:26.569176 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568630 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 20:27:26.569176 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-system-cni-dir\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.569176 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568648 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/151f9e87-8756-4668-8ff7-d2417f6d4658-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp" Apr 16 20:27:26.569176 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568688 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2f6150f6-900c-430c-8252-6f51cf51afc9-iptables-alerter-script\") pod \"iptables-alerter-tlddj\" (UID: \"2f6150f6-900c-430c-8252-6f51cf51afc9\") " pod="openshift-network-operator/iptables-alerter-tlddj" Apr 16 20:27:26.569176 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568691 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-system-cni-dir\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.569176 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568711 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/2f6150f6-900c-430c-8252-6f51cf51afc9-host-slash\") pod \"iptables-alerter-tlddj\" (UID: \"2f6150f6-900c-430c-8252-6f51cf51afc9\") " pod="openshift-network-operator/iptables-alerter-tlddj" Apr 16 20:27:26.569176 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568737 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jr5pg\" (UniqueName: \"kubernetes.io/projected/10df2057-e291-4486-a0e1-89f26c4dbee9-kube-api-access-jr5pg\") pod \"node-resolver-ltdrj\" (UID: \"10df2057-e291-4486-a0e1-89f26c4dbee9\") " pod="openshift-dns/node-resolver-ltdrj" Apr 16 20:27:26.569176 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568761 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-var-lib-cni-multus\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.569176 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568604 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae461ccf-29e2-4d21-9a23-605ab7aac065-host\") pod \"node-ca-l6t5h\" (UID: \"ae461ccf-29e2-4d21-9a23-605ab7aac065\") " pod="openshift-image-registry/node-ca-l6t5h" Apr 16 20:27:26.569176 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568787 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-kubernetes\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.569969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568828 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-host-var-lib-cni-multus\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.569969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568810 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-var-lib-kubelet\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.569969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568868 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-device-dir\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" Apr 16 20:27:26.569969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568899 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-systemd-units\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.569969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568900 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-var-lib-kubelet\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.569969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568957 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"device-dir\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-device-dir\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" Apr 16 20:27:26.569969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.568986 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-ovnkube-script-lib\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.569969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569018 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8ab3b913-84db-4ba1-b661-7f31e898d4c1-multus-daemon-config\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.569969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569064 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twjrf\" (UniqueName: \"kubernetes.io/projected/151f9e87-8756-4668-8ff7-d2417f6d4658-kube-api-access-twjrf\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp" Apr 16 20:27:26.569969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569120 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-slash\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.569969 ip-10-0-137-53 kubenswrapper[2578]: 
I0416 20:27:26.569151 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-etc-openvswitch\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.569969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569187 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-run-ovn-kubernetes\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.569969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569220 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/10df2057-e291-4486-a0e1-89f26c4dbee9-tmp-dir\") pod \"node-resolver-ltdrj\" (UID: \"10df2057-e291-4486-a0e1-89f26c4dbee9\") " pod="openshift-dns/node-resolver-ltdrj" Apr 16 20:27:26.569969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569246 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-multus-conf-dir\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.569969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569278 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/151f9e87-8756-4668-8ff7-d2417f6d4658-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp" Apr 
16 20:27:26.569969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569308 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/42e24607-0fa8-4092-8d9d-a523e82990ff-konnectivity-ca\") pod \"konnectivity-agent-ks4qt\" (UID: \"42e24607-0fa8-4092-8d9d-a523e82990ff\") " pod="kube-system/konnectivity-agent-ks4qt" Apr 16 20:27:26.569969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569338 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4294b5bf-8386-4471-acff-325cd2af8bfb-tmp\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.570764 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569394 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scrfb\" (UniqueName: \"kubernetes.io/projected/ae461ccf-29e2-4d21-9a23-605ab7aac065-kube-api-access-scrfb\") pod \"node-ca-l6t5h\" (UID: \"ae461ccf-29e2-4d21-9a23-605ab7aac065\") " pod="openshift-image-registry/node-ca-l6t5h" Apr 16 20:27:26.570764 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569423 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-run-systemd\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.570764 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569447 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-cni-netd\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.570764 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569476 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" Apr 16 20:27:26.570764 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569473 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-kubernetes\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.570764 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569582 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/151f9e87-8756-4668-8ff7-d2417f6d4658-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp" Apr 16 20:27:26.570764 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569599 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8ab3b913-84db-4ba1-b661-7f31e898d4c1-multus-daemon-config\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.570764 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569847 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ab3b913-84db-4ba1-b661-7f31e898d4c1-multus-conf-dir\") pod \"multus-8v2qr\" (UID: 
\"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.570764 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.569866 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/10df2057-e291-4486-a0e1-89f26c4dbee9-tmp-dir\") pod \"node-resolver-ltdrj\" (UID: \"10df2057-e291-4486-a0e1-89f26c4dbee9\") " pod="openshift-dns/node-resolver-ltdrj" Apr 16 20:27:26.570764 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.570018 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/42e24607-0fa8-4092-8d9d-a523e82990ff-konnectivity-ca\") pod \"konnectivity-agent-ks4qt\" (UID: \"42e24607-0fa8-4092-8d9d-a523e82990ff\") " pod="kube-system/konnectivity-agent-ks4qt" Apr 16 20:27:26.570764 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.570103 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/facc1f83-664e-47d0-a625-cb11e5043a9e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" Apr 16 20:27:26.571915 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.571743 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4294b5bf-8386-4471-acff-325cd2af8bfb-etc-tuned\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.574641 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.572059 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/42e24607-0fa8-4092-8d9d-a523e82990ff-agent-certs\") pod \"konnectivity-agent-ks4qt\" (UID: \"42e24607-0fa8-4092-8d9d-a523e82990ff\") " 
pod="kube-system/konnectivity-agent-ks4qt" Apr 16 20:27:26.574641 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.573103 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4294b5bf-8386-4471-acff-325cd2af8bfb-tmp\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.575999 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.575962 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg7cx\" (UniqueName: \"kubernetes.io/projected/facc1f83-664e-47d0-a625-cb11e5043a9e-kube-api-access-hg7cx\") pod \"aws-ebs-csi-driver-node-zws8h\" (UID: \"facc1f83-664e-47d0-a625-cb11e5043a9e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" Apr 16 20:27:26.576206 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.576187 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k42bp\" (UniqueName: \"kubernetes.io/projected/8ab3b913-84db-4ba1-b661-7f31e898d4c1-kube-api-access-k42bp\") pod \"multus-8v2qr\" (UID: \"8ab3b913-84db-4ba1-b661-7f31e898d4c1\") " pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.576795 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.576765 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr5pg\" (UniqueName: \"kubernetes.io/projected/10df2057-e291-4486-a0e1-89f26c4dbee9-kube-api-access-jr5pg\") pod \"node-resolver-ltdrj\" (UID: \"10df2057-e291-4486-a0e1-89f26c4dbee9\") " pod="openshift-dns/node-resolver-ltdrj" Apr 16 20:27:26.577097 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.577074 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzn8t\" (UniqueName: \"kubernetes.io/projected/4294b5bf-8386-4471-acff-325cd2af8bfb-kube-api-access-nzn8t\") pod \"tuned-9lrtn\" (UID: \"4294b5bf-8386-4471-acff-325cd2af8bfb\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.577867 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.577843 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twjrf\" (UniqueName: \"kubernetes.io/projected/151f9e87-8756-4668-8ff7-d2417f6d4658-kube-api-access-twjrf\") pod \"multus-additional-cni-plugins-v5lrp\" (UID: \"151f9e87-8756-4668-8ff7-d2417f6d4658\") " pod="openshift-multus/multus-additional-cni-plugins-v5lrp" Apr 16 20:27:26.578243 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.578225 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scrfb\" (UniqueName: \"kubernetes.io/projected/ae461ccf-29e2-4d21-9a23-605ab7aac065-kube-api-access-scrfb\") pod \"node-ca-l6t5h\" (UID: \"ae461ccf-29e2-4d21-9a23-605ab7aac065\") " pod="openshift-image-registry/node-ca-l6t5h" Apr 16 20:27:26.606599 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.606583 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v5lrp" Apr 16 20:27:26.659198 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:26.659173 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbf50ce40dd2f1fe9119f12e04feaaf5.slice/crio-3eeea82ecb2efac8a721dcc6959ca04aa1f6decc580d64c665c9a7b605630d62 WatchSource:0}: Error finding container 3eeea82ecb2efac8a721dcc6959ca04aa1f6decc580d64c665c9a7b605630d62: Status 404 returned error can't find the container with id 3eeea82ecb2efac8a721dcc6959ca04aa1f6decc580d64c665c9a7b605630d62 Apr 16 20:27:26.659654 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:26.659640 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d2a2667ea6d6ce4fbc4ca25ae7466fd.slice/crio-3e7b193dd208711f87babc8f11e8d917cfb18fc243070ff5098b6da0013ac1a6 WatchSource:0}: Error finding container 3e7b193dd208711f87babc8f11e8d917cfb18fc243070ff5098b6da0013ac1a6: Status 404 returned error can't find the container with id 3e7b193dd208711f87babc8f11e8d917cfb18fc243070ff5098b6da0013ac1a6 Apr 16 20:27:26.663933 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.663920 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:27:26.670387 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670358 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs\") pod \"network-metrics-daemon-tjgd4\" (UID: \"931ff401-6150-4e87-828a-2e3a9242e1bc\") " pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:26.670449 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670397 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-var-lib-openvswitch\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.670449 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670419 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vvh6p\" (UniqueName: \"kubernetes.io/projected/2f6150f6-900c-430c-8252-6f51cf51afc9-kube-api-access-vvh6p\") pod \"iptables-alerter-tlddj\" (UID: \"2f6150f6-900c-430c-8252-6f51cf51afc9\") " pod="openshift-network-operator/iptables-alerter-tlddj" Apr 16 20:27:26.670449 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2f6150f6-900c-430c-8252-6f51cf51afc9-iptables-alerter-script\") pod \"iptables-alerter-tlddj\" (UID: \"2f6150f6-900c-430c-8252-6f51cf51afc9\") " pod="openshift-network-operator/iptables-alerter-tlddj" Apr 16 20:27:26.670558 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670463 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-var-lib-openvswitch\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.670558 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670467 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f6150f6-900c-430c-8252-6f51cf51afc9-host-slash\") pod \"iptables-alerter-tlddj\" (UID: \"2f6150f6-900c-430c-8252-6f51cf51afc9\") " pod="openshift-network-operator/iptables-alerter-tlddj" Apr 16 20:27:26.670558 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:26.670499 2578 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:26.670558 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670507 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-systemd-units\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.670558 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-ovnkube-script-lib\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.670785 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:26.670563 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs podName:931ff401-6150-4e87-828a-2e3a9242e1bc nodeName:}" failed. No retries permitted until 2026-04-16 20:27:27.170534335 +0000 UTC m=+2.142119305 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs") pod "network-metrics-daemon-tjgd4" (UID: "931ff401-6150-4e87-828a-2e3a9242e1bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:26.670785 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670612 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-slash\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.670785 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670626 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-systemd-units\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.670785 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670634 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f6150f6-900c-430c-8252-6f51cf51afc9-host-slash\") pod \"iptables-alerter-tlddj\" (UID: \"2f6150f6-900c-430c-8252-6f51cf51afc9\") " pod="openshift-network-operator/iptables-alerter-tlddj" Apr 16 20:27:26.670785 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670636 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-etc-openvswitch\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.670785 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670671 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-etc-openvswitch\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.670785 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670692 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-slash\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.670785 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670691 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-run-ovn-kubernetes\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.670785 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670728 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-run-ovn-kubernetes\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.670785 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670736 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-run-systemd\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.670785 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670761 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-cni-netd\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.670785 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670768 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-run-systemd\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670791 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-ovn-node-metrics-cert\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670851 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-cni-netd\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-run-netns\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670891 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-env-overrides\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670899 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-run-netns\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670924 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hp8m\" (UniqueName: \"kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m\") pod \"network-check-target-p2hmj\" (UID: \"47eb8a0b-d7e2-4e86-a199-e21ebcaeb743\") " pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:26.671330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670961 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fc88c\" (UniqueName: \"kubernetes.io/projected/931ff401-6150-4e87-828a-2e3a9242e1bc-kube-api-access-fc88c\") pod \"network-metrics-daemon-tjgd4\" (UID: \"931ff401-6150-4e87-828a-2e3a9242e1bc\") " pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:26.671330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670977 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-ovnkube-config\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671330 
ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670994 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.670999 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2f6150f6-900c-430c-8252-6f51cf51afc9-iptables-alerter-script\") pod \"iptables-alerter-tlddj\" (UID: \"2f6150f6-900c-430c-8252-6f51cf51afc9\") " pod="openshift-network-operator/iptables-alerter-tlddj" Apr 16 20:27:26.671330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.671032 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-node-log\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.671056 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-cni-bin\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.671098 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-run-openvswitch\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.671131 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-ovnkube-script-lib\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.671149 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-log-socket\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.671199 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bmwj\" (UniqueName: \"kubernetes.io/projected/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-kube-api-access-6bmwj\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.671210 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-run-openvswitch\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671805 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.671227 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-run-ovn\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671805 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.671238 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-node-log\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671805 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.671259 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-log-socket\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671805 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.671264 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-env-overrides\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671805 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.671274 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-run-ovn\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671805 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.671336 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-kubelet\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 
20:27:26.671805 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.671370 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671805 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.671404 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-kubelet\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671805 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.671419 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-host-cni-bin\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.671805 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.671661 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-ovnkube-config\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.672896 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.672879 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-ovn-node-metrics-cert\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.687733 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:26.687711 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:27:26.687733 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:26.687732 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:27:26.687841 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:26.687745 2578 projected.go:194] Error preparing data for projected volume kube-api-access-7hp8m for pod openshift-network-diagnostics/network-check-target-p2hmj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:26.687841 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:26.687815 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m podName:47eb8a0b-d7e2-4e86-a199-e21ebcaeb743 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:27.187799802 +0000 UTC m=+2.159384773 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7hp8m" (UniqueName: "kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m") pod "network-check-target-p2hmj" (UID: "47eb8a0b-d7e2-4e86-a199-e21ebcaeb743") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:26.689671 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.689649 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvh6p\" (UniqueName: \"kubernetes.io/projected/2f6150f6-900c-430c-8252-6f51cf51afc9-kube-api-access-vvh6p\") pod \"iptables-alerter-tlddj\" (UID: \"2f6150f6-900c-430c-8252-6f51cf51afc9\") " pod="openshift-network-operator/iptables-alerter-tlddj" Apr 16 20:27:26.690920 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.690870 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc88c\" (UniqueName: \"kubernetes.io/projected/931ff401-6150-4e87-828a-2e3a9242e1bc-kube-api-access-fc88c\") pod \"network-metrics-daemon-tjgd4\" (UID: \"931ff401-6150-4e87-828a-2e3a9242e1bc\") " pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:26.691010 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.690992 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bmwj\" (UniqueName: \"kubernetes.io/projected/e4de5fe5-38a8-4fca-b8aa-9b340811b2f5-kube-api-access-6bmwj\") pod \"ovnkube-node-wsr25\" (UID: \"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.777838 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.777817 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:27:26.790039 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.790020 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-ks4qt" Apr 16 20:27:26.795049 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:26.795023 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42e24607_0fa8_4092_8d9d_a523e82990ff.slice/crio-6e935b5f311a2a417f09dec27929f0f9f91e7b0feda45c87be93989b1574ada6 WatchSource:0}: Error finding container 6e935b5f311a2a417f09dec27929f0f9f91e7b0feda45c87be93989b1574ada6: Status 404 returned error can't find the container with id 6e935b5f311a2a417f09dec27929f0f9f91e7b0feda45c87be93989b1574ada6 Apr 16 20:27:26.803734 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.803718 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" Apr 16 20:27:26.809040 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:26.809020 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4294b5bf_8386_4471_acff_325cd2af8bfb.slice/crio-3b63b40b7115f0d6a36bcd430bbd8fb6a29713ed8bb87f11e3e74084315cd1a6 WatchSource:0}: Error finding container 3b63b40b7115f0d6a36bcd430bbd8fb6a29713ed8bb87f11e3e74084315cd1a6: Status 404 returned error can't find the container with id 3b63b40b7115f0d6a36bcd430bbd8fb6a29713ed8bb87f11e3e74084315cd1a6 Apr 16 20:27:26.827023 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.827002 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" Apr 16 20:27:26.832283 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:26.832260 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfacc1f83_664e_47d0_a625_cb11e5043a9e.slice/crio-60a418e1988e23f9c6139a823602afd6f4a7005e56b974551f2942d11b1d516b WatchSource:0}: Error finding container 60a418e1988e23f9c6139a823602afd6f4a7005e56b974551f2942d11b1d516b: Status 404 returned error can't find the container with id 60a418e1988e23f9c6139a823602afd6f4a7005e56b974551f2942d11b1d516b Apr 16 20:27:26.843122 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.843106 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ltdrj" Apr 16 20:27:26.848864 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:26.848842 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10df2057_e291_4486_a0e1_89f26c4dbee9.slice/crio-a9e736027d4e73972ad44b29bf3cc7c56e31f688decb8d87eac272518a75ac56 WatchSource:0}: Error finding container a9e736027d4e73972ad44b29bf3cc7c56e31f688decb8d87eac272518a75ac56: Status 404 returned error can't find the container with id a9e736027d4e73972ad44b29bf3cc7c56e31f688decb8d87eac272518a75ac56 Apr 16 20:27:26.861402 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.861380 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l6t5h" Apr 16 20:27:26.867963 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.867923 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8v2qr" Apr 16 20:27:26.868440 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:26.868388 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae461ccf_29e2_4d21_9a23_605ab7aac065.slice/crio-8e9b1011ee82473707d057bf2fdf6e1e66184a0ccc7f8bb1fc689b994e466e9b WatchSource:0}: Error finding container 8e9b1011ee82473707d057bf2fdf6e1e66184a0ccc7f8bb1fc689b994e466e9b: Status 404 returned error can't find the container with id 8e9b1011ee82473707d057bf2fdf6e1e66184a0ccc7f8bb1fc689b994e466e9b Apr 16 20:27:26.873405 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:26.873387 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ab3b913_84db_4ba1_b661_7f31e898d4c1.slice/crio-88fe09acf67def33f5ef62846d7816620ae3d75eefb0ca4327bb51b0d770ffae WatchSource:0}: Error finding container 88fe09acf67def33f5ef62846d7816620ae3d75eefb0ca4327bb51b0d770ffae: Status 404 returned error can't find the container with id 88fe09acf67def33f5ef62846d7816620ae3d75eefb0ca4327bb51b0d770ffae Apr 16 20:27:26.927066 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.927044 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:26.932491 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:26.932474 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-tlddj" Apr 16 20:27:26.933043 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:26.933028 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4de5fe5_38a8_4fca_b8aa_9b340811b2f5.slice/crio-a79e95423bbcf46e712f92e8e7ba733350184ce472cd50a8bcc17acb3df74839 WatchSource:0}: Error finding container a79e95423bbcf46e712f92e8e7ba733350184ce472cd50a8bcc17acb3df74839: Status 404 returned error can't find the container with id a79e95423bbcf46e712f92e8e7ba733350184ce472cd50a8bcc17acb3df74839 Apr 16 20:27:26.938300 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:26.938279 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f6150f6_900c_430c_8252_6f51cf51afc9.slice/crio-ab23507dc35c8961f928301790bf6492646bbe782abb419841e833605a572eef WatchSource:0}: Error finding container ab23507dc35c8961f928301790bf6492646bbe782abb419841e833605a572eef: Status 404 returned error can't find the container with id ab23507dc35c8961f928301790bf6492646bbe782abb419841e833605a572eef Apr 16 20:27:27.013262 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:27.013236 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod151f9e87_8756_4668_8ff7_d2417f6d4658.slice/crio-e5769dae85b2ae079ddd7d3103875cb71a7b04631e2ff3cd123848098444c4e5 WatchSource:0}: Error finding container e5769dae85b2ae079ddd7d3103875cb71a7b04631e2ff3cd123848098444c4e5: Status 404 returned error can't find the container with id e5769dae85b2ae079ddd7d3103875cb71a7b04631e2ff3cd123848098444c4e5 Apr 16 20:27:27.174412 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:27.174384 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs\") pod \"network-metrics-daemon-tjgd4\" (UID: \"931ff401-6150-4e87-828a-2e3a9242e1bc\") " pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:27.174529 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:27.174514 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:27.174640 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:27.174571 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs podName:931ff401-6150-4e87-828a-2e3a9242e1bc nodeName:}" failed. No retries permitted until 2026-04-16 20:27:28.174553714 +0000 UTC m=+3.146138696 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs") pod "network-metrics-daemon-tjgd4" (UID: "931ff401-6150-4e87-828a-2e3a9242e1bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:27.275604 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:27.275538 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hp8m\" (UniqueName: \"kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m\") pod \"network-check-target-p2hmj\" (UID: \"47eb8a0b-d7e2-4e86-a199-e21ebcaeb743\") " pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:27.275729 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:27.275671 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:27:27.275729 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:27.275687 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:27:27.275729 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:27.275698 2578 projected.go:194] Error preparing data for projected volume kube-api-access-7hp8m for pod openshift-network-diagnostics/network-check-target-p2hmj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:27.275889 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:27.275750 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m podName:47eb8a0b-d7e2-4e86-a199-e21ebcaeb743 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:28.275732094 +0000 UTC m=+3.247317076 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7hp8m" (UniqueName: "kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m") pod "network-check-target-p2hmj" (UID: "47eb8a0b-d7e2-4e86-a199-e21ebcaeb743") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:27.309798 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:27.309594 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:27:27.504213 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:27.504169 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:22:26 +0000 UTC" deadline="2027-10-04 15:24:00.262533214 +0000 UTC" Apr 16 20:27:27.504978 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:27.504219 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12858h56m32.758319311s" Apr 16 
20:27:27.603755 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:27.603683 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:27.603905 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:27.603808 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:27:27.631645 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:27.631579 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-53.ec2.internal" event={"ID":"dbf50ce40dd2f1fe9119f12e04feaaf5","Type":"ContainerStarted","Data":"3eeea82ecb2efac8a721dcc6959ca04aa1f6decc580d64c665c9a7b605630d62"} Apr 16 20:27:27.637192 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:27.637165 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5lrp" event={"ID":"151f9e87-8756-4668-8ff7-d2417f6d4658","Type":"ContainerStarted","Data":"e5769dae85b2ae079ddd7d3103875cb71a7b04631e2ff3cd123848098444c4e5"} Apr 16 20:27:27.649782 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:27.649750 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" event={"ID":"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5","Type":"ContainerStarted","Data":"a79e95423bbcf46e712f92e8e7ba733350184ce472cd50a8bcc17acb3df74839"} Apr 16 20:27:27.655333 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:27.654769 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8v2qr" 
event={"ID":"8ab3b913-84db-4ba1-b661-7f31e898d4c1","Type":"ContainerStarted","Data":"88fe09acf67def33f5ef62846d7816620ae3d75eefb0ca4327bb51b0d770ffae"} Apr 16 20:27:27.660449 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:27.660424 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l6t5h" event={"ID":"ae461ccf-29e2-4d21-9a23-605ab7aac065","Type":"ContainerStarted","Data":"8e9b1011ee82473707d057bf2fdf6e1e66184a0ccc7f8bb1fc689b994e466e9b"} Apr 16 20:27:27.664089 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:27.664065 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ltdrj" event={"ID":"10df2057-e291-4486-a0e1-89f26c4dbee9","Type":"ContainerStarted","Data":"a9e736027d4e73972ad44b29bf3cc7c56e31f688decb8d87eac272518a75ac56"} Apr 16 20:27:27.667879 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:27.667857 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" event={"ID":"facc1f83-664e-47d0-a625-cb11e5043a9e","Type":"ContainerStarted","Data":"60a418e1988e23f9c6139a823602afd6f4a7005e56b974551f2942d11b1d516b"} Apr 16 20:27:27.674204 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:27.674162 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tlddj" event={"ID":"2f6150f6-900c-430c-8252-6f51cf51afc9","Type":"ContainerStarted","Data":"ab23507dc35c8961f928301790bf6492646bbe782abb419841e833605a572eef"} Apr 16 20:27:27.679227 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:27.679200 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" event={"ID":"4294b5bf-8386-4471-acff-325cd2af8bfb","Type":"ContainerStarted","Data":"3b63b40b7115f0d6a36bcd430bbd8fb6a29713ed8bb87f11e3e74084315cd1a6"} Apr 16 20:27:27.691128 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:27.691105 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/konnectivity-agent-ks4qt" event={"ID":"42e24607-0fa8-4092-8d9d-a523e82990ff","Type":"ContainerStarted","Data":"6e935b5f311a2a417f09dec27929f0f9f91e7b0feda45c87be93989b1574ada6"} Apr 16 20:27:27.705591 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:27.705568 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal" event={"ID":"3d2a2667ea6d6ce4fbc4ca25ae7466fd","Type":"ContainerStarted","Data":"3e7b193dd208711f87babc8f11e8d917cfb18fc243070ff5098b6da0013ac1a6"} Apr 16 20:27:28.182516 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:28.182480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs\") pod \"network-metrics-daemon-tjgd4\" (UID: \"931ff401-6150-4e87-828a-2e3a9242e1bc\") " pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:28.182735 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:28.182648 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:28.182735 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:28.182716 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs podName:931ff401-6150-4e87-828a-2e3a9242e1bc nodeName:}" failed. No retries permitted until 2026-04-16 20:27:30.182696075 +0000 UTC m=+5.154281078 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs") pod "network-metrics-daemon-tjgd4" (UID: "931ff401-6150-4e87-828a-2e3a9242e1bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:28.283478 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:28.283426 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hp8m\" (UniqueName: \"kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m\") pod \"network-check-target-p2hmj\" (UID: \"47eb8a0b-d7e2-4e86-a199-e21ebcaeb743\") " pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:28.283636 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:28.283586 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:27:28.283636 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:28.283607 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:27:28.283636 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:28.283619 2578 projected.go:194] Error preparing data for projected volume kube-api-access-7hp8m for pod openshift-network-diagnostics/network-check-target-p2hmj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:28.283807 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:28.283677 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m podName:47eb8a0b-d7e2-4e86-a199-e21ebcaeb743 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:27:30.283658724 +0000 UTC m=+5.255243696 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7hp8m" (UniqueName: "kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m") pod "network-check-target-p2hmj" (UID: "47eb8a0b-d7e2-4e86-a199-e21ebcaeb743") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:28.506293 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:28.506174 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:22:26 +0000 UTC" deadline="2027-10-15 16:24:54.30711341 +0000 UTC" Apr 16 20:27:28.506293 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:28.506213 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13123h57m25.800904476s" Apr 16 20:27:28.601615 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:28.601526 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:28.601826 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:28.601647 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p2hmj" podUID="47eb8a0b-d7e2-4e86-a199-e21ebcaeb743" Apr 16 20:27:29.604248 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:29.603734 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:29.604248 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:29.603871 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:27:30.199921 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:30.199887 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs\") pod \"network-metrics-daemon-tjgd4\" (UID: \"931ff401-6150-4e87-828a-2e3a9242e1bc\") " pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:30.200122 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:30.200059 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:30.200183 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:30.200129 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs podName:931ff401-6150-4e87-828a-2e3a9242e1bc nodeName:}" failed. No retries permitted until 2026-04-16 20:27:34.200108617 +0000 UTC m=+9.171693591 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs") pod "network-metrics-daemon-tjgd4" (UID: "931ff401-6150-4e87-828a-2e3a9242e1bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:30.300775 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:30.300743 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hp8m\" (UniqueName: \"kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m\") pod \"network-check-target-p2hmj\" (UID: \"47eb8a0b-d7e2-4e86-a199-e21ebcaeb743\") " pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:30.300982 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:30.300960 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:27:30.301074 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:30.300990 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:27:30.301074 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:30.301003 2578 projected.go:194] Error preparing data for projected volume kube-api-access-7hp8m for pod openshift-network-diagnostics/network-check-target-p2hmj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:30.301074 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:30.301068 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m podName:47eb8a0b-d7e2-4e86-a199-e21ebcaeb743 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:27:34.30104921 +0000 UTC m=+9.272634183 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7hp8m" (UniqueName: "kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m") pod "network-check-target-p2hmj" (UID: "47eb8a0b-d7e2-4e86-a199-e21ebcaeb743") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:30.601421 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:30.601336 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:30.601577 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:30.601482 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p2hmj" podUID="47eb8a0b-d7e2-4e86-a199-e21ebcaeb743" Apr 16 20:27:31.601282 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:31.601197 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:31.601741 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:31.601377 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:27:32.601722 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:32.601683 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:32.602266 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:32.601813 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p2hmj" podUID="47eb8a0b-d7e2-4e86-a199-e21ebcaeb743" Apr 16 20:27:33.478678 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:33.478475 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-25wtt"] Apr 16 20:27:33.481530 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:33.481508 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:33.481677 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:33.481588 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-25wtt" podUID="469d4528-df9f-44e0-a586-1d85f9f8a444" Apr 16 20:27:33.524616 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:33.524584 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret\") pod \"global-pull-secret-syncer-25wtt\" (UID: \"469d4528-df9f-44e0-a586-1d85f9f8a444\") " pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:33.524788 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:33.524659 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/469d4528-df9f-44e0-a586-1d85f9f8a444-kubelet-config\") pod \"global-pull-secret-syncer-25wtt\" (UID: \"469d4528-df9f-44e0-a586-1d85f9f8a444\") " pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:33.524788 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:33.524714 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/469d4528-df9f-44e0-a586-1d85f9f8a444-dbus\") pod \"global-pull-secret-syncer-25wtt\" (UID: \"469d4528-df9f-44e0-a586-1d85f9f8a444\") " pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:33.601008 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:33.600975 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:33.601186 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:33.601120 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:27:33.625833 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:33.625795 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/469d4528-df9f-44e0-a586-1d85f9f8a444-kubelet-config\") pod \"global-pull-secret-syncer-25wtt\" (UID: \"469d4528-df9f-44e0-a586-1d85f9f8a444\") " pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:33.626289 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:33.625843 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/469d4528-df9f-44e0-a586-1d85f9f8a444-dbus\") pod \"global-pull-secret-syncer-25wtt\" (UID: \"469d4528-df9f-44e0-a586-1d85f9f8a444\") " pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:33.626289 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:33.625901 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret\") pod \"global-pull-secret-syncer-25wtt\" (UID: \"469d4528-df9f-44e0-a586-1d85f9f8a444\") " pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:33.626289 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:33.626027 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:27:33.626289 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:33.626087 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret podName:469d4528-df9f-44e0-a586-1d85f9f8a444 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:34.126068724 +0000 UTC m=+9.097653697 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret") pod "global-pull-secret-syncer-25wtt" (UID: "469d4528-df9f-44e0-a586-1d85f9f8a444") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:27:33.626526 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:33.626324 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/469d4528-df9f-44e0-a586-1d85f9f8a444-kubelet-config\") pod \"global-pull-secret-syncer-25wtt\" (UID: \"469d4528-df9f-44e0-a586-1d85f9f8a444\") " pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:33.626526 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:33.626470 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/469d4528-df9f-44e0-a586-1d85f9f8a444-dbus\") pod \"global-pull-secret-syncer-25wtt\" (UID: \"469d4528-df9f-44e0-a586-1d85f9f8a444\") " pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:34.128937 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:34.128880 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret\") pod \"global-pull-secret-syncer-25wtt\" (UID: \"469d4528-df9f-44e0-a586-1d85f9f8a444\") " pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:34.129124 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:34.129079 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:27:34.129188 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:34.129152 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret 
podName:469d4528-df9f-44e0-a586-1d85f9f8a444 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:35.129131067 +0000 UTC m=+10.100716038 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret") pod "global-pull-secret-syncer-25wtt" (UID: "469d4528-df9f-44e0-a586-1d85f9f8a444") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:27:34.230084 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:34.230048 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs\") pod \"network-metrics-daemon-tjgd4\" (UID: \"931ff401-6150-4e87-828a-2e3a9242e1bc\") " pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:34.230249 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:34.230231 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:34.230322 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:34.230293 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs podName:931ff401-6150-4e87-828a-2e3a9242e1bc nodeName:}" failed. No retries permitted until 2026-04-16 20:27:42.230278038 +0000 UTC m=+17.201863026 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs") pod "network-metrics-daemon-tjgd4" (UID: "931ff401-6150-4e87-828a-2e3a9242e1bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:34.331279 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:34.331249 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hp8m\" (UniqueName: \"kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m\") pod \"network-check-target-p2hmj\" (UID: \"47eb8a0b-d7e2-4e86-a199-e21ebcaeb743\") " pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:34.331459 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:34.331386 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:27:34.331459 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:34.331404 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:27:34.331459 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:34.331414 2578 projected.go:194] Error preparing data for projected volume kube-api-access-7hp8m for pod openshift-network-diagnostics/network-check-target-p2hmj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:34.331614 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:34.331473 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m podName:47eb8a0b-d7e2-4e86-a199-e21ebcaeb743 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:27:42.331454658 +0000 UTC m=+17.303039646 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7hp8m" (UniqueName: "kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m") pod "network-check-target-p2hmj" (UID: "47eb8a0b-d7e2-4e86-a199-e21ebcaeb743") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:34.600969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:34.600873 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:34.600969 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:34.600885 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:34.601163 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:34.600999 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-25wtt" podUID="469d4528-df9f-44e0-a586-1d85f9f8a444" Apr 16 20:27:34.601163 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:34.601066 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p2hmj" podUID="47eb8a0b-d7e2-4e86-a199-e21ebcaeb743" Apr 16 20:27:35.137937 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:35.137903 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret\") pod \"global-pull-secret-syncer-25wtt\" (UID: \"469d4528-df9f-44e0-a586-1d85f9f8a444\") " pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:35.138336 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:35.138067 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:27:35.138336 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:35.138135 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret podName:469d4528-df9f-44e0-a586-1d85f9f8a444 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:37.138117346 +0000 UTC m=+12.109702330 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret") pod "global-pull-secret-syncer-25wtt" (UID: "469d4528-df9f-44e0-a586-1d85f9f8a444") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:27:35.604497 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:35.604446 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:35.604683 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:35.604628 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:27:36.601823 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:36.601675 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:36.601823 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:36.601805 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-25wtt" podUID="469d4528-df9f-44e0-a586-1d85f9f8a444" Apr 16 20:27:36.602270 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:36.601855 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:36.602270 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:36.601973 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p2hmj" podUID="47eb8a0b-d7e2-4e86-a199-e21ebcaeb743" Apr 16 20:27:36.730079 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:36.729796 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" event={"ID":"4294b5bf-8386-4471-acff-325cd2af8bfb","Type":"ContainerStarted","Data":"336f192ff9c86257011c3351b4da68c5491b18c3939d2bf7f3b1385846c66f07"} Apr 16 20:27:36.732877 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:36.732848 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-53.ec2.internal" event={"ID":"dbf50ce40dd2f1fe9119f12e04feaaf5","Type":"ContainerStarted","Data":"393d8dda088afaab9f48ffcf53fb6d4c8aeaac5660792aadbfccadeb59683f30"} Apr 16 20:27:36.744717 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:36.744550 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9lrtn" podStartSLOduration=1.977431655 podStartE2EDuration="11.744532033s" podCreationTimestamp="2026-04-16 20:27:25 +0000 UTC" firstStartedPulling="2026-04-16 20:27:26.811020694 +0000 UTC m=+1.782605662" lastFinishedPulling="2026-04-16 20:27:36.578121059 +0000 UTC m=+11.549706040" observedRunningTime="2026-04-16 20:27:36.743854271 +0000 UTC m=+11.715439263" watchObservedRunningTime="2026-04-16 20:27:36.744532033 +0000 UTC m=+11.716117026" Apr 16 20:27:36.755494 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:36.755212 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-53.ec2.internal" podStartSLOduration=10.755195574 podStartE2EDuration="10.755195574s" podCreationTimestamp="2026-04-16 20:27:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:27:36.754733149 +0000 UTC m=+11.726318141" 
watchObservedRunningTime="2026-04-16 20:27:36.755195574 +0000 UTC m=+11.726780564" Apr 16 20:27:37.153187 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:37.153143 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret\") pod \"global-pull-secret-syncer-25wtt\" (UID: \"469d4528-df9f-44e0-a586-1d85f9f8a444\") " pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:37.153370 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:37.153355 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:27:37.153436 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:37.153418 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret podName:469d4528-df9f-44e0-a586-1d85f9f8a444 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:41.153400881 +0000 UTC m=+16.124985872 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret") pod "global-pull-secret-syncer-25wtt" (UID: "469d4528-df9f-44e0-a586-1d85f9f8a444") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:27:37.604213 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:37.604185 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:37.604632 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:37.604313 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:27:37.736458 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:37.736406 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l6t5h" event={"ID":"ae461ccf-29e2-4d21-9a23-605ab7aac065","Type":"ContainerStarted","Data":"d9711e4ff2294785bb92bb3ecd2ed0b05266c03df35d336cc51f570ff8229f57"} Apr 16 20:27:37.737930 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:37.737902 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ltdrj" event={"ID":"10df2057-e291-4486-a0e1-89f26c4dbee9","Type":"ContainerStarted","Data":"f86b36e3f3a9e1990d304af34b649f5c05c40d1dd256bcd1b65a5dc5468bfd21"} Apr 16 20:27:37.739853 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:37.739825 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" event={"ID":"facc1f83-664e-47d0-a625-cb11e5043a9e","Type":"ContainerStarted","Data":"68fe222f72d78731243bd11ef5510fe0ee6d275ec9cc830dc76e63bc1b0103fa"} Apr 16 20:27:37.741430 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:37.741407 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ks4qt" event={"ID":"42e24607-0fa8-4092-8d9d-a523e82990ff","Type":"ContainerStarted","Data":"d22cd3cc143c974d9157824b31179ff4fcc750cfb7b92e1d03b58f3969380637"} Apr 16 20:27:37.743048 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:37.743018 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal" event={"ID":"3d2a2667ea6d6ce4fbc4ca25ae7466fd","Type":"ContainerStarted","Data":"fc7f1a1084a989afd9973a9f0339fcc5c3f6e1744da156208fdc58f29d93ea07"} Apr 16 20:27:37.744863 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:37.744841 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-v5lrp" event={"ID":"151f9e87-8756-4668-8ff7-d2417f6d4658","Type":"ContainerStarted","Data":"00d473588b606ecaa4afe4272a22f5ef41b5bfa9dbcf436c8e7836c1ba272aca"} Apr 16 20:27:37.748575 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:37.748530 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-l6t5h" podStartSLOduration=3.047831564 podStartE2EDuration="12.74851733s" podCreationTimestamp="2026-04-16 20:27:25 +0000 UTC" firstStartedPulling="2026-04-16 20:27:26.870701337 +0000 UTC m=+1.842286305" lastFinishedPulling="2026-04-16 20:27:36.571387101 +0000 UTC m=+11.542972071" observedRunningTime="2026-04-16 20:27:37.748304934 +0000 UTC m=+12.719889934" watchObservedRunningTime="2026-04-16 20:27:37.74851733 +0000 UTC m=+12.720102324" Apr 16 20:27:37.786316 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:37.786252 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-ks4qt" podStartSLOduration=3.01233976 podStartE2EDuration="12.786236379s" podCreationTimestamp="2026-04-16 20:27:25 +0000 UTC" firstStartedPulling="2026-04-16 20:27:26.796453499 +0000 UTC m=+1.768038467" lastFinishedPulling="2026-04-16 20:27:36.570350113 +0000 UTC m=+11.541935086" observedRunningTime="2026-04-16 20:27:37.785618554 +0000 UTC m=+12.757203544" watchObservedRunningTime="2026-04-16 20:27:37.786236379 +0000 UTC m=+12.757821370" Apr 16 20:27:37.797588 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:37.797542 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ltdrj" podStartSLOduration=3.07783981 podStartE2EDuration="12.797527962s" podCreationTimestamp="2026-04-16 20:27:25 +0000 UTC" firstStartedPulling="2026-04-16 20:27:26.85018321 +0000 UTC m=+1.821768179" lastFinishedPulling="2026-04-16 20:27:36.569871346 +0000 UTC m=+11.541456331" observedRunningTime="2026-04-16 20:27:37.797480474 
+0000 UTC m=+12.769065479" watchObservedRunningTime="2026-04-16 20:27:37.797527962 +0000 UTC m=+12.769112952" Apr 16 20:27:38.601961 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:38.601843 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:38.601961 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:38.601846 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:38.602381 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:38.601990 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-25wtt" podUID="469d4528-df9f-44e0-a586-1d85f9f8a444" Apr 16 20:27:38.602381 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:38.602053 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p2hmj" podUID="47eb8a0b-d7e2-4e86-a199-e21ebcaeb743" Apr 16 20:27:38.747662 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:38.747610 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tlddj" event={"ID":"2f6150f6-900c-430c-8252-6f51cf51afc9","Type":"ContainerStarted","Data":"d850fecede759f5d2f5c9526e978667f9fd85d72a2eebbb606ce498f2b6a794c"} Apr 16 20:27:39.601812 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:39.601776 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:39.601988 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:39.601901 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:27:40.601094 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:40.601058 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:40.601695 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:40.601107 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:40.601695 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:40.601193 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-25wtt" podUID="469d4528-df9f-44e0-a586-1d85f9f8a444" Apr 16 20:27:40.601695 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:40.601281 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p2hmj" podUID="47eb8a0b-d7e2-4e86-a199-e21ebcaeb743" Apr 16 20:27:41.184558 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:41.184508 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret\") pod \"global-pull-secret-syncer-25wtt\" (UID: \"469d4528-df9f-44e0-a586-1d85f9f8a444\") " pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:41.184727 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:41.184665 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:27:41.184727 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:41.184723 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret podName:469d4528-df9f-44e0-a586-1d85f9f8a444 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:49.184705318 +0000 UTC m=+24.156290287 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret") pod "global-pull-secret-syncer-25wtt" (UID: "469d4528-df9f-44e0-a586-1d85f9f8a444") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:27:41.601058 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:41.601026 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:41.601213 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:41.601135 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:27:41.752271 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:41.752238 2578 generic.go:358] "Generic (PLEG): container finished" podID="3d2a2667ea6d6ce4fbc4ca25ae7466fd" containerID="fc7f1a1084a989afd9973a9f0339fcc5c3f6e1744da156208fdc58f29d93ea07" exitCode=0 Apr 16 20:27:41.752427 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:41.752306 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal" event={"ID":"3d2a2667ea6d6ce4fbc4ca25ae7466fd","Type":"ContainerDied","Data":"fc7f1a1084a989afd9973a9f0339fcc5c3f6e1744da156208fdc58f29d93ea07"} Apr 16 20:27:41.753806 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:41.753780 2578 generic.go:358] "Generic (PLEG): container finished" podID="151f9e87-8756-4668-8ff7-d2417f6d4658" containerID="00d473588b606ecaa4afe4272a22f5ef41b5bfa9dbcf436c8e7836c1ba272aca" exitCode=0 Apr 16 20:27:41.753928 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:41.753821 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5lrp" event={"ID":"151f9e87-8756-4668-8ff7-d2417f6d4658","Type":"ContainerDied","Data":"00d473588b606ecaa4afe4272a22f5ef41b5bfa9dbcf436c8e7836c1ba272aca"} Apr 16 20:27:41.766177 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:41.766129 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-operator/iptables-alerter-tlddj" podStartSLOduration=6.127255557 podStartE2EDuration="15.766112605s" podCreationTimestamp="2026-04-16 20:27:26 +0000 UTC" firstStartedPulling="2026-04-16 20:27:26.939632017 +0000 UTC m=+1.911216986" lastFinishedPulling="2026-04-16 20:27:36.578489066 +0000 UTC m=+11.550074034" observedRunningTime="2026-04-16 20:27:38.761488842 +0000 UTC m=+13.733073836" watchObservedRunningTime="2026-04-16 20:27:41.766112605 +0000 UTC m=+16.737697599" Apr 16 20:27:42.292033 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:42.291999 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs\") pod \"network-metrics-daemon-tjgd4\" (UID: \"931ff401-6150-4e87-828a-2e3a9242e1bc\") " pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:42.292218 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:42.292111 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:42.292218 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:42.292173 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs podName:931ff401-6150-4e87-828a-2e3a9242e1bc nodeName:}" failed. No retries permitted until 2026-04-16 20:27:58.292156069 +0000 UTC m=+33.263741043 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs") pod "network-metrics-daemon-tjgd4" (UID: "931ff401-6150-4e87-828a-2e3a9242e1bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:42.322025 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:42.322001 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ks4qt" Apr 16 20:27:42.322868 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:42.322851 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ks4qt" Apr 16 20:27:42.393093 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:42.393064 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hp8m\" (UniqueName: \"kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m\") pod \"network-check-target-p2hmj\" (UID: \"47eb8a0b-d7e2-4e86-a199-e21ebcaeb743\") " pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:42.393248 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:42.393186 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:27:42.393248 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:42.393205 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:27:42.393248 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:42.393217 2578 projected.go:194] Error preparing data for projected volume kube-api-access-7hp8m for pod openshift-network-diagnostics/network-check-target-p2hmj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:42.393399 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:42.393279 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m podName:47eb8a0b-d7e2-4e86-a199-e21ebcaeb743 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:58.39325942 +0000 UTC m=+33.364844411 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7hp8m" (UniqueName: "kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m") pod "network-check-target-p2hmj" (UID: "47eb8a0b-d7e2-4e86-a199-e21ebcaeb743") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:42.601337 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:42.601308 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:42.601904 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:42.601319 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:42.601904 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:42.601413 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-25wtt" podUID="469d4528-df9f-44e0-a586-1d85f9f8a444" Apr 16 20:27:42.601904 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:42.601501 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p2hmj" podUID="47eb8a0b-d7e2-4e86-a199-e21ebcaeb743" Apr 16 20:27:43.601074 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:43.601039 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:43.601255 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:43.601179 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:27:44.601748 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:44.601713 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:44.602228 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:44.601714 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:44.602228 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:44.601831 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-25wtt" podUID="469d4528-df9f-44e0-a586-1d85f9f8a444" Apr 16 20:27:44.602228 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:44.601895 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p2hmj" podUID="47eb8a0b-d7e2-4e86-a199-e21ebcaeb743" Apr 16 20:27:44.814995 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:44.814965 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ks4qt" Apr 16 20:27:44.815157 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:44.815084 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 20:27:44.815633 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:44.815610 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ks4qt" Apr 16 20:27:45.602683 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:45.602647 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:45.603116 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:45.602763 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:27:46.189679 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:46.189654 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 20:27:46.549205 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:46.548858 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T20:27:46.189675145Z","UUID":"1a579520-8521-4631-a94f-611c56026941","Handler":null,"Name":"","Endpoint":""} Apr 16 20:27:46.550628 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:46.550596 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 20:27:46.550628 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:46.550627 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 20:27:46.600809 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:46.600781 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:46.600918 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:46.600788 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:46.600918 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:46.600874 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p2hmj" podUID="47eb8a0b-d7e2-4e86-a199-e21ebcaeb743" Apr 16 20:27:46.600995 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:46.600957 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-25wtt" podUID="469d4528-df9f-44e0-a586-1d85f9f8a444" Apr 16 20:27:46.766619 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:46.766581 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" event={"ID":"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5","Type":"ContainerStarted","Data":"24cd88a36827fdec5bf5945aeb9e2630b975c876a27eb1da1d81e84de10334ed"} Apr 16 20:27:46.766619 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:46.766618 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" event={"ID":"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5","Type":"ContainerStarted","Data":"731ef6b2265fd7505c3e001f7da9e0e35eeaaf329c2b65e174932d6f6cc01f7e"} Apr 16 20:27:46.767094 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:46.766628 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" event={"ID":"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5","Type":"ContainerStarted","Data":"eb583b59c9f6cfd84f3f4326d7874540fbabb1de04d3c914db729c65d0614cbb"} Apr 16 20:27:46.767094 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:46.766637 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" event={"ID":"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5","Type":"ContainerStarted","Data":"71e9b8b5cfbd305bbefe7893f19c1e5bcdd5247a29568cba45695166bf7f7e1a"} Apr 16 20:27:46.767094 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:46.766647 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" event={"ID":"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5","Type":"ContainerStarted","Data":"2be3bffd58677efaeceff4aca2371ed9cfcefc58c1654e37f3b132b4fbc5e2d9"} Apr 16 20:27:46.767094 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:46.766654 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" 
event={"ID":"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5","Type":"ContainerStarted","Data":"eef9e93a2d49be250fa348b2f0ff95a159a75d711703ff84759e0ea35521bdf2"} Apr 16 20:27:46.768809 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:46.768778 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8v2qr" event={"ID":"8ab3b913-84db-4ba1-b661-7f31e898d4c1","Type":"ContainerStarted","Data":"9240dfb08420a048a67666a6f8c652e1bd30ef0920ef655959057c3367607626"} Apr 16 20:27:46.770629 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:46.770598 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" event={"ID":"facc1f83-664e-47d0-a625-cb11e5043a9e","Type":"ContainerStarted","Data":"943313b6a5e7d1455c38db7874de969ba93d1e50544d0a9d914649b63020e576"} Apr 16 20:27:46.773050 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:46.773029 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal" event={"ID":"3d2a2667ea6d6ce4fbc4ca25ae7466fd","Type":"ContainerStarted","Data":"cb1be8bec0dfd910ab379dddf16db4b5b00d7aa766f2111eff95711807326625"} Apr 16 20:27:46.783265 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:46.783207 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8v2qr" podStartSLOduration=2.9556366990000003 podStartE2EDuration="21.783189582s" podCreationTimestamp="2026-04-16 20:27:25 +0000 UTC" firstStartedPulling="2026-04-16 20:27:26.874659123 +0000 UTC m=+1.846244090" lastFinishedPulling="2026-04-16 20:27:45.70221199 +0000 UTC m=+20.673796973" observedRunningTime="2026-04-16 20:27:46.782589693 +0000 UTC m=+21.754174686" watchObservedRunningTime="2026-04-16 20:27:46.783189582 +0000 UTC m=+21.754774573" Apr 16 20:27:47.601359 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:47.601327 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:47.601546 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:47.601456 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:27:47.777024 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:47.776985 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" event={"ID":"facc1f83-664e-47d0-a625-cb11e5043a9e","Type":"ContainerStarted","Data":"d79f5c53b5e25370777c06d4dae84e7f04724353d8aaf94257d7529382ca1cf9"} Apr 16 20:27:47.803231 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:47.803191 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zws8h" podStartSLOduration=2.456591244 podStartE2EDuration="22.803179955s" podCreationTimestamp="2026-04-16 20:27:25 +0000 UTC" firstStartedPulling="2026-04-16 20:27:26.833829445 +0000 UTC m=+1.805414414" lastFinishedPulling="2026-04-16 20:27:47.180418152 +0000 UTC m=+22.152003125" observedRunningTime="2026-04-16 20:27:47.802170165 +0000 UTC m=+22.773755154" watchObservedRunningTime="2026-04-16 20:27:47.803179955 +0000 UTC m=+22.774764944" Apr 16 20:27:47.803744 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:47.803723 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-53.ec2.internal" podStartSLOduration=21.803718776 podStartE2EDuration="21.803718776s" podCreationTimestamp="2026-04-16 20:27:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:27:46.794421509 +0000 UTC m=+21.766006514" watchObservedRunningTime="2026-04-16 20:27:47.803718776 +0000 UTC m=+22.775303769" Apr 16 20:27:48.600853 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:48.600828 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:48.601016 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:48.600828 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:48.601016 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:48.600923 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p2hmj" podUID="47eb8a0b-d7e2-4e86-a199-e21ebcaeb743" Apr 16 20:27:48.601095 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:48.601014 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-25wtt" podUID="469d4528-df9f-44e0-a586-1d85f9f8a444" Apr 16 20:27:48.781820 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:48.781729 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" event={"ID":"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5","Type":"ContainerStarted","Data":"a94b3da742493eacdd74f04477e381ef28ac04ef318554d65495576274ad26d4"} Apr 16 20:27:49.242789 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:49.242754 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret\") pod \"global-pull-secret-syncer-25wtt\" (UID: \"469d4528-df9f-44e0-a586-1d85f9f8a444\") " pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:49.242989 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:49.242873 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:27:49.242989 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:49.242971 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret podName:469d4528-df9f-44e0-a586-1d85f9f8a444 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:05.242930432 +0000 UTC m=+40.214515402 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret") pod "global-pull-secret-syncer-25wtt" (UID: "469d4528-df9f-44e0-a586-1d85f9f8a444") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:27:49.601124 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:49.601092 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:49.601302 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:49.601237 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:27:50.601523 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:50.601494 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:50.602007 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:50.601494 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:50.602007 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:50.601626 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p2hmj" podUID="47eb8a0b-d7e2-4e86-a199-e21ebcaeb743" Apr 16 20:27:50.602007 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:50.601699 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-25wtt" podUID="469d4528-df9f-44e0-a586-1d85f9f8a444" Apr 16 20:27:50.789661 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:50.789635 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" event={"ID":"e4de5fe5-38a8-4fca-b8aa-9b340811b2f5","Type":"ContainerStarted","Data":"36f01748d44ded0bcdaf2a85fb5aaf6db5b3ee64b06252d6b2c03944e5c10915"} Apr 16 20:27:50.790207 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:50.789983 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:50.790207 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:50.790025 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:50.805151 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:50.805116 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:50.815764 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:50.815724 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" podStartSLOduration=6.61606967 podStartE2EDuration="25.815711429s" podCreationTimestamp="2026-04-16 20:27:25 +0000 UTC" firstStartedPulling="2026-04-16 20:27:26.935069935 +0000 UTC m=+1.906654916" lastFinishedPulling="2026-04-16 20:27:46.13471169 +0000 UTC m=+21.106296675" observedRunningTime="2026-04-16 20:27:50.815222998 +0000 UTC m=+25.786807987" watchObservedRunningTime="2026-04-16 20:27:50.815711429 +0000 UTC m=+25.787296419" Apr 16 20:27:51.601565 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:51.601375 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:51.602307 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:51.601633 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:27:51.792635 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:51.792605 2578 generic.go:358] "Generic (PLEG): container finished" podID="151f9e87-8756-4668-8ff7-d2417f6d4658" containerID="ca9183024831b370ff5222cf495d96184b1bb87a566cf676d18fd30d19b14592" exitCode=0 Apr 16 20:27:51.792792 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:51.792685 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5lrp" event={"ID":"151f9e87-8756-4668-8ff7-d2417f6d4658","Type":"ContainerDied","Data":"ca9183024831b370ff5222cf495d96184b1bb87a566cf676d18fd30d19b14592"} Apr 16 20:27:51.793239 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:51.793212 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:51.810004 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:51.808058 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:27:52.601348 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:52.601312 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:52.601537 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:52.601446 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-25wtt" podUID="469d4528-df9f-44e0-a586-1d85f9f8a444" Apr 16 20:27:52.601537 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:52.601312 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:52.601537 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:52.601529 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p2hmj" podUID="47eb8a0b-d7e2-4e86-a199-e21ebcaeb743" Apr 16 20:27:52.753588 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:52.753559 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-p2hmj"] Apr 16 20:27:52.757026 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:52.757005 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tjgd4"] Apr 16 20:27:52.757144 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:52.757128 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:52.757274 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:52.757251 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:27:52.757721 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:52.757699 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-25wtt"] Apr 16 20:27:52.796794 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:52.796768 2578 generic.go:358] "Generic (PLEG): container finished" podID="151f9e87-8756-4668-8ff7-d2417f6d4658" containerID="34cdb5ece3aef52c4c6853a4cf97b1718307ce0bc1217abd20d91826497a8d21" exitCode=0 Apr 16 20:27:52.796896 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:52.796857 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:52.796896 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:52.796853 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5lrp" event={"ID":"151f9e87-8756-4668-8ff7-d2417f6d4658","Type":"ContainerDied","Data":"34cdb5ece3aef52c4c6853a4cf97b1718307ce0bc1217abd20d91826497a8d21"} Apr 16 20:27:52.796896 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:52.796878 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:52.797069 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:52.796963 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-25wtt" podUID="469d4528-df9f-44e0-a586-1d85f9f8a444" Apr 16 20:27:52.797119 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:52.797056 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p2hmj" podUID="47eb8a0b-d7e2-4e86-a199-e21ebcaeb743" Apr 16 20:27:53.800095 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:53.800036 2578 generic.go:358] "Generic (PLEG): container finished" podID="151f9e87-8756-4668-8ff7-d2417f6d4658" containerID="a0b82e712648aa54ad219757d9e5627a241de1234db052f72eb2d9ef3cc82790" exitCode=0 Apr 16 20:27:53.800371 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:53.800106 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5lrp" event={"ID":"151f9e87-8756-4668-8ff7-d2417f6d4658","Type":"ContainerDied","Data":"a0b82e712648aa54ad219757d9e5627a241de1234db052f72eb2d9ef3cc82790"} Apr 16 20:27:54.601375 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:54.601345 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:54.601515 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:54.601456 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:54.601515 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:54.601462 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:27:54.601515 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:54.601504 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:54.601687 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:54.601596 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-25wtt" podUID="469d4528-df9f-44e0-a586-1d85f9f8a444" Apr 16 20:27:54.601687 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:54.601675 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p2hmj" podUID="47eb8a0b-d7e2-4e86-a199-e21ebcaeb743" Apr 16 20:27:56.601356 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:56.601121 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:27:56.601773 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:56.601121 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:56.601773 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:56.601467 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-25wtt" podUID="469d4528-df9f-44e0-a586-1d85f9f8a444" Apr 16 20:27:56.601773 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:56.601121 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:56.601773 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:56.601578 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p2hmj" podUID="47eb8a0b-d7e2-4e86-a199-e21ebcaeb743" Apr 16 20:27:56.601773 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:56.601639 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:27:58.314097 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.314057 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs\") pod \"network-metrics-daemon-tjgd4\" (UID: \"931ff401-6150-4e87-828a-2e3a9242e1bc\") " pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:27:58.314453 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:58.314200 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:58.314453 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:58.314259 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs podName:931ff401-6150-4e87-828a-2e3a9242e1bc nodeName:}" failed. No retries permitted until 2026-04-16 20:28:30.314244348 +0000 UTC m=+65.285829315 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs") pod "network-metrics-daemon-tjgd4" (UID: "931ff401-6150-4e87-828a-2e3a9242e1bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:58.371314 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.371286 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-53.ec2.internal" event="NodeReady" Apr 16 20:27:58.371493 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.371418 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 20:27:58.402542 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.402468 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb"] Apr 16 20:27:58.414670 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.414637 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hp8m\" (UniqueName: \"kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m\") pod \"network-check-target-p2hmj\" (UID: \"47eb8a0b-d7e2-4e86-a199-e21ebcaeb743\") " pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:27:58.414821 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:58.414779 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:27:58.414821 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:58.414798 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:27:58.414821 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:58.414808 2578 projected.go:194] Error preparing data for projected volume kube-api-access-7hp8m for pod 
openshift-network-diagnostics/network-check-target-p2hmj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:58.414937 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:58.414854 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m podName:47eb8a0b-d7e2-4e86-a199-e21ebcaeb743 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:30.414840575 +0000 UTC m=+65.386425563 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-7hp8m" (UniqueName: "kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m") pod "network-check-target-p2hmj" (UID: "47eb8a0b-d7e2-4e86-a199-e21ebcaeb743") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:58.430663 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.430217 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs"] Apr 16 20:27:58.445738 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.445712 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5c9548d788-k476v"] Apr 16 20:27:58.445868 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.445747 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" Apr 16 20:27:58.445868 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.445747 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" Apr 16 20:27:58.448937 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.448911 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 20:27:58.449125 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.449105 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 20:27:58.449440 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.449416 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 20:27:58.457276 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.457256 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8579c7495c-x4ncb"] Apr 16 20:27:58.457423 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.457407 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:27:58.458725 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.458683 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-5wwv5\"" Apr 16 20:27:58.458812 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.458733 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 20:27:58.458877 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.458683 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 20:27:58.458917 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.458894 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 20:27:58.459200 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.459183 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 20:27:58.459384 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.459369 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 20:27:58.459582 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.459565 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 20:27:58.462276 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.462260 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 
20:27:58.462276 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.462276 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 20:27:58.462400 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.462261 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 20:27:58.462400 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.462337 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tj6tv\"" Apr 16 20:27:58.468922 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.468904 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 20:27:58.469813 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.469798 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc"] Apr 16 20:27:58.469956 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.469929 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8579c7495c-x4ncb" Apr 16 20:27:58.472305 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.472286 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-nzp4d\"" Apr 16 20:27:58.472386 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.472328 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 20:27:58.484084 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.484067 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb"] Apr 16 20:27:58.484172 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.484088 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fn9dd"] Apr 16 20:27:58.484234 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.484196 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc" Apr 16 20:27:58.486330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.486312 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 20:27:58.500045 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.500024 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-s64xd"] Apr 16 20:27:58.500183 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.500168 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fn9dd" Apr 16 20:27:58.503810 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.503790 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 20:27:58.503885 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.503837 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 20:27:58.503885 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.503792 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 20:27:58.504027 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.503878 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2cgzs\"" Apr 16 20:27:58.510061 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.510045 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc"] Apr 16 20:27:58.510134 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.510065 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8579c7495c-x4ncb"] Apr 16 20:27:58.510134 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.510126 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5c9548d788-k476v"] Apr 16 20:27:58.510206 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.510142 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fn9dd"] Apr 16 20:27:58.510206 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.510162 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs"] Apr 16 20:27:58.510206 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.510170 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s64xd"] Apr 16 20:27:58.510206 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.510144 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s64xd" Apr 16 20:27:58.512394 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.512378 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 20:27:58.512394 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.512388 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 20:27:58.512531 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.512411 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w999j\"" Apr 16 20:27:58.515283 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.515266 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-bound-sa-token\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:27:58.515376 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.515297 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/84e6f960-cccb-497f-b46e-31abf5235826-ca\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" Apr 16 20:27:58.515376 
ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.515322 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/84e6f960-cccb-497f-b46e-31abf5235826-hub\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" Apr 16 20:27:58.515376 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.515347 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/84e6f960-cccb-497f-b46e-31abf5235826-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" Apr 16 20:27:58.515531 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.515378 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9skl\" (UniqueName: \"kubernetes.io/projected/84e6f960-cccb-497f-b46e-31abf5235826-kube-api-access-l9skl\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" Apr 16 20:27:58.515531 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.515405 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/35d37cfc-3882-4be0-ac36-d1be745ae717-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-8s6gb\" (UID: \"35d37cfc-3882-4be0-ac36-d1be745ae717\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" Apr 16 20:27:58.515531 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.515434 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3360e56-bb80-45fb-aa31-ab10e294e01c-trusted-ca\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:27:58.515531 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.515458 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/84e6f960-cccb-497f-b46e-31abf5235826-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" Apr 16 20:27:58.515531 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.515519 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-certificates\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:27:58.515749 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.515543 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c3360e56-bb80-45fb-aa31-ab10e294e01c-installation-pull-secrets\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:27:58.515749 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.515581 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/84e6f960-cccb-497f-b46e-31abf5235826-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" Apr 16 20:27:58.515749 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.515624 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c3360e56-bb80-45fb-aa31-ab10e294e01c-image-registry-private-configuration\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:27:58.515749 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.515696 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c3360e56-bb80-45fb-aa31-ab10e294e01c-ca-trust-extracted\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:27:58.515907 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.515785 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnkqc\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-kube-api-access-pnkqc\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:27:58.515907 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.515825 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert\") pod 
\"networking-console-plugin-cb95c66f6-8s6gb\" (UID: \"35d37cfc-3882-4be0-ac36-d1be745ae717\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb"
Apr 16 20:27:58.516026 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.515973 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:27:58.601683 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.601645 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4"
Apr 16 20:27:58.601868 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.601645 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj"
Apr 16 20:27:58.602005 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.601656 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-25wtt"
Apr 16 20:27:58.604133 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.604113 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v8bd2\""
Apr 16 20:27:58.604257 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.604112 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 20:27:58.604360 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.604346 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jwldw\""
Apr 16 20:27:58.604419 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.604356 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 20:27:58.604475 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.604464 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 20:27:58.604475 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.604470 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 20:27:58.616381 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.616356 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/af6f67a2-6abe-4f0c-a623-4864b0fb47b4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8579c7495c-x4ncb\" (UID: \"af6f67a2-6abe-4f0c-a623-4864b0fb47b4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8579c7495c-x4ncb"
Apr 16 20:27:58.616494 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.616401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/84e6f960-cccb-497f-b46e-31abf5235826-hub\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs"
Apr 16 20:27:58.616494 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.616432 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/84e6f960-cccb-497f-b46e-31abf5235826-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs"
Apr 16 20:27:58.616494 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.616459 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/05b656ef-6635-42f2-8409-fb9d474e0da8-klusterlet-config\") pod \"klusterlet-addon-workmgr-7fc647b44-mp4kc\" (UID: \"05b656ef-6635-42f2-8409-fb9d474e0da8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc"
Apr 16 20:27:58.616494 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.616488 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxrjb\" (UniqueName: \"kubernetes.io/projected/af6f67a2-6abe-4f0c-a623-4864b0fb47b4-kube-api-access-bxrjb\") pod \"managed-serviceaccount-addon-agent-8579c7495c-x4ncb\" (UID: \"af6f67a2-6abe-4f0c-a623-4864b0fb47b4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8579c7495c-x4ncb"
Apr 16 20:27:58.616699 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.616557 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfwmq\" (UniqueName: \"kubernetes.io/projected/05b656ef-6635-42f2-8409-fb9d474e0da8-kube-api-access-kfwmq\") pod \"klusterlet-addon-workmgr-7fc647b44-mp4kc\" (UID: \"05b656ef-6635-42f2-8409-fb9d474e0da8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc"
Apr 16 20:27:58.616699 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.616621 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c3360e56-bb80-45fb-aa31-ab10e294e01c-installation-pull-secrets\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:27:58.616699 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.616662 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/84e6f960-cccb-497f-b46e-31abf5235826-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs"
Apr 16 20:27:58.616699 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.616691 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8171246b-7193-44d0-9905-129be29085dd-tmp-dir\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd"
Apr 16 20:27:58.616882 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.616723 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/05b656ef-6635-42f2-8409-fb9d474e0da8-tmp\") pod \"klusterlet-addon-workmgr-7fc647b44-mp4kc\" (UID: \"05b656ef-6635-42f2-8409-fb9d474e0da8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc"
Apr 16 20:27:58.616882 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.616760 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:27:58.616882 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.616785 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/84e6f960-cccb-497f-b46e-31abf5235826-ca\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs"
Apr 16 20:27:58.616882 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.616837 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/84e6f960-cccb-497f-b46e-31abf5235826-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs"
Apr 16 20:27:58.617103 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.616880 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-bound-sa-token\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:27:58.617103 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:58.616888 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:27:58.617103 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:58.616903 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c9548d788-k476v: secret "image-registry-tls" not found
Apr 16 20:27:58.617103 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.616911 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9skl\" (UniqueName: \"kubernetes.io/projected/84e6f960-cccb-497f-b46e-31abf5235826-kube-api-access-l9skl\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs"
Apr 16 20:27:58.617103 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.616962 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/35d37cfc-3882-4be0-ac36-d1be745ae717-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-8s6gb\" (UID: \"35d37cfc-3882-4be0-ac36-d1be745ae717\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb"
Apr 16 20:27:58.617103 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:58.616981 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls podName:c3360e56-bb80-45fb-aa31-ab10e294e01c nodeName:}" failed. No retries permitted until 2026-04-16 20:27:59.11696305 +0000 UTC m=+34.088548021 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls") pod "image-registry-5c9548d788-k476v" (UID: "c3360e56-bb80-45fb-aa31-ab10e294e01c") : secret "image-registry-tls" not found
Apr 16 20:27:58.617103 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.617015 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8171246b-7193-44d0-9905-129be29085dd-config-volume\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd"
Apr 16 20:27:58.617103 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.617042 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert\") pod \"ingress-canary-fn9dd\" (UID: \"45d5c33b-d7b8-47d4-a44d-4e9b49120004\") " pod="openshift-ingress-canary/ingress-canary-fn9dd"
Apr 16 20:27:58.617103 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.617074 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7jqv\" (UniqueName: \"kubernetes.io/projected/8171246b-7193-44d0-9905-129be29085dd-kube-api-access-v7jqv\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd"
Apr 16 20:27:58.617103 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.617101 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbnn5\" (UniqueName: \"kubernetes.io/projected/45d5c33b-d7b8-47d4-a44d-4e9b49120004-kube-api-access-cbnn5\") pod \"ingress-canary-fn9dd\" (UID: \"45d5c33b-d7b8-47d4-a44d-4e9b49120004\") " pod="openshift-ingress-canary/ingress-canary-fn9dd"
Apr 16 20:27:58.617563 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.617129 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/84e6f960-cccb-497f-b46e-31abf5235826-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs"
Apr 16 20:27:58.617563 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.617135 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-certificates\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:27:58.617563 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.617202 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c3360e56-bb80-45fb-aa31-ab10e294e01c-image-registry-private-configuration\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:27:58.617563 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.617230 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c3360e56-bb80-45fb-aa31-ab10e294e01c-ca-trust-extracted\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:27:58.617563 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.617254 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnkqc\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-kube-api-access-pnkqc\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:27:58.617563 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.617325 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd"
Apr 16 20:27:58.617563 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.617353 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8s6gb\" (UID: \"35d37cfc-3882-4be0-ac36-d1be745ae717\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb"
Apr 16 20:27:58.617563 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.617380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3360e56-bb80-45fb-aa31-ab10e294e01c-trusted-ca\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:27:58.618116 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.618091 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-certificates\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:27:58.618209 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.618184 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/35d37cfc-3882-4be0-ac36-d1be745ae717-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-8s6gb\" (UID: \"35d37cfc-3882-4be0-ac36-d1be745ae717\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb"
Apr 16 20:27:58.618268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.618226 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3360e56-bb80-45fb-aa31-ab10e294e01c-trusted-ca\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:27:58.618931 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.618464 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c3360e56-bb80-45fb-aa31-ab10e294e01c-ca-trust-extracted\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:27:58.618931 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:58.618756 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 20:27:58.618931 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:58.618821 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert podName:35d37cfc-3882-4be0-ac36-d1be745ae717 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:59.118804114 +0000 UTC m=+34.090389085 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8s6gb" (UID: "35d37cfc-3882-4be0-ac36-d1be745ae717") : secret "networking-console-plugin-cert" not found
Apr 16 20:27:58.622444 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.622028 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c3360e56-bb80-45fb-aa31-ab10e294e01c-installation-pull-secrets\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:27:58.622444 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.622036 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/84e6f960-cccb-497f-b46e-31abf5235826-ca\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs"
Apr 16 20:27:58.622444 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.622333 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c3360e56-bb80-45fb-aa31-ab10e294e01c-image-registry-private-configuration\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:27:58.622591 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.622460 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/84e6f960-cccb-497f-b46e-31abf5235826-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs"
Apr 16 20:27:58.622591 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.622493 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/84e6f960-cccb-497f-b46e-31abf5235826-hub\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs"
Apr 16 20:27:58.622834 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.622814 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/84e6f960-cccb-497f-b46e-31abf5235826-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs"
Apr 16 20:27:58.627829 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.627802 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9skl\" (UniqueName: \"kubernetes.io/projected/84e6f960-cccb-497f-b46e-31abf5235826-kube-api-access-l9skl\") pod \"cluster-proxy-proxy-agent-6c976f769d-jk2zs\" (UID: \"84e6f960-cccb-497f-b46e-31abf5235826\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs"
Apr 16 20:27:58.627912 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.627844 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-bound-sa-token\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:27:58.627912 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.627888 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnkqc\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-kube-api-access-pnkqc\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:27:58.719486 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.719406 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/af6f67a2-6abe-4f0c-a623-4864b0fb47b4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8579c7495c-x4ncb\" (UID: \"af6f67a2-6abe-4f0c-a623-4864b0fb47b4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8579c7495c-x4ncb"
Apr 16 20:27:58.719486 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.719466 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/05b656ef-6635-42f2-8409-fb9d474e0da8-klusterlet-config\") pod \"klusterlet-addon-workmgr-7fc647b44-mp4kc\" (UID: \"05b656ef-6635-42f2-8409-fb9d474e0da8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc"
Apr 16 20:27:58.719718 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.719640 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxrjb\" (UniqueName: \"kubernetes.io/projected/af6f67a2-6abe-4f0c-a623-4864b0fb47b4-kube-api-access-bxrjb\") pod \"managed-serviceaccount-addon-agent-8579c7495c-x4ncb\" (UID: \"af6f67a2-6abe-4f0c-a623-4864b0fb47b4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8579c7495c-x4ncb"
Apr 16 20:27:58.719718 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.719692 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfwmq\" (UniqueName: \"kubernetes.io/projected/05b656ef-6635-42f2-8409-fb9d474e0da8-kube-api-access-kfwmq\") pod \"klusterlet-addon-workmgr-7fc647b44-mp4kc\" (UID: \"05b656ef-6635-42f2-8409-fb9d474e0da8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc"
Apr 16 20:27:58.719836 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.719754 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8171246b-7193-44d0-9905-129be29085dd-tmp-dir\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd"
Apr 16 20:27:58.719836 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.719787 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/05b656ef-6635-42f2-8409-fb9d474e0da8-tmp\") pod \"klusterlet-addon-workmgr-7fc647b44-mp4kc\" (UID: \"05b656ef-6635-42f2-8409-fb9d474e0da8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc"
Apr 16 20:27:58.719932 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.719841 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8171246b-7193-44d0-9905-129be29085dd-config-volume\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd"
Apr 16 20:27:58.719932 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.719866 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert\") pod \"ingress-canary-fn9dd\" (UID: \"45d5c33b-d7b8-47d4-a44d-4e9b49120004\") " pod="openshift-ingress-canary/ingress-canary-fn9dd"
Apr 16 20:27:58.719932 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.719896 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7jqv\" (UniqueName: \"kubernetes.io/projected/8171246b-7193-44d0-9905-129be29085dd-kube-api-access-v7jqv\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd"
Apr 16 20:27:58.719932 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.719922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbnn5\" (UniqueName: \"kubernetes.io/projected/45d5c33b-d7b8-47d4-a44d-4e9b49120004-kube-api-access-cbnn5\") pod \"ingress-canary-fn9dd\" (UID: \"45d5c33b-d7b8-47d4-a44d-4e9b49120004\") " pod="openshift-ingress-canary/ingress-canary-fn9dd"
Apr 16 20:27:58.720166 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.719977 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd"
Apr 16 20:27:58.720166 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:58.720108 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:27:58.720166 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:58.720165 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls podName:8171246b-7193-44d0-9905-129be29085dd nodeName:}" failed. No retries permitted until 2026-04-16 20:27:59.22015008 +0000 UTC m=+34.191735052 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls") pod "dns-default-s64xd" (UID: "8171246b-7193-44d0-9905-129be29085dd") : secret "dns-default-metrics-tls" not found
Apr 16 20:27:58.720315 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.720163 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8171246b-7193-44d0-9905-129be29085dd-tmp-dir\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd"
Apr 16 20:27:58.720415 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:58.720396 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:27:58.720452 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.720433 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8171246b-7193-44d0-9905-129be29085dd-config-volume\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd"
Apr 16 20:27:58.720489 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:58.720456 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert podName:45d5c33b-d7b8-47d4-a44d-4e9b49120004 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:59.220437655 +0000 UTC m=+34.192022638 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert") pod "ingress-canary-fn9dd" (UID: "45d5c33b-d7b8-47d4-a44d-4e9b49120004") : secret "canary-serving-cert" not found
Apr 16 20:27:58.720668 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.720646 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/05b656ef-6635-42f2-8409-fb9d474e0da8-tmp\") pod \"klusterlet-addon-workmgr-7fc647b44-mp4kc\" (UID: \"05b656ef-6635-42f2-8409-fb9d474e0da8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc"
Apr 16 20:27:58.722199 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.722184 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/af6f67a2-6abe-4f0c-a623-4864b0fb47b4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8579c7495c-x4ncb\" (UID: \"af6f67a2-6abe-4f0c-a623-4864b0fb47b4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8579c7495c-x4ncb"
Apr 16 20:27:58.722268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.722249 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/05b656ef-6635-42f2-8409-fb9d474e0da8-klusterlet-config\") pod \"klusterlet-addon-workmgr-7fc647b44-mp4kc\" (UID: \"05b656ef-6635-42f2-8409-fb9d474e0da8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc"
Apr 16 20:27:58.728838 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.728815 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbnn5\" (UniqueName: \"kubernetes.io/projected/45d5c33b-d7b8-47d4-a44d-4e9b49120004-kube-api-access-cbnn5\") pod \"ingress-canary-fn9dd\" (UID: \"45d5c33b-d7b8-47d4-a44d-4e9b49120004\") " pod="openshift-ingress-canary/ingress-canary-fn9dd"
Apr 16 20:27:58.729200 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.729179 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfwmq\" (UniqueName: \"kubernetes.io/projected/05b656ef-6635-42f2-8409-fb9d474e0da8-kube-api-access-kfwmq\") pod \"klusterlet-addon-workmgr-7fc647b44-mp4kc\" (UID: \"05b656ef-6635-42f2-8409-fb9d474e0da8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc"
Apr 16 20:27:58.729200 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.729191 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7jqv\" (UniqueName: \"kubernetes.io/projected/8171246b-7193-44d0-9905-129be29085dd-kube-api-access-v7jqv\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd"
Apr 16 20:27:58.729309 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.729242 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxrjb\" (UniqueName: \"kubernetes.io/projected/af6f67a2-6abe-4f0c-a623-4864b0fb47b4-kube-api-access-bxrjb\") pod \"managed-serviceaccount-addon-agent-8579c7495c-x4ncb\" (UID: \"af6f67a2-6abe-4f0c-a623-4864b0fb47b4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8579c7495c-x4ncb"
Apr 16 20:27:58.767316 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.767293 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs"
Apr 16 20:27:58.785418 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.785385 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8579c7495c-x4ncb"
Apr 16 20:27:58.801110 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.801082 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc"
Apr 16 20:27:58.948555 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.948344 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8579c7495c-x4ncb"]
Apr 16 20:27:58.953842 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:58.953696 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf6f67a2_6abe_4f0c_a623_4864b0fb47b4.slice/crio-9ef53243d0da8b1143bf19b0a726304830c7849625a1d72f003a51fcc7c4fc49 WatchSource:0}: Error finding container 9ef53243d0da8b1143bf19b0a726304830c7849625a1d72f003a51fcc7c4fc49: Status 404 returned error can't find the container with id 9ef53243d0da8b1143bf19b0a726304830c7849625a1d72f003a51fcc7c4fc49
Apr 16 20:27:58.959303 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.959199 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs"]
Apr 16 20:27:58.961412 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:58.961386 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84e6f960_cccb_497f_b46e_31abf5235826.slice/crio-61ce0cd98856b4293498933f2667d2bd19f679cb804492075e61a505987d8606 WatchSource:0}: Error finding container 61ce0cd98856b4293498933f2667d2bd19f679cb804492075e61a505987d8606: Status 404 returned error can't find the container with id 61ce0cd98856b4293498933f2667d2bd19f679cb804492075e61a505987d8606
Apr 16 20:27:58.963690 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:58.963668 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc"]
Apr 16 20:27:58.966355 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:27:58.966330 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05b656ef_6635_42f2_8409_fb9d474e0da8.slice/crio-f786bf7f859990e0f14024293d2a46b589fc44bf3f5db130c6e668b35c663a21 WatchSource:0}: Error finding container f786bf7f859990e0f14024293d2a46b589fc44bf3f5db130c6e668b35c663a21: Status 404 returned error can't find the container with id f786bf7f859990e0f14024293d2a46b589fc44bf3f5db130c6e668b35c663a21
Apr 16 20:27:59.124818 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:59.124776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8s6gb\" (UID: \"35d37cfc-3882-4be0-ac36-d1be745ae717\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb"
Apr 16 20:27:59.125015 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:59.124971 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 20:27:59.125093 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:59.125027 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:27:59.125093 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:59.125061 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert podName:35d37cfc-3882-4be0-ac36-d1be745ae717 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:00.125040907 +0000 UTC m=+35.096625880 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8s6gb" (UID: "35d37cfc-3882-4be0-ac36-d1be745ae717") : secret "networking-console-plugin-cert" not found
Apr 16 20:27:59.125244 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:59.125227 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:27:59.125302 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:59.125247 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c9548d788-k476v: secret "image-registry-tls" not found
Apr 16 20:27:59.125302 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:59.125297 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls podName:c3360e56-bb80-45fb-aa31-ab10e294e01c nodeName:}" failed. No retries permitted until 2026-04-16 20:28:00.125280695 +0000 UTC m=+35.096865663 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls") pod "image-registry-5c9548d788-k476v" (UID: "c3360e56-bb80-45fb-aa31-ab10e294e01c") : secret "image-registry-tls" not found Apr 16 20:27:59.226050 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:59.226019 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd" Apr 16 20:27:59.226209 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:59.226133 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert\") pod \"ingress-canary-fn9dd\" (UID: \"45d5c33b-d7b8-47d4-a44d-4e9b49120004\") " pod="openshift-ingress-canary/ingress-canary-fn9dd" Apr 16 20:27:59.226209 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:59.226173 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:27:59.226303 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:59.226228 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:27:59.226303 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:59.226242 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls podName:8171246b-7193-44d0-9905-129be29085dd nodeName:}" failed. No retries permitted until 2026-04-16 20:28:00.226223922 +0000 UTC m=+35.197808890 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls") pod "dns-default-s64xd" (UID: "8171246b-7193-44d0-9905-129be29085dd") : secret "dns-default-metrics-tls" not found Apr 16 20:27:59.226303 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:27:59.226275 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert podName:45d5c33b-d7b8-47d4-a44d-4e9b49120004 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:00.22625836 +0000 UTC m=+35.197843354 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert") pod "ingress-canary-fn9dd" (UID: "45d5c33b-d7b8-47d4-a44d-4e9b49120004") : secret "canary-serving-cert" not found Apr 16 20:27:59.817909 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:59.817838 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8579c7495c-x4ncb" event={"ID":"af6f67a2-6abe-4f0c-a623-4864b0fb47b4","Type":"ContainerStarted","Data":"9ef53243d0da8b1143bf19b0a726304830c7849625a1d72f003a51fcc7c4fc49"} Apr 16 20:27:59.820209 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:59.820097 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc" event={"ID":"05b656ef-6635-42f2-8409-fb9d474e0da8","Type":"ContainerStarted","Data":"f786bf7f859990e0f14024293d2a46b589fc44bf3f5db130c6e668b35c663a21"} Apr 16 20:27:59.821678 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:27:59.821653 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" 
event={"ID":"84e6f960-cccb-497f-b46e-31abf5235826","Type":"ContainerStarted","Data":"61ce0cd98856b4293498933f2667d2bd19f679cb804492075e61a505987d8606"} Apr 16 20:28:00.135816 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:00.135781 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:28:00.136111 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:00.135877 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8s6gb\" (UID: \"35d37cfc-3882-4be0-ac36-d1be745ae717\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" Apr 16 20:28:00.136111 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:00.135988 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:28:00.136111 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:00.136012 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c9548d788-k476v: secret "image-registry-tls" not found Apr 16 20:28:00.136111 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:00.136041 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:28:00.136111 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:00.136083 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls 
podName:c3360e56-bb80-45fb-aa31-ab10e294e01c nodeName:}" failed. No retries permitted until 2026-04-16 20:28:02.136059591 +0000 UTC m=+37.107644565 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls") pod "image-registry-5c9548d788-k476v" (UID: "c3360e56-bb80-45fb-aa31-ab10e294e01c") : secret "image-registry-tls" not found Apr 16 20:28:00.136111 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:00.136104 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert podName:35d37cfc-3882-4be0-ac36-d1be745ae717 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:02.136093611 +0000 UTC m=+37.107678586 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8s6gb" (UID: "35d37cfc-3882-4be0-ac36-d1be745ae717") : secret "networking-console-plugin-cert" not found Apr 16 20:28:00.236839 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:00.236797 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd" Apr 16 20:28:00.237085 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:00.236980 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert\") pod \"ingress-canary-fn9dd\" (UID: \"45d5c33b-d7b8-47d4-a44d-4e9b49120004\") " pod="openshift-ingress-canary/ingress-canary-fn9dd" Apr 16 20:28:00.237156 ip-10-0-137-53 
kubenswrapper[2578]: E0416 20:28:00.237122 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:28:00.237204 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:00.237184 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert podName:45d5c33b-d7b8-47d4-a44d-4e9b49120004 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:02.237165717 +0000 UTC m=+37.208750691 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert") pod "ingress-canary-fn9dd" (UID: "45d5c33b-d7b8-47d4-a44d-4e9b49120004") : secret "canary-serving-cert" not found Apr 16 20:28:00.237389 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:00.237371 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:28:00.237454 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:00.237425 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls podName:8171246b-7193-44d0-9905-129be29085dd nodeName:}" failed. No retries permitted until 2026-04-16 20:28:02.23740756 +0000 UTC m=+37.208992542 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls") pod "dns-default-s64xd" (UID: "8171246b-7193-44d0-9905-129be29085dd") : secret "dns-default-metrics-tls" not found Apr 16 20:28:02.155086 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:02.155050 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8s6gb\" (UID: \"35d37cfc-3882-4be0-ac36-d1be745ae717\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" Apr 16 20:28:02.155424 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:02.155139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:28:02.155424 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:02.155210 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:28:02.155424 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:02.155228 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:28:02.155424 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:02.155240 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c9548d788-k476v: secret "image-registry-tls" not found Apr 16 20:28:02.155424 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:02.155288 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert podName:35d37cfc-3882-4be0-ac36-d1be745ae717 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:06.155271631 +0000 UTC m=+41.126856599 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8s6gb" (UID: "35d37cfc-3882-4be0-ac36-d1be745ae717") : secret "networking-console-plugin-cert" not found Apr 16 20:28:02.155424 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:02.155303 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls podName:c3360e56-bb80-45fb-aa31-ab10e294e01c nodeName:}" failed. No retries permitted until 2026-04-16 20:28:06.155296016 +0000 UTC m=+41.126880984 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls") pod "image-registry-5c9548d788-k476v" (UID: "c3360e56-bb80-45fb-aa31-ab10e294e01c") : secret "image-registry-tls" not found Apr 16 20:28:02.255471 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:02.255447 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert\") pod \"ingress-canary-fn9dd\" (UID: \"45d5c33b-d7b8-47d4-a44d-4e9b49120004\") " pod="openshift-ingress-canary/ingress-canary-fn9dd" Apr 16 20:28:02.255611 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:02.255485 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " 
pod="openshift-dns/dns-default-s64xd" Apr 16 20:28:02.255611 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:02.255586 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:28:02.255683 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:02.255640 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert podName:45d5c33b-d7b8-47d4-a44d-4e9b49120004 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:06.255625499 +0000 UTC m=+41.227210467 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert") pod "ingress-canary-fn9dd" (UID: "45d5c33b-d7b8-47d4-a44d-4e9b49120004") : secret "canary-serving-cert" not found Apr 16 20:28:02.255683 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:02.255647 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:28:02.255683 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:02.255681 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls podName:8171246b-7193-44d0-9905-129be29085dd nodeName:}" failed. No retries permitted until 2026-04-16 20:28:06.255670937 +0000 UTC m=+41.227255925 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls") pod "dns-default-s64xd" (UID: "8171246b-7193-44d0-9905-129be29085dd") : secret "dns-default-metrics-tls" not found Apr 16 20:28:05.281458 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:05.281417 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret\") pod \"global-pull-secret-syncer-25wtt\" (UID: \"469d4528-df9f-44e0-a586-1d85f9f8a444\") " pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:28:05.285316 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:05.285287 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/469d4528-df9f-44e0-a586-1d85f9f8a444-original-pull-secret\") pod \"global-pull-secret-syncer-25wtt\" (UID: \"469d4528-df9f-44e0-a586-1d85f9f8a444\") " pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:28:05.542270 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:05.542196 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-25wtt" Apr 16 20:28:06.186959 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:06.186916 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:28:06.187093 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:06.187000 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8s6gb\" (UID: \"35d37cfc-3882-4be0-ac36-d1be745ae717\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" Apr 16 20:28:06.187093 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:06.187085 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:28:06.187160 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:06.187100 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c9548d788-k476v: secret "image-registry-tls" not found Apr 16 20:28:06.187160 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:06.187107 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:28:06.187160 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:06.187153 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls podName:c3360e56-bb80-45fb-aa31-ab10e294e01c nodeName:}" failed. 
No retries permitted until 2026-04-16 20:28:14.187138646 +0000 UTC m=+49.158723634 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls") pod "image-registry-5c9548d788-k476v" (UID: "c3360e56-bb80-45fb-aa31-ab10e294e01c") : secret "image-registry-tls" not found Apr 16 20:28:06.187250 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:06.187167 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert podName:35d37cfc-3882-4be0-ac36-d1be745ae717 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:14.187161727 +0000 UTC m=+49.158746695 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8s6gb" (UID: "35d37cfc-3882-4be0-ac36-d1be745ae717") : secret "networking-console-plugin-cert" not found Apr 16 20:28:06.287881 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:06.287860 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert\") pod \"ingress-canary-fn9dd\" (UID: \"45d5c33b-d7b8-47d4-a44d-4e9b49120004\") " pod="openshift-ingress-canary/ingress-canary-fn9dd" Apr 16 20:28:06.288174 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:06.287896 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd" Apr 16 20:28:06.288174 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:06.288008 2578 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:28:06.288174 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:06.288030 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:28:06.288174 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:06.288055 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert podName:45d5c33b-d7b8-47d4-a44d-4e9b49120004 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:14.288043242 +0000 UTC m=+49.259628214 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert") pod "ingress-canary-fn9dd" (UID: "45d5c33b-d7b8-47d4-a44d-4e9b49120004") : secret "canary-serving-cert" not found Apr 16 20:28:06.288174 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:06.288069 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls podName:8171246b-7193-44d0-9905-129be29085dd nodeName:}" failed. No retries permitted until 2026-04-16 20:28:14.288062805 +0000 UTC m=+49.259647772 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls") pod "dns-default-s64xd" (UID: "8171246b-7193-44d0-9905-129be29085dd") : secret "dns-default-metrics-tls" not found Apr 16 20:28:06.693892 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:06.693694 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-25wtt"] Apr 16 20:28:06.705131 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:28:06.705105 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod469d4528_df9f_44e0_a586_1d85f9f8a444.slice/crio-e8c1d0439b6e725483bdd58d98e4c5156266210dd586de6f06603c237b52cafa WatchSource:0}: Error finding container e8c1d0439b6e725483bdd58d98e4c5156266210dd586de6f06603c237b52cafa: Status 404 returned error can't find the container with id e8c1d0439b6e725483bdd58d98e4c5156266210dd586de6f06603c237b52cafa Apr 16 20:28:06.839390 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:06.839358 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5lrp" event={"ID":"151f9e87-8756-4668-8ff7-d2417f6d4658","Type":"ContainerStarted","Data":"eecdadfd6d41d8aadba25470cefcb992af36ed075ebac27a8b68b6558f6db50d"} Apr 16 20:28:06.840610 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:06.840579 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" event={"ID":"84e6f960-cccb-497f-b46e-31abf5235826","Type":"ContainerStarted","Data":"ef28aa867e260a88a2cbb9eb293fd9a66047033940316e8e6152f6b9e896928b"} Apr 16 20:28:06.841763 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:06.841735 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8579c7495c-x4ncb" 
event={"ID":"af6f67a2-6abe-4f0c-a623-4864b0fb47b4","Type":"ContainerStarted","Data":"f078567745cefaaa68c41bcbceab0dcd01a4498759dbc8a2c77d0be14cc336d7"} Apr 16 20:28:06.842750 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:06.842731 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-25wtt" event={"ID":"469d4528-df9f-44e0-a586-1d85f9f8a444","Type":"ContainerStarted","Data":"e8c1d0439b6e725483bdd58d98e4c5156266210dd586de6f06603c237b52cafa"} Apr 16 20:28:06.843958 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:06.843922 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc" event={"ID":"05b656ef-6635-42f2-8409-fb9d474e0da8","Type":"ContainerStarted","Data":"7c642a09ef9ed663e163b605189ef030cb5f581ad2246ec96eae7fe5b3337d97"} Apr 16 20:28:06.844192 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:06.844177 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc" Apr 16 20:28:06.845606 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:06.845589 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc" Apr 16 20:28:06.872736 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:06.872698 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc" podStartSLOduration=1.263511147 podStartE2EDuration="8.872687958s" podCreationTimestamp="2026-04-16 20:27:58 +0000 UTC" firstStartedPulling="2026-04-16 20:27:58.968122323 +0000 UTC m=+33.939707298" lastFinishedPulling="2026-04-16 20:28:06.57729914 +0000 UTC m=+41.548884109" observedRunningTime="2026-04-16 20:28:06.872154093 +0000 UTC m=+41.843739086" watchObservedRunningTime="2026-04-16 20:28:06.872687958 +0000 UTC 
m=+41.844272948" Apr 16 20:28:06.904093 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:06.904056 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8579c7495c-x4ncb" podStartSLOduration=1.295424018 podStartE2EDuration="8.904042849s" podCreationTimestamp="2026-04-16 20:27:58 +0000 UTC" firstStartedPulling="2026-04-16 20:27:58.956633424 +0000 UTC m=+33.928218392" lastFinishedPulling="2026-04-16 20:28:06.565252251 +0000 UTC m=+41.536837223" observedRunningTime="2026-04-16 20:28:06.90325378 +0000 UTC m=+41.874838770" watchObservedRunningTime="2026-04-16 20:28:06.904042849 +0000 UTC m=+41.875627833" Apr 16 20:28:07.848559 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:07.848527 2578 generic.go:358] "Generic (PLEG): container finished" podID="151f9e87-8756-4668-8ff7-d2417f6d4658" containerID="eecdadfd6d41d8aadba25470cefcb992af36ed075ebac27a8b68b6558f6db50d" exitCode=0 Apr 16 20:28:07.849330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:07.848614 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5lrp" event={"ID":"151f9e87-8756-4668-8ff7-d2417f6d4658","Type":"ContainerDied","Data":"eecdadfd6d41d8aadba25470cefcb992af36ed075ebac27a8b68b6558f6db50d"} Apr 16 20:28:08.854021 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:08.853986 2578 generic.go:358] "Generic (PLEG): container finished" podID="151f9e87-8756-4668-8ff7-d2417f6d4658" containerID="b73c55f3ab04ec7ab184c27b6ce4e52f73a691db77289ce45471d3bbe8f0c2e6" exitCode=0 Apr 16 20:28:08.854602 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:08.854063 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5lrp" event={"ID":"151f9e87-8756-4668-8ff7-d2417f6d4658","Type":"ContainerDied","Data":"b73c55f3ab04ec7ab184c27b6ce4e52f73a691db77289ce45471d3bbe8f0c2e6"} Apr 16 20:28:09.859956 ip-10-0-137-53 kubenswrapper[2578]: 
I0416 20:28:09.859719 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5lrp" event={"ID":"151f9e87-8756-4668-8ff7-d2417f6d4658","Type":"ContainerStarted","Data":"13a4a408b48ee20531de09f8535fd61d10b83ea636abba72d426088ae15fd31a"} Apr 16 20:28:09.861647 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:09.861618 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" event={"ID":"84e6f960-cccb-497f-b46e-31abf5235826","Type":"ContainerStarted","Data":"8e0ce359b14a40e23d0536e69caa44f193be80397f5e0bd0fef6cdf610603bba"} Apr 16 20:28:09.861647 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:09.861651 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" event={"ID":"84e6f960-cccb-497f-b46e-31abf5235826","Type":"ContainerStarted","Data":"d45163d141ced5cbcff55037b5725a63f36ce4d292fd0005bf6410299f40fc48"} Apr 16 20:28:09.883573 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:09.883515 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-v5lrp" podStartSLOduration=5.956560709 podStartE2EDuration="44.883497261s" podCreationTimestamp="2026-04-16 20:27:25 +0000 UTC" firstStartedPulling="2026-04-16 20:27:27.014468914 +0000 UTC m=+1.986053887" lastFinishedPulling="2026-04-16 20:28:05.941405465 +0000 UTC m=+40.912990439" observedRunningTime="2026-04-16 20:28:09.881650912 +0000 UTC m=+44.853235906" watchObservedRunningTime="2026-04-16 20:28:09.883497261 +0000 UTC m=+44.855082265" Apr 16 20:28:09.898697 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:09.898640 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" podStartSLOduration=1.6609461140000001 podStartE2EDuration="11.898620233s" 
podCreationTimestamp="2026-04-16 20:27:58 +0000 UTC" firstStartedPulling="2026-04-16 20:27:58.963786483 +0000 UTC m=+33.935371454" lastFinishedPulling="2026-04-16 20:28:09.201460591 +0000 UTC m=+44.173045573" observedRunningTime="2026-04-16 20:28:09.898234813 +0000 UTC m=+44.869819804" watchObservedRunningTime="2026-04-16 20:28:09.898620233 +0000 UTC m=+44.870205246" Apr 16 20:28:11.869741 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:11.869699 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-25wtt" event={"ID":"469d4528-df9f-44e0-a586-1d85f9f8a444","Type":"ContainerStarted","Data":"31d364587153c7842e573df5c1add8b25e336589c84ecbf103c8ddd4970b8c8e"} Apr 16 20:28:11.884740 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:11.884683 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-25wtt" podStartSLOduration=34.031492477 podStartE2EDuration="38.884665992s" podCreationTimestamp="2026-04-16 20:27:33 +0000 UTC" firstStartedPulling="2026-04-16 20:28:06.707397686 +0000 UTC m=+41.678982657" lastFinishedPulling="2026-04-16 20:28:11.560571191 +0000 UTC m=+46.532156172" observedRunningTime="2026-04-16 20:28:11.883321759 +0000 UTC m=+46.854906752" watchObservedRunningTime="2026-04-16 20:28:11.884665992 +0000 UTC m=+46.856250983" Apr 16 20:28:14.255892 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:14.255848 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8s6gb\" (UID: \"35d37cfc-3882-4be0-ac36-d1be745ae717\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" Apr 16 20:28:14.256322 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:14.255973 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:28:14.256322 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:14.256030 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:28:14.256322 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:14.256093 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:28:14.256322 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:14.256105 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c9548d788-k476v: secret "image-registry-tls" not found Apr 16 20:28:14.256322 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:14.256108 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert podName:35d37cfc-3882-4be0-ac36-d1be745ae717 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:30.256088562 +0000 UTC m=+65.227673545 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8s6gb" (UID: "35d37cfc-3882-4be0-ac36-d1be745ae717") : secret "networking-console-plugin-cert" not found Apr 16 20:28:14.256322 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:14.256137 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls podName:c3360e56-bb80-45fb-aa31-ab10e294e01c nodeName:}" failed. 
No retries permitted until 2026-04-16 20:28:30.256127294 +0000 UTC m=+65.227712266 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls") pod "image-registry-5c9548d788-k476v" (UID: "c3360e56-bb80-45fb-aa31-ab10e294e01c") : secret "image-registry-tls" not found Apr 16 20:28:14.356536 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:14.356496 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert\") pod \"ingress-canary-fn9dd\" (UID: \"45d5c33b-d7b8-47d4-a44d-4e9b49120004\") " pod="openshift-ingress-canary/ingress-canary-fn9dd" Apr 16 20:28:14.356711 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:14.356553 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd" Apr 16 20:28:14.356711 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:14.356646 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:28:14.356711 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:14.356689 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:28:14.356811 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:14.356723 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert podName:45d5c33b-d7b8-47d4-a44d-4e9b49120004 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:30.356703714 +0000 UTC m=+65.328288687 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert") pod "ingress-canary-fn9dd" (UID: "45d5c33b-d7b8-47d4-a44d-4e9b49120004") : secret "canary-serving-cert" not found Apr 16 20:28:14.356811 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:14.356740 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls podName:8171246b-7193-44d0-9905-129be29085dd nodeName:}" failed. No retries permitted until 2026-04-16 20:28:30.35673355 +0000 UTC m=+65.328318517 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls") pod "dns-default-s64xd" (UID: "8171246b-7193-44d0-9905-129be29085dd") : secret "dns-default-metrics-tls" not found Apr 16 20:28:23.812465 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:23.812436 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wsr25" Apr 16 20:28:30.274870 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:30.274834 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:28:30.275273 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:30.274922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8s6gb\" (UID: \"35d37cfc-3882-4be0-ac36-d1be745ae717\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" 
Apr 16 20:28:30.275273 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:30.275001 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:28:30.275273 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:30.275019 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c9548d788-k476v: secret "image-registry-tls" not found Apr 16 20:28:30.275273 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:30.275052 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:28:30.275273 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:30.275078 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls podName:c3360e56-bb80-45fb-aa31-ab10e294e01c nodeName:}" failed. No retries permitted until 2026-04-16 20:29:02.275062882 +0000 UTC m=+97.246647872 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls") pod "image-registry-5c9548d788-k476v" (UID: "c3360e56-bb80-45fb-aa31-ab10e294e01c") : secret "image-registry-tls" not found Apr 16 20:28:30.275273 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:30.275110 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert podName:35d37cfc-3882-4be0-ac36-d1be745ae717 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:02.275097182 +0000 UTC m=+97.246682168 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8s6gb" (UID: "35d37cfc-3882-4be0-ac36-d1be745ae717") : secret "networking-console-plugin-cert" not found Apr 16 20:28:30.375510 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:30.375470 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd" Apr 16 20:28:30.375679 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:30.375541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs\") pod \"network-metrics-daemon-tjgd4\" (UID: \"931ff401-6150-4e87-828a-2e3a9242e1bc\") " pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:28:30.375679 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:30.375630 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:28:30.375750 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:30.375681 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert\") pod \"ingress-canary-fn9dd\" (UID: \"45d5c33b-d7b8-47d4-a44d-4e9b49120004\") " pod="openshift-ingress-canary/ingress-canary-fn9dd" Apr 16 20:28:30.375750 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:30.375703 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls podName:8171246b-7193-44d0-9905-129be29085dd nodeName:}" failed. 
No retries permitted until 2026-04-16 20:29:02.375686789 +0000 UTC m=+97.347271756 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls") pod "dns-default-s64xd" (UID: "8171246b-7193-44d0-9905-129be29085dd") : secret "dns-default-metrics-tls" not found Apr 16 20:28:30.375750 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:30.375744 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:28:30.375867 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:30.375794 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert podName:45d5c33b-d7b8-47d4-a44d-4e9b49120004 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:02.375780492 +0000 UTC m=+97.347365464 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert") pod "ingress-canary-fn9dd" (UID: "45d5c33b-d7b8-47d4-a44d-4e9b49120004") : secret "canary-serving-cert" not found Apr 16 20:28:30.378136 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:30.378091 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:28:30.386603 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:30.386573 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 20:28:30.386671 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:28:30.386643 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs podName:931ff401-6150-4e87-828a-2e3a9242e1bc nodeName:}" failed. 
No retries permitted until 2026-04-16 20:29:34.3866284 +0000 UTC m=+129.358213381 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs") pod "network-metrics-daemon-tjgd4" (UID: "931ff401-6150-4e87-828a-2e3a9242e1bc") : secret "metrics-daemon-secret" not found Apr 16 20:28:30.476899 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:30.476862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hp8m\" (UniqueName: \"kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m\") pod \"network-check-target-p2hmj\" (UID: \"47eb8a0b-d7e2-4e86-a199-e21ebcaeb743\") " pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:28:30.479758 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:30.479734 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:28:30.490803 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:30.490779 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:28:30.502676 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:30.502650 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hp8m\" (UniqueName: \"kubernetes.io/projected/47eb8a0b-d7e2-4e86-a199-e21ebcaeb743-kube-api-access-7hp8m\") pod \"network-check-target-p2hmj\" (UID: \"47eb8a0b-d7e2-4e86-a199-e21ebcaeb743\") " pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:28:30.723057 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:30.723026 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jwldw\"" Apr 16 20:28:30.731085 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:30.731064 2578 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:28:30.847670 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:30.847636 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-p2hmj"] Apr 16 20:28:30.851475 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:28:30.851439 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47eb8a0b_d7e2_4e86_a199_e21ebcaeb743.slice/crio-d6e159384947146ce5e96ac75a6e365c0eab5017b0345dd8e69befb1144a7735 WatchSource:0}: Error finding container d6e159384947146ce5e96ac75a6e365c0eab5017b0345dd8e69befb1144a7735: Status 404 returned error can't find the container with id d6e159384947146ce5e96ac75a6e365c0eab5017b0345dd8e69befb1144a7735 Apr 16 20:28:30.915340 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:30.915302 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-p2hmj" event={"ID":"47eb8a0b-d7e2-4e86-a199-e21ebcaeb743","Type":"ContainerStarted","Data":"d6e159384947146ce5e96ac75a6e365c0eab5017b0345dd8e69befb1144a7735"} Apr 16 20:28:34.926758 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:34.926722 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-p2hmj" event={"ID":"47eb8a0b-d7e2-4e86-a199-e21ebcaeb743","Type":"ContainerStarted","Data":"40aaaaa483ff28826047f19d28c5236e77598c0bd95a4557d06e48a30a1fc95c"} Apr 16 20:28:34.927187 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:34.926891 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:28:34.941618 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:28:34.941578 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-p2hmj" podStartSLOduration=66.92376856 podStartE2EDuration="1m9.941565013s" podCreationTimestamp="2026-04-16 20:27:25 +0000 UTC" firstStartedPulling="2026-04-16 20:28:30.853316336 +0000 UTC m=+65.824901304" lastFinishedPulling="2026-04-16 20:28:33.871112776 +0000 UTC m=+68.842697757" observedRunningTime="2026-04-16 20:28:34.941095241 +0000 UTC m=+69.912680232" watchObservedRunningTime="2026-04-16 20:28:34.941565013 +0000 UTC m=+69.913150002" Apr 16 20:29:02.304486 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:29:02.304450 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:29:02.304933 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:29:02.304512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8s6gb\" (UID: \"35d37cfc-3882-4be0-ac36-d1be745ae717\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" Apr 16 20:29:02.304933 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:29:02.304618 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:29:02.304933 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:29:02.304637 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:29:02.304933 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:29:02.304664 2578 projected.go:194] Error preparing data for projected volume registry-tls 
for pod openshift-image-registry/image-registry-5c9548d788-k476v: secret "image-registry-tls" not found Apr 16 20:29:02.304933 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:29:02.304676 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert podName:35d37cfc-3882-4be0-ac36-d1be745ae717 nodeName:}" failed. No retries permitted until 2026-04-16 20:30:06.304660423 +0000 UTC m=+161.276245391 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8s6gb" (UID: "35d37cfc-3882-4be0-ac36-d1be745ae717") : secret "networking-console-plugin-cert" not found Apr 16 20:29:02.304933 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:29:02.304750 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls podName:c3360e56-bb80-45fb-aa31-ab10e294e01c nodeName:}" failed. No retries permitted until 2026-04-16 20:30:06.304725949 +0000 UTC m=+161.276310947 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls") pod "image-registry-5c9548d788-k476v" (UID: "c3360e56-bb80-45fb-aa31-ab10e294e01c") : secret "image-registry-tls" not found Apr 16 20:29:02.405026 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:29:02.405004 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert\") pod \"ingress-canary-fn9dd\" (UID: \"45d5c33b-d7b8-47d4-a44d-4e9b49120004\") " pod="openshift-ingress-canary/ingress-canary-fn9dd" Apr 16 20:29:02.405120 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:29:02.405042 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd" Apr 16 20:29:02.405160 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:29:02.405133 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:29:02.405160 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:29:02.405152 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:29:02.405224 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:29:02.405190 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls podName:8171246b-7193-44d0-9905-129be29085dd nodeName:}" failed. No retries permitted until 2026-04-16 20:30:06.405180866 +0000 UTC m=+161.376765835 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls") pod "dns-default-s64xd" (UID: "8171246b-7193-44d0-9905-129be29085dd") : secret "dns-default-metrics-tls" not found Apr 16 20:29:02.405224 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:29:02.405203 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert podName:45d5c33b-d7b8-47d4-a44d-4e9b49120004 nodeName:}" failed. No retries permitted until 2026-04-16 20:30:06.405197017 +0000 UTC m=+161.376781984 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert") pod "ingress-canary-fn9dd" (UID: "45d5c33b-d7b8-47d4-a44d-4e9b49120004") : secret "canary-serving-cert" not found Apr 16 20:29:05.931539 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:29:05.931504 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-p2hmj" Apr 16 20:29:34.432760 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:29:34.432722 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs\") pod \"network-metrics-daemon-tjgd4\" (UID: \"931ff401-6150-4e87-828a-2e3a9242e1bc\") " pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:29:34.433312 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:29:34.432879 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 20:29:34.433312 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:29:34.432975 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs podName:931ff401-6150-4e87-828a-2e3a9242e1bc nodeName:}" failed. 
No retries permitted until 2026-04-16 20:31:36.432932203 +0000 UTC m=+251.404517172 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs") pod "network-metrics-daemon-tjgd4" (UID: "931ff401-6150-4e87-828a-2e3a9242e1bc") : secret "metrics-daemon-secret" not found Apr 16 20:29:57.061768 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:29:57.061733 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ltdrj_10df2057-e291-4486-a0e1-89f26c4dbee9/dns-node-resolver/0.log" Apr 16 20:29:57.661788 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:29:57.661761 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l6t5h_ae461ccf-29e2-4d21-9a23-605ab7aac065/node-ca/0.log" Apr 16 20:30:01.473556 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:30:01.473514 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" podUID="35d37cfc-3882-4be0-ac36-d1be745ae717" Apr 16 20:30:01.479689 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:30:01.479665 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5c9548d788-k476v" podUID="c3360e56-bb80-45fb-aa31-ab10e294e01c" Apr 16 20:30:01.507787 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:30:01.507759 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-fn9dd" podUID="45d5c33b-d7b8-47d4-a44d-4e9b49120004" Apr 16 20:30:01.520072 
ip-10-0-137-53 kubenswrapper[2578]: E0416 20:30:01.520053 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-s64xd" podUID="8171246b-7193-44d0-9905-129be29085dd" Apr 16 20:30:01.612779 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:30:01.612749 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-tjgd4" podUID="931ff401-6150-4e87-828a-2e3a9242e1bc" Apr 16 20:30:02.142719 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:02.142694 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s64xd" Apr 16 20:30:02.142719 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:02.142709 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" Apr 16 20:30:02.142916 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:02.142705 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fn9dd" Apr 16 20:30:02.142916 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:02.142695 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:30:06.363164 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:06.363132 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:30:06.363650 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:06.363282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8s6gb\" (UID: \"35d37cfc-3882-4be0-ac36-d1be745ae717\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" Apr 16 20:30:06.363650 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:30:06.363412 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:30:06.363650 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:30:06.363489 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert podName:35d37cfc-3882-4be0-ac36-d1be745ae717 nodeName:}" failed. No retries permitted until 2026-04-16 20:32:08.363469709 +0000 UTC m=+283.335054678 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8s6gb" (UID: "35d37cfc-3882-4be0-ac36-d1be745ae717") : secret "networking-console-plugin-cert" not found
Apr 16 20:30:06.365529 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:06.365506 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls\") pod \"image-registry-5c9548d788-k476v\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:30:06.464445 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:06.464420 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert\") pod \"ingress-canary-fn9dd\" (UID: \"45d5c33b-d7b8-47d4-a44d-4e9b49120004\") " pod="openshift-ingress-canary/ingress-canary-fn9dd"
Apr 16 20:30:06.464563 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:06.464459 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd"
Apr 16 20:30:06.466681 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:06.466655 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45d5c33b-d7b8-47d4-a44d-4e9b49120004-cert\") pod \"ingress-canary-fn9dd\" (UID: \"45d5c33b-d7b8-47d4-a44d-4e9b49120004\") " pod="openshift-ingress-canary/ingress-canary-fn9dd"
Apr 16 20:30:06.466803 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:06.466678 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8171246b-7193-44d0-9905-129be29085dd-metrics-tls\") pod \"dns-default-s64xd\" (UID: \"8171246b-7193-44d0-9905-129be29085dd\") " pod="openshift-dns/dns-default-s64xd"
Apr 16 20:30:06.646936 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:06.646885 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w999j\""
Apr 16 20:30:06.647051 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:06.646889 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tj6tv\""
Apr 16 20:30:06.647051 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:06.646891 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2cgzs\""
Apr 16 20:30:06.653982 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:06.653966 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:30:06.654040 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:06.653984 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s64xd"
Apr 16 20:30:06.654040 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:06.654000 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fn9dd"
Apr 16 20:30:06.795310 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:06.795286 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5c9548d788-k476v"]
Apr 16 20:30:06.798053 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:30:06.798015 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3360e56_bb80_45fb_aa31_ab10e294e01c.slice/crio-eaf338d8f63360744b84aeb33ec8dcd317888dc847b8d77243637a8ed2b86951 WatchSource:0}: Error finding container eaf338d8f63360744b84aeb33ec8dcd317888dc847b8d77243637a8ed2b86951: Status 404 returned error can't find the container with id eaf338d8f63360744b84aeb33ec8dcd317888dc847b8d77243637a8ed2b86951
Apr 16 20:30:06.810820 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:06.810694 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s64xd"]
Apr 16 20:30:06.814885 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:30:06.814854 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8171246b_7193_44d0_9905_129be29085dd.slice/crio-3bd30cfd54f51ba4c70990585eed9cafedff05025d71612cb8d126d71b048072 WatchSource:0}: Error finding container 3bd30cfd54f51ba4c70990585eed9cafedff05025d71612cb8d126d71b048072: Status 404 returned error can't find the container with id 3bd30cfd54f51ba4c70990585eed9cafedff05025d71612cb8d126d71b048072
Apr 16 20:30:06.826549 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:06.826526 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fn9dd"]
Apr 16 20:30:06.839754 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:30:06.839734 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45d5c33b_d7b8_47d4_a44d_4e9b49120004.slice/crio-a03c105e9de4e862d1c41f2b08fce597094ed8b0f0567c37930b8adbd5ecf54e WatchSource:0}: Error finding container a03c105e9de4e862d1c41f2b08fce597094ed8b0f0567c37930b8adbd5ecf54e: Status 404 returned error can't find the container with id a03c105e9de4e862d1c41f2b08fce597094ed8b0f0567c37930b8adbd5ecf54e
Apr 16 20:30:07.156933 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:07.156896 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s64xd" event={"ID":"8171246b-7193-44d0-9905-129be29085dd","Type":"ContainerStarted","Data":"3bd30cfd54f51ba4c70990585eed9cafedff05025d71612cb8d126d71b048072"}
Apr 16 20:30:07.158106 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:07.158081 2578 generic.go:358] "Generic (PLEG): container finished" podID="af6f67a2-6abe-4f0c-a623-4864b0fb47b4" containerID="f078567745cefaaa68c41bcbceab0dcd01a4498759dbc8a2c77d0be14cc336d7" exitCode=255
Apr 16 20:30:07.158232 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:07.158141 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8579c7495c-x4ncb" event={"ID":"af6f67a2-6abe-4f0c-a623-4864b0fb47b4","Type":"ContainerDied","Data":"f078567745cefaaa68c41bcbceab0dcd01a4498759dbc8a2c77d0be14cc336d7"}
Apr 16 20:30:07.158480 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:07.158454 2578 scope.go:117] "RemoveContainer" containerID="f078567745cefaaa68c41bcbceab0dcd01a4498759dbc8a2c77d0be14cc336d7"
Apr 16 20:30:07.159536 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:07.159498 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c9548d788-k476v" event={"ID":"c3360e56-bb80-45fb-aa31-ab10e294e01c","Type":"ContainerStarted","Data":"0009d309a5c9450ead37b0430091d8a5e8224fbdb18d0b3a6ab1600da056d9f3"}
Apr 16 20:30:07.159536 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:07.159530 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c9548d788-k476v" event={"ID":"c3360e56-bb80-45fb-aa31-ab10e294e01c","Type":"ContainerStarted","Data":"eaf338d8f63360744b84aeb33ec8dcd317888dc847b8d77243637a8ed2b86951"}
Apr 16 20:30:07.159694 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:07.159651 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:30:07.160561 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:07.160537 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fn9dd" event={"ID":"45d5c33b-d7b8-47d4-a44d-4e9b49120004","Type":"ContainerStarted","Data":"a03c105e9de4e862d1c41f2b08fce597094ed8b0f0567c37930b8adbd5ecf54e"}
Apr 16 20:30:07.161885 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:07.161867 2578 generic.go:358] "Generic (PLEG): container finished" podID="05b656ef-6635-42f2-8409-fb9d474e0da8" containerID="7c642a09ef9ed663e163b605189ef030cb5f581ad2246ec96eae7fe5b3337d97" exitCode=1
Apr 16 20:30:07.162007 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:07.161907 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc" event={"ID":"05b656ef-6635-42f2-8409-fb9d474e0da8","Type":"ContainerDied","Data":"7c642a09ef9ed663e163b605189ef030cb5f581ad2246ec96eae7fe5b3337d97"}
Apr 16 20:30:07.162208 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:07.162193 2578 scope.go:117] "RemoveContainer" containerID="7c642a09ef9ed663e163b605189ef030cb5f581ad2246ec96eae7fe5b3337d97"
Apr 16 20:30:07.209623 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:07.209577 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5c9548d788-k476v" podStartSLOduration=161.209559747 podStartE2EDuration="2m41.209559747s" podCreationTimestamp="2026-04-16 20:27:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:30:07.208684323 +0000 UTC m=+162.180269314" watchObservedRunningTime="2026-04-16 20:30:07.209559747 +0000 UTC m=+162.181144737"
Apr 16 20:30:08.166770 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:08.166734 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8579c7495c-x4ncb" event={"ID":"af6f67a2-6abe-4f0c-a623-4864b0fb47b4","Type":"ContainerStarted","Data":"cc1aebe1fcd7130b6a09dc19b7b66a071d3cc8b9357cf071b0afc6c65dc00cf3"}
Apr 16 20:30:08.168520 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:08.168492 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc" event={"ID":"05b656ef-6635-42f2-8409-fb9d474e0da8","Type":"ContainerStarted","Data":"946b727d395899f1f813147b3c4fac8b147063312618f093f9d733984b5a574f"}
Apr 16 20:30:08.168882 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:08.168864 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc"
Apr 16 20:30:08.169541 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:08.169519 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc647b44-mp4kc"
Apr 16 20:30:09.175870 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:09.175833 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s64xd" event={"ID":"8171246b-7193-44d0-9905-129be29085dd","Type":"ContainerStarted","Data":"4795f31aaa2e97567ef5ac159bced657e57768f0e889388f9948f0d7d50ff167"}
Apr 16 20:30:09.178403 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:09.178374 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fn9dd" event={"ID":"45d5c33b-d7b8-47d4-a44d-4e9b49120004","Type":"ContainerStarted","Data":"8a21c2bf919dd5429a6b3d210e260e01022561f6dd20667c03fc02debff0d015"}
Apr 16 20:30:09.193960 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:09.193756 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fn9dd" podStartSLOduration=128.9925695 podStartE2EDuration="2m11.193739559s" podCreationTimestamp="2026-04-16 20:27:58 +0000 UTC" firstStartedPulling="2026-04-16 20:30:06.84134859 +0000 UTC m=+161.812933558" lastFinishedPulling="2026-04-16 20:30:09.042518637 +0000 UTC m=+164.014103617" observedRunningTime="2026-04-16 20:30:09.192037673 +0000 UTC m=+164.163622663" watchObservedRunningTime="2026-04-16 20:30:09.193739559 +0000 UTC m=+164.165324554"
Apr 16 20:30:10.181874 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:10.181832 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s64xd" event={"ID":"8171246b-7193-44d0-9905-129be29085dd","Type":"ContainerStarted","Data":"a813144491b98b3572d30105b818a2f0a57b0ae14fd9f1890fb72de30ce23b0c"}
Apr 16 20:30:10.197803 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:10.197754 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-s64xd" podStartSLOduration=130.002071104 podStartE2EDuration="2m12.197739455s" podCreationTimestamp="2026-04-16 20:27:58 +0000 UTC" firstStartedPulling="2026-04-16 20:30:06.816928636 +0000 UTC m=+161.788513603" lastFinishedPulling="2026-04-16 20:30:09.012596973 +0000 UTC m=+163.984181954" observedRunningTime="2026-04-16 20:30:10.196513292 +0000 UTC m=+165.168098301" watchObservedRunningTime="2026-04-16 20:30:10.197739455 +0000 UTC m=+165.169324445"
Apr 16 20:30:11.185216 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:11.185186 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-s64xd"
Apr 16 20:30:14.601174 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:14.601083 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4"
Apr 16 20:30:18.283347 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.283312 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-m9lzt"]
Apr 16 20:30:18.285566 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.285546 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-m9lzt"
Apr 16 20:30:18.288215 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.288190 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 20:30:18.289325 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.289304 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 20:30:18.289325 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.289316 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 20:30:18.289517 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.289323 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 20:30:18.289517 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.289386 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-78bvs\""
Apr 16 20:30:18.298219 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.298196 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-m9lzt"]
Apr 16 20:30:18.345224 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.345198 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a3949667-fdd7-46b8-822b-3b5fcf7c291e-crio-socket\") pod \"insights-runtime-extractor-m9lzt\" (UID: \"a3949667-fdd7-46b8-822b-3b5fcf7c291e\") " pod="openshift-insights/insights-runtime-extractor-m9lzt"
Apr 16 20:30:18.345341 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.345297 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjmbz\" (UniqueName: \"kubernetes.io/projected/a3949667-fdd7-46b8-822b-3b5fcf7c291e-kube-api-access-gjmbz\") pod \"insights-runtime-extractor-m9lzt\" (UID: \"a3949667-fdd7-46b8-822b-3b5fcf7c291e\") " pod="openshift-insights/insights-runtime-extractor-m9lzt"
Apr 16 20:30:18.345412 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.345385 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a3949667-fdd7-46b8-822b-3b5fcf7c291e-data-volume\") pod \"insights-runtime-extractor-m9lzt\" (UID: \"a3949667-fdd7-46b8-822b-3b5fcf7c291e\") " pod="openshift-insights/insights-runtime-extractor-m9lzt"
Apr 16 20:30:18.345468 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.345416 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a3949667-fdd7-46b8-822b-3b5fcf7c291e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m9lzt\" (UID: \"a3949667-fdd7-46b8-822b-3b5fcf7c291e\") " pod="openshift-insights/insights-runtime-extractor-m9lzt"
Apr 16 20:30:18.345468 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.345443 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a3949667-fdd7-46b8-822b-3b5fcf7c291e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m9lzt\" (UID: \"a3949667-fdd7-46b8-822b-3b5fcf7c291e\") " pod="openshift-insights/insights-runtime-extractor-m9lzt"
Apr 16 20:30:18.446025 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.445997 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a3949667-fdd7-46b8-822b-3b5fcf7c291e-crio-socket\") pod \"insights-runtime-extractor-m9lzt\" (UID: \"a3949667-fdd7-46b8-822b-3b5fcf7c291e\") " pod="openshift-insights/insights-runtime-extractor-m9lzt"
Apr 16 20:30:18.446148 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.446035 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjmbz\" (UniqueName: \"kubernetes.io/projected/a3949667-fdd7-46b8-822b-3b5fcf7c291e-kube-api-access-gjmbz\") pod \"insights-runtime-extractor-m9lzt\" (UID: \"a3949667-fdd7-46b8-822b-3b5fcf7c291e\") " pod="openshift-insights/insights-runtime-extractor-m9lzt"
Apr 16 20:30:18.446148 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.446083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a3949667-fdd7-46b8-822b-3b5fcf7c291e-data-volume\") pod \"insights-runtime-extractor-m9lzt\" (UID: \"a3949667-fdd7-46b8-822b-3b5fcf7c291e\") " pod="openshift-insights/insights-runtime-extractor-m9lzt"
Apr 16 20:30:18.446148 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.446104 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a3949667-fdd7-46b8-822b-3b5fcf7c291e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m9lzt\" (UID: \"a3949667-fdd7-46b8-822b-3b5fcf7c291e\") " pod="openshift-insights/insights-runtime-extractor-m9lzt"
Apr 16 20:30:18.446148 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.446120 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a3949667-fdd7-46b8-822b-3b5fcf7c291e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m9lzt\" (UID: \"a3949667-fdd7-46b8-822b-3b5fcf7c291e\") " pod="openshift-insights/insights-runtime-extractor-m9lzt"
Apr 16 20:30:18.446148 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.446125 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a3949667-fdd7-46b8-822b-3b5fcf7c291e-crio-socket\") pod \"insights-runtime-extractor-m9lzt\" (UID: \"a3949667-fdd7-46b8-822b-3b5fcf7c291e\") " pod="openshift-insights/insights-runtime-extractor-m9lzt"
Apr 16 20:30:18.446588 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.446565 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a3949667-fdd7-46b8-822b-3b5fcf7c291e-data-volume\") pod \"insights-runtime-extractor-m9lzt\" (UID: \"a3949667-fdd7-46b8-822b-3b5fcf7c291e\") " pod="openshift-insights/insights-runtime-extractor-m9lzt"
Apr 16 20:30:18.446688 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.446665 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a3949667-fdd7-46b8-822b-3b5fcf7c291e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m9lzt\" (UID: \"a3949667-fdd7-46b8-822b-3b5fcf7c291e\") " pod="openshift-insights/insights-runtime-extractor-m9lzt"
Apr 16 20:30:18.448756 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.448732 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a3949667-fdd7-46b8-822b-3b5fcf7c291e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m9lzt\" (UID: \"a3949667-fdd7-46b8-822b-3b5fcf7c291e\") " pod="openshift-insights/insights-runtime-extractor-m9lzt"
Apr 16 20:30:18.456905 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.456883 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjmbz\" (UniqueName: \"kubernetes.io/projected/a3949667-fdd7-46b8-822b-3b5fcf7c291e-kube-api-access-gjmbz\") pod \"insights-runtime-extractor-m9lzt\" (UID: \"a3949667-fdd7-46b8-822b-3b5fcf7c291e\") " pod="openshift-insights/insights-runtime-extractor-m9lzt"
Apr 16 20:30:18.595268 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.595248 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-m9lzt"
Apr 16 20:30:18.708619 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:18.708593 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-m9lzt"]
Apr 16 20:30:18.711489 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:30:18.711465 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3949667_fdd7_46b8_822b_3b5fcf7c291e.slice/crio-399f6856c0386c451a30ac8e2683a59ce9bd61c6722a3543537da5e6e03e62e4 WatchSource:0}: Error finding container 399f6856c0386c451a30ac8e2683a59ce9bd61c6722a3543537da5e6e03e62e4: Status 404 returned error can't find the container with id 399f6856c0386c451a30ac8e2683a59ce9bd61c6722a3543537da5e6e03e62e4
Apr 16 20:30:19.206818 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:19.206781 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m9lzt" event={"ID":"a3949667-fdd7-46b8-822b-3b5fcf7c291e","Type":"ContainerStarted","Data":"b3d5f66f52e15bba2ec716631be13602c251eeb95cd18ec9248256f2de25ae39"}
Apr 16 20:30:19.207005 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:19.206822 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m9lzt" event={"ID":"a3949667-fdd7-46b8-822b-3b5fcf7c291e","Type":"ContainerStarted","Data":"399f6856c0386c451a30ac8e2683a59ce9bd61c6722a3543537da5e6e03e62e4"}
Apr 16 20:30:20.211745 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:20.211713 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m9lzt" event={"ID":"a3949667-fdd7-46b8-822b-3b5fcf7c291e","Type":"ContainerStarted","Data":"0460e188123ccefd0683773d31633aae2c9e2dda909ed9a0a2bceeae53f99e9f"}
Apr 16 20:30:21.190076 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:21.190009 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-s64xd"
Apr 16 20:30:21.217490 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:21.217456 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m9lzt" event={"ID":"a3949667-fdd7-46b8-822b-3b5fcf7c291e","Type":"ContainerStarted","Data":"9545a657cd3765214d500b7a13f8b38bc910a285a8dfd2d495d28b8cd1ac3ff3"}
Apr 16 20:30:21.234662 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:21.234605 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-m9lzt" podStartSLOduration=1.136085139 podStartE2EDuration="3.234586712s" podCreationTimestamp="2026-04-16 20:30:18 +0000 UTC" firstStartedPulling="2026-04-16 20:30:18.766631905 +0000 UTC m=+173.738216873" lastFinishedPulling="2026-04-16 20:30:20.865133473 +0000 UTC m=+175.836718446" observedRunningTime="2026-04-16 20:30:21.23333846 +0000 UTC m=+176.204923449" watchObservedRunningTime="2026-04-16 20:30:21.234586712 +0000 UTC m=+176.206171708"
Apr 16 20:30:26.658504 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:26.658461 2578 patch_prober.go:28] interesting pod/image-registry-5c9548d788-k476v container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 20:30:26.659008 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:26.658517 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5c9548d788-k476v" podUID="c3360e56-bb80-45fb-aa31-ab10e294e01c" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:30:28.172814 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:28.172782 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5c9548d788-k476v"
Apr 16 20:30:33.101990 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.101934 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9h69d"]
Apr 16 20:30:33.104603 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.104581 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.108583 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.108562 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 20:30:33.108886 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.108862 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 20:30:33.108886 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.108875 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 20:30:33.109081 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.108881 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 20:30:33.109296 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.109281 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 20:30:33.109729 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.109713 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8h2bq\""
Apr 16 20:30:33.109827 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.109802 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 20:30:33.247732 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.247705 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-wtmp\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.247852 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.247738 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3252ab7-2575-4775-9f42-d15f54edfc88-metrics-client-ca\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.247852 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.247763 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-accelerators-collector-config\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.247852 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.247830 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-tls\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.247852 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.247851 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f3252ab7-2575-4775-9f42-d15f54edfc88-root\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.248096 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.247867 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-textfile\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.248096 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.247890 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.248096 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.247907 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg29s\" (UniqueName: \"kubernetes.io/projected/f3252ab7-2575-4775-9f42-d15f54edfc88-kube-api-access-vg29s\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.248096 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.248036 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3252ab7-2575-4775-9f42-d15f54edfc88-sys\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.349236 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.349202 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3252ab7-2575-4775-9f42-d15f54edfc88-sys\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.349351 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.349258 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-wtmp\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.349351 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.349281 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3252ab7-2575-4775-9f42-d15f54edfc88-metrics-client-ca\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.349351 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.349300 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-accelerators-collector-config\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.349351 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.349314 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3252ab7-2575-4775-9f42-d15f54edfc88-sys\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.349351 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.349340 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-tls\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.349584 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:30:33.349430 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 20:30:33.349584 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.349471 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-wtmp\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.349584 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.349488 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f3252ab7-2575-4775-9f42-d15f54edfc88-root\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.349584 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:30:33.349500 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-tls podName:f3252ab7-2575-4775-9f42-d15f54edfc88 nodeName:}" failed. No retries permitted until 2026-04-16 20:30:33.849479794 +0000 UTC m=+188.821064804 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-tls") pod "node-exporter-9h69d" (UID: "f3252ab7-2575-4775-9f42-d15f54edfc88") : secret "node-exporter-tls" not found
Apr 16 20:30:33.349584 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.349569 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f3252ab7-2575-4775-9f42-d15f54edfc88-root\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.349832 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.349588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-textfile\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.349832 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.349635 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.349832 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.349661 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vg29s\" (UniqueName: \"kubernetes.io/projected/f3252ab7-2575-4775-9f42-d15f54edfc88-kube-api-access-vg29s\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.350005 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.349921 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-textfile\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.350005 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.349964 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-accelerators-collector-config\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.350077 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.350030 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3252ab7-2575-4775-9f42-d15f54edfc88-metrics-client-ca\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.352240 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.352180 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d"
Apr 16 20:30:33.358261 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.358231 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg29s\" (UniqueName: \"kubernetes.io/projected/f3252ab7-2575-4775-9f42-d15f54edfc88-kube-api-access-vg29s\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") "
pod="openshift-monitoring/node-exporter-9h69d" Apr 16 20:30:33.853041 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.853006 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-tls\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d" Apr 16 20:30:33.855402 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:33.855373 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f3252ab7-2575-4775-9f42-d15f54edfc88-node-exporter-tls\") pod \"node-exporter-9h69d\" (UID: \"f3252ab7-2575-4775-9f42-d15f54edfc88\") " pod="openshift-monitoring/node-exporter-9h69d" Apr 16 20:30:34.015041 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:34.015009 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9h69d" Apr 16 20:30:34.024473 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:30:34.024437 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3252ab7_2575_4775_9f42_d15f54edfc88.slice/crio-f34e32aeffdc078a65a5d98c9debd0e248a0a9f160d2ee749cb662b0ca189049 WatchSource:0}: Error finding container f34e32aeffdc078a65a5d98c9debd0e248a0a9f160d2ee749cb662b0ca189049: Status 404 returned error can't find the container with id f34e32aeffdc078a65a5d98c9debd0e248a0a9f160d2ee749cb662b0ca189049 Apr 16 20:30:34.253266 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:34.253196 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9h69d" event={"ID":"f3252ab7-2575-4775-9f42-d15f54edfc88","Type":"ContainerStarted","Data":"f34e32aeffdc078a65a5d98c9debd0e248a0a9f160d2ee749cb662b0ca189049"} Apr 16 20:30:35.257507 ip-10-0-137-53 kubenswrapper[2578]: 
I0416 20:30:35.257438 2578 generic.go:358] "Generic (PLEG): container finished" podID="f3252ab7-2575-4775-9f42-d15f54edfc88" containerID="a920a757204ecf3e599d6091b5cb6741c30489b71c694445b396e2f24f9a57d1" exitCode=0 Apr 16 20:30:35.257507 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:35.257490 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9h69d" event={"ID":"f3252ab7-2575-4775-9f42-d15f54edfc88","Type":"ContainerDied","Data":"a920a757204ecf3e599d6091b5cb6741c30489b71c694445b396e2f24f9a57d1"} Apr 16 20:30:36.261529 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:36.261492 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9h69d" event={"ID":"f3252ab7-2575-4775-9f42-d15f54edfc88","Type":"ContainerStarted","Data":"ce9f4b5f68e3152be075de2353d7ad1d56804fcdf039c9001ce2ff6605828598"} Apr 16 20:30:36.261529 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:36.261527 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9h69d" event={"ID":"f3252ab7-2575-4775-9f42-d15f54edfc88","Type":"ContainerStarted","Data":"28e0ff17d621ee79fe8d7957be0f584fcb190947631a7d9f0eace4c1b328bc7c"} Apr 16 20:30:36.284004 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:36.283918 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9h69d" podStartSLOduration=2.476246081 podStartE2EDuration="3.283906851s" podCreationTimestamp="2026-04-16 20:30:33 +0000 UTC" firstStartedPulling="2026-04-16 20:30:34.026723997 +0000 UTC m=+188.998308979" lastFinishedPulling="2026-04-16 20:30:34.834384768 +0000 UTC m=+189.805969749" observedRunningTime="2026-04-16 20:30:36.282281066 +0000 UTC m=+191.253866057" watchObservedRunningTime="2026-04-16 20:30:36.283906851 +0000 UTC m=+191.255491841" Apr 16 20:30:39.953722 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:39.953687 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-5c9548d788-k476v"] Apr 16 20:30:58.768660 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:30:58.768608 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" podUID="84e6f960-cccb-497f-b46e-31abf5235826" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 20:31:04.976634 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:04.976541 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5c9548d788-k476v" podUID="c3360e56-bb80-45fb-aa31-ab10e294e01c" containerName="registry" containerID="cri-o://0009d309a5c9450ead37b0430091d8a5e8224fbdb18d0b3a6ab1600da056d9f3" gracePeriod=30 Apr 16 20:31:05.221412 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.221389 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:31:05.342958 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.342915 2578 generic.go:358] "Generic (PLEG): container finished" podID="c3360e56-bb80-45fb-aa31-ab10e294e01c" containerID="0009d309a5c9450ead37b0430091d8a5e8224fbdb18d0b3a6ab1600da056d9f3" exitCode=0 Apr 16 20:31:05.343081 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.342977 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c9548d788-k476v" event={"ID":"c3360e56-bb80-45fb-aa31-ab10e294e01c","Type":"ContainerDied","Data":"0009d309a5c9450ead37b0430091d8a5e8224fbdb18d0b3a6ab1600da056d9f3"} Apr 16 20:31:05.343081 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.342986 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c9548d788-k476v" Apr 16 20:31:05.343081 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.343004 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c9548d788-k476v" event={"ID":"c3360e56-bb80-45fb-aa31-ab10e294e01c","Type":"ContainerDied","Data":"eaf338d8f63360744b84aeb33ec8dcd317888dc847b8d77243637a8ed2b86951"} Apr 16 20:31:05.343081 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.343021 2578 scope.go:117] "RemoveContainer" containerID="0009d309a5c9450ead37b0430091d8a5e8224fbdb18d0b3a6ab1600da056d9f3" Apr 16 20:31:05.350139 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.350120 2578 scope.go:117] "RemoveContainer" containerID="0009d309a5c9450ead37b0430091d8a5e8224fbdb18d0b3a6ab1600da056d9f3" Apr 16 20:31:05.350397 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:31:05.350379 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0009d309a5c9450ead37b0430091d8a5e8224fbdb18d0b3a6ab1600da056d9f3\": container with ID starting with 0009d309a5c9450ead37b0430091d8a5e8224fbdb18d0b3a6ab1600da056d9f3 not found: ID does not exist" containerID="0009d309a5c9450ead37b0430091d8a5e8224fbdb18d0b3a6ab1600da056d9f3" Apr 16 20:31:05.350455 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.350406 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0009d309a5c9450ead37b0430091d8a5e8224fbdb18d0b3a6ab1600da056d9f3"} err="failed to get container status \"0009d309a5c9450ead37b0430091d8a5e8224fbdb18d0b3a6ab1600da056d9f3\": rpc error: code = NotFound desc = could not find container \"0009d309a5c9450ead37b0430091d8a5e8224fbdb18d0b3a6ab1600da056d9f3\": container with ID starting with 0009d309a5c9450ead37b0430091d8a5e8224fbdb18d0b3a6ab1600da056d9f3 not found: ID does not exist" Apr 16 20:31:05.378151 ip-10-0-137-53 kubenswrapper[2578]: I0416 
20:31:05.378132 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3360e56-bb80-45fb-aa31-ab10e294e01c-trusted-ca\") pod \"c3360e56-bb80-45fb-aa31-ab10e294e01c\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " Apr 16 20:31:05.378233 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.378161 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c3360e56-bb80-45fb-aa31-ab10e294e01c-ca-trust-extracted\") pod \"c3360e56-bb80-45fb-aa31-ab10e294e01c\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " Apr 16 20:31:05.378233 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.378201 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c3360e56-bb80-45fb-aa31-ab10e294e01c-installation-pull-secrets\") pod \"c3360e56-bb80-45fb-aa31-ab10e294e01c\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " Apr 16 20:31:05.378233 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.378228 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnkqc\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-kube-api-access-pnkqc\") pod \"c3360e56-bb80-45fb-aa31-ab10e294e01c\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " Apr 16 20:31:05.378407 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.378250 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls\") pod \"c3360e56-bb80-45fb-aa31-ab10e294e01c\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " Apr 16 20:31:05.378407 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.378272 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-certificates\") pod \"c3360e56-bb80-45fb-aa31-ab10e294e01c\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " Apr 16 20:31:05.378407 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.378294 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c3360e56-bb80-45fb-aa31-ab10e294e01c-image-registry-private-configuration\") pod \"c3360e56-bb80-45fb-aa31-ab10e294e01c\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " Apr 16 20:31:05.378407 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.378324 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-bound-sa-token\") pod \"c3360e56-bb80-45fb-aa31-ab10e294e01c\" (UID: \"c3360e56-bb80-45fb-aa31-ab10e294e01c\") " Apr 16 20:31:05.378748 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.378582 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3360e56-bb80-45fb-aa31-ab10e294e01c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c3360e56-bb80-45fb-aa31-ab10e294e01c" (UID: "c3360e56-bb80-45fb-aa31-ab10e294e01c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:31:05.378845 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.378822 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c3360e56-bb80-45fb-aa31-ab10e294e01c" (UID: "c3360e56-bb80-45fb-aa31-ab10e294e01c"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:31:05.380811 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.380779 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c3360e56-bb80-45fb-aa31-ab10e294e01c" (UID: "c3360e56-bb80-45fb-aa31-ab10e294e01c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:31:05.381038 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.381018 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c3360e56-bb80-45fb-aa31-ab10e294e01c" (UID: "c3360e56-bb80-45fb-aa31-ab10e294e01c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:31:05.381113 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.381041 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3360e56-bb80-45fb-aa31-ab10e294e01c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c3360e56-bb80-45fb-aa31-ab10e294e01c" (UID: "c3360e56-bb80-45fb-aa31-ab10e294e01c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:05.381206 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.381186 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3360e56-bb80-45fb-aa31-ab10e294e01c-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "c3360e56-bb80-45fb-aa31-ab10e294e01c" (UID: "c3360e56-bb80-45fb-aa31-ab10e294e01c"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:05.381258 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.381187 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-kube-api-access-pnkqc" (OuterVolumeSpecName: "kube-api-access-pnkqc") pod "c3360e56-bb80-45fb-aa31-ab10e294e01c" (UID: "c3360e56-bb80-45fb-aa31-ab10e294e01c"). InnerVolumeSpecName "kube-api-access-pnkqc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:31:05.387242 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.387220 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3360e56-bb80-45fb-aa31-ab10e294e01c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c3360e56-bb80-45fb-aa31-ab10e294e01c" (UID: "c3360e56-bb80-45fb-aa31-ab10e294e01c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:31:05.479457 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.479428 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-bound-sa-token\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\"" Apr 16 20:31:05.479457 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.479457 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3360e56-bb80-45fb-aa31-ab10e294e01c-trusted-ca\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\"" Apr 16 20:31:05.479457 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.479466 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c3360e56-bb80-45fb-aa31-ab10e294e01c-ca-trust-extracted\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\"" Apr 16 20:31:05.479608 ip-10-0-137-53 kubenswrapper[2578]: 
I0416 20:31:05.479476 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c3360e56-bb80-45fb-aa31-ab10e294e01c-installation-pull-secrets\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\"" Apr 16 20:31:05.479608 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.479486 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pnkqc\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-kube-api-access-pnkqc\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\"" Apr 16 20:31:05.479608 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.479494 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-tls\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\"" Apr 16 20:31:05.479608 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.479502 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c3360e56-bb80-45fb-aa31-ab10e294e01c-registry-certificates\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\"" Apr 16 20:31:05.479608 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.479510 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c3360e56-bb80-45fb-aa31-ab10e294e01c-image-registry-private-configuration\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\"" Apr 16 20:31:05.539521 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.539500 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fn9dd_45d5c33b-d7b8-47d4-a44d-4e9b49120004/serve-healthcheck-canary/0.log" Apr 16 20:31:05.656814 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.656787 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-5c9548d788-k476v"] Apr 16 20:31:05.663355 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:05.663332 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5c9548d788-k476v"] Apr 16 20:31:07.604892 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:07.604854 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3360e56-bb80-45fb-aa31-ab10e294e01c" path="/var/lib/kubelet/pods/c3360e56-bb80-45fb-aa31-ab10e294e01c/volumes" Apr 16 20:31:08.768213 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:08.768177 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" podUID="84e6f960-cccb-497f-b46e-31abf5235826" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 20:31:18.768549 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:18.768500 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" podUID="84e6f960-cccb-497f-b46e-31abf5235826" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 20:31:18.769009 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:18.768582 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" Apr 16 20:31:18.769111 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:18.769089 2578 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"8e0ce359b14a40e23d0536e69caa44f193be80397f5e0bd0fef6cdf610603bba"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 20:31:18.769152 
ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:18.769134 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" podUID="84e6f960-cccb-497f-b46e-31abf5235826" containerName="service-proxy" containerID="cri-o://8e0ce359b14a40e23d0536e69caa44f193be80397f5e0bd0fef6cdf610603bba" gracePeriod=30 Apr 16 20:31:19.383271 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:19.383232 2578 generic.go:358] "Generic (PLEG): container finished" podID="84e6f960-cccb-497f-b46e-31abf5235826" containerID="8e0ce359b14a40e23d0536e69caa44f193be80397f5e0bd0fef6cdf610603bba" exitCode=2 Apr 16 20:31:19.383446 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:19.383309 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" event={"ID":"84e6f960-cccb-497f-b46e-31abf5235826","Type":"ContainerDied","Data":"8e0ce359b14a40e23d0536e69caa44f193be80397f5e0bd0fef6cdf610603bba"} Apr 16 20:31:19.383446 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:19.383349 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c976f769d-jk2zs" event={"ID":"84e6f960-cccb-497f-b46e-31abf5235826","Type":"ContainerStarted","Data":"9e3ba6162f8832e60c11225d1ab89bf3c8920dc5ea0802194d88d7d66f26a9d8"} Apr 16 20:31:36.486821 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:36.486776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs\") pod \"network-metrics-daemon-tjgd4\" (UID: \"931ff401-6150-4e87-828a-2e3a9242e1bc\") " pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:31:36.489181 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:36.489161 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/931ff401-6150-4e87-828a-2e3a9242e1bc-metrics-certs\") pod \"network-metrics-daemon-tjgd4\" (UID: \"931ff401-6150-4e87-828a-2e3a9242e1bc\") " pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:31:36.504598 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:36.504578 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v8bd2\"" Apr 16 20:31:36.512447 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:36.512433 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tjgd4" Apr 16 20:31:36.625800 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:36.625775 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tjgd4"] Apr 16 20:31:36.628711 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:31:36.628683 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod931ff401_6150_4e87_828a_2e3a9242e1bc.slice/crio-ca658882907fe2c35f05e832722be67e0f36951dc05ed2456d025b3e2d5c7d60 WatchSource:0}: Error finding container ca658882907fe2c35f05e832722be67e0f36951dc05ed2456d025b3e2d5c7d60: Status 404 returned error can't find the container with id ca658882907fe2c35f05e832722be67e0f36951dc05ed2456d025b3e2d5c7d60 Apr 16 20:31:37.431441 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:37.431405 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tjgd4" event={"ID":"931ff401-6150-4e87-828a-2e3a9242e1bc","Type":"ContainerStarted","Data":"ca658882907fe2c35f05e832722be67e0f36951dc05ed2456d025b3e2d5c7d60"} Apr 16 20:31:38.436954 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:38.436915 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tjgd4" 
event={"ID":"931ff401-6150-4e87-828a-2e3a9242e1bc","Type":"ContainerStarted","Data":"bc3f2ea7afd0898b94f6624650dc2743e0b8268b96635052ec4ab0ca4479ed9a"} Apr 16 20:31:38.437400 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:38.436980 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tjgd4" event={"ID":"931ff401-6150-4e87-828a-2e3a9242e1bc","Type":"ContainerStarted","Data":"c3b3d4c7d8d01c3b109fa45cb720abca321966f9b7f845e8c5ccf55c4a43d092"} Apr 16 20:31:38.452458 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:31:38.452404 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tjgd4" podStartSLOduration=252.138881503 podStartE2EDuration="4m13.452388971s" podCreationTimestamp="2026-04-16 20:27:25 +0000 UTC" firstStartedPulling="2026-04-16 20:31:36.631166746 +0000 UTC m=+251.602751726" lastFinishedPulling="2026-04-16 20:31:37.944674212 +0000 UTC m=+252.916259194" observedRunningTime="2026-04-16 20:31:38.451537631 +0000 UTC m=+253.423122622" watchObservedRunningTime="2026-04-16 20:31:38.452388971 +0000 UTC m=+253.423973961" Apr 16 20:32:05.143494 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:32:05.143436 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" podUID="35d37cfc-3882-4be0-ac36-d1be745ae717" Apr 16 20:32:05.510521 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:32:05.510446 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" Apr 16 20:32:08.395688 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:32:08.395638 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8s6gb\" (UID: \"35d37cfc-3882-4be0-ac36-d1be745ae717\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" Apr 16 20:32:08.398323 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:32:08.398299 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/35d37cfc-3882-4be0-ac36-d1be745ae717-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8s6gb\" (UID: \"35d37cfc-3882-4be0-ac36-d1be745ae717\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" Apr 16 20:32:08.513657 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:32:08.513629 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-5wwv5\"" Apr 16 20:32:08.521335 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:32:08.521311 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" Apr 16 20:32:08.640909 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:32:08.640858 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb"] Apr 16 20:32:08.642955 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:32:08.642914 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35d37cfc_3882_4be0_ac36_d1be745ae717.slice/crio-819048ce4ba69722468c12af254e7bb1ee8dc2f154fc46226b419c38f4232ea8 WatchSource:0}: Error finding container 819048ce4ba69722468c12af254e7bb1ee8dc2f154fc46226b419c38f4232ea8: Status 404 returned error can't find the container with id 819048ce4ba69722468c12af254e7bb1ee8dc2f154fc46226b419c38f4232ea8 Apr 16 20:32:09.523280 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:32:09.523238 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" event={"ID":"35d37cfc-3882-4be0-ac36-d1be745ae717","Type":"ContainerStarted","Data":"819048ce4ba69722468c12af254e7bb1ee8dc2f154fc46226b419c38f4232ea8"} Apr 16 20:32:10.527381 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:32:10.527344 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" event={"ID":"35d37cfc-3882-4be0-ac36-d1be745ae717","Type":"ContainerStarted","Data":"972d333335dfb867a504c1e5e026183d78612cf4187bf4d52f30f7acb68acffd"} Apr 16 20:32:10.546579 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:32:10.546531 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8s6gb" podStartSLOduration=278.598741466 podStartE2EDuration="4m39.546515673s" podCreationTimestamp="2026-04-16 20:27:31 +0000 UTC" firstStartedPulling="2026-04-16 20:32:08.644770119 
+0000 UTC m=+283.616355088" lastFinishedPulling="2026-04-16 20:32:09.592544313 +0000 UTC m=+284.564129295" observedRunningTime="2026-04-16 20:32:10.545064116 +0000 UTC m=+285.516649105" watchObservedRunningTime="2026-04-16 20:32:10.546515673 +0000 UTC m=+285.518100663" Apr 16 20:32:25.489388 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:32:25.489360 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 20:33:45.131297 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.131267 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-5dsqt"] Apr 16 20:33:45.131735 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.131467 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3360e56-bb80-45fb-aa31-ab10e294e01c" containerName="registry" Apr 16 20:33:45.131735 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.131476 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3360e56-bb80-45fb-aa31-ab10e294e01c" containerName="registry" Apr 16 20:33:45.131735 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.131525 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3360e56-bb80-45fb-aa31-ab10e294e01c" containerName="registry" Apr 16 20:33:45.133216 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.133201 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-5dsqt" Apr 16 20:33:45.137334 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.137304 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 20:33:45.137474 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.137402 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-dqvhs\"" Apr 16 20:33:45.137474 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.137422 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 20:33:45.148602 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.148582 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-5dsqt"] Apr 16 20:33:45.225639 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.225613 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81e3d696-60b7-4692-9e05-6892c53f9e24-bound-sa-token\") pod \"cert-manager-759f64656b-5dsqt\" (UID: \"81e3d696-60b7-4692-9e05-6892c53f9e24\") " pod="cert-manager/cert-manager-759f64656b-5dsqt" Apr 16 20:33:45.225729 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.225645 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bshld\" (UniqueName: \"kubernetes.io/projected/81e3d696-60b7-4692-9e05-6892c53f9e24-kube-api-access-bshld\") pod \"cert-manager-759f64656b-5dsqt\" (UID: \"81e3d696-60b7-4692-9e05-6892c53f9e24\") " pod="cert-manager/cert-manager-759f64656b-5dsqt" Apr 16 20:33:45.325876 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.325855 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/81e3d696-60b7-4692-9e05-6892c53f9e24-bound-sa-token\") pod \"cert-manager-759f64656b-5dsqt\" (UID: \"81e3d696-60b7-4692-9e05-6892c53f9e24\") " pod="cert-manager/cert-manager-759f64656b-5dsqt" Apr 16 20:33:45.325985 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.325882 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bshld\" (UniqueName: \"kubernetes.io/projected/81e3d696-60b7-4692-9e05-6892c53f9e24-kube-api-access-bshld\") pod \"cert-manager-759f64656b-5dsqt\" (UID: \"81e3d696-60b7-4692-9e05-6892c53f9e24\") " pod="cert-manager/cert-manager-759f64656b-5dsqt" Apr 16 20:33:45.333776 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.333751 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bshld\" (UniqueName: \"kubernetes.io/projected/81e3d696-60b7-4692-9e05-6892c53f9e24-kube-api-access-bshld\") pod \"cert-manager-759f64656b-5dsqt\" (UID: \"81e3d696-60b7-4692-9e05-6892c53f9e24\") " pod="cert-manager/cert-manager-759f64656b-5dsqt" Apr 16 20:33:45.333858 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.333777 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81e3d696-60b7-4692-9e05-6892c53f9e24-bound-sa-token\") pod \"cert-manager-759f64656b-5dsqt\" (UID: \"81e3d696-60b7-4692-9e05-6892c53f9e24\") " pod="cert-manager/cert-manager-759f64656b-5dsqt" Apr 16 20:33:45.441151 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.441090 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-5dsqt" Apr 16 20:33:45.553029 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.552998 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-5dsqt"] Apr 16 20:33:45.558812 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:33:45.558788 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81e3d696_60b7_4692_9e05_6892c53f9e24.slice/crio-95c9b991408ab9eb8ecb56a7585615481d829339b36ade89332d3283103cb034 WatchSource:0}: Error finding container 95c9b991408ab9eb8ecb56a7585615481d829339b36ade89332d3283103cb034: Status 404 returned error can't find the container with id 95c9b991408ab9eb8ecb56a7585615481d829339b36ade89332d3283103cb034 Apr 16 20:33:45.560504 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.560488 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:33:45.770059 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:45.769999 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-5dsqt" event={"ID":"81e3d696-60b7-4692-9e05-6892c53f9e24","Type":"ContainerStarted","Data":"95c9b991408ab9eb8ecb56a7585615481d829339b36ade89332d3283103cb034"} Apr 16 20:33:48.779713 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:48.779681 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-5dsqt" event={"ID":"81e3d696-60b7-4692-9e05-6892c53f9e24","Type":"ContainerStarted","Data":"fa01735aa1fedc5fca0ece6c261b091fca894cc914297899227998d8be0b74dc"} Apr 16 20:33:48.797436 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:33:48.797390 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-5dsqt" podStartSLOduration=0.763952527 podStartE2EDuration="3.797376849s" podCreationTimestamp="2026-04-16 20:33:45 +0000 UTC" 
firstStartedPulling="2026-04-16 20:33:45.560640031 +0000 UTC m=+380.532225000" lastFinishedPulling="2026-04-16 20:33:48.594064342 +0000 UTC m=+383.565649322" observedRunningTime="2026-04-16 20:33:48.797134555 +0000 UTC m=+383.768719545" watchObservedRunningTime="2026-04-16 20:33:48.797376849 +0000 UTC m=+383.768961838" Apr 16 20:34:11.659953 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.659902 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94"] Apr 16 20:34:11.662129 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.662109 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94" Apr 16 20:34:11.665058 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.665032 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 20:34:11.665270 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.665246 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 20:34:11.665373 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.665322 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-k9btq\"" Apr 16 20:34:11.665599 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.665581 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 20:34:11.665910 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.665891 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 20:34:11.686607 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.686582 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94"] Apr 16 20:34:11.802135 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.802105 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm5l9\" (UniqueName: \"kubernetes.io/projected/83f05402-452a-4fe4-b250-fb4c7575b53c-kube-api-access-jm5l9\") pod \"opendatahub-operator-controller-manager-6cc777b675-fkh94\" (UID: \"83f05402-452a-4fe4-b250-fb4c7575b53c\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94" Apr 16 20:34:11.802231 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.802146 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/83f05402-452a-4fe4-b250-fb4c7575b53c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-fkh94\" (UID: \"83f05402-452a-4fe4-b250-fb4c7575b53c\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94" Apr 16 20:34:11.802231 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.802205 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/83f05402-452a-4fe4-b250-fb4c7575b53c-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-fkh94\" (UID: \"83f05402-452a-4fe4-b250-fb4c7575b53c\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94" Apr 16 20:34:11.903054 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.903034 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/83f05402-452a-4fe4-b250-fb4c7575b53c-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-fkh94\" (UID: \"83f05402-452a-4fe4-b250-fb4c7575b53c\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94" Apr 
16 20:34:11.903174 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.903072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jm5l9\" (UniqueName: \"kubernetes.io/projected/83f05402-452a-4fe4-b250-fb4c7575b53c-kube-api-access-jm5l9\") pod \"opendatahub-operator-controller-manager-6cc777b675-fkh94\" (UID: \"83f05402-452a-4fe4-b250-fb4c7575b53c\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94" Apr 16 20:34:11.903174 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.903101 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/83f05402-452a-4fe4-b250-fb4c7575b53c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-fkh94\" (UID: \"83f05402-452a-4fe4-b250-fb4c7575b53c\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94" Apr 16 20:34:11.905325 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.905306 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/83f05402-452a-4fe4-b250-fb4c7575b53c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-fkh94\" (UID: \"83f05402-452a-4fe4-b250-fb4c7575b53c\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94" Apr 16 20:34:11.905421 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.905378 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/83f05402-452a-4fe4-b250-fb4c7575b53c-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-fkh94\" (UID: \"83f05402-452a-4fe4-b250-fb4c7575b53c\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94" Apr 16 20:34:11.912914 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.912864 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jm5l9\" (UniqueName: \"kubernetes.io/projected/83f05402-452a-4fe4-b250-fb4c7575b53c-kube-api-access-jm5l9\") pod \"opendatahub-operator-controller-manager-6cc777b675-fkh94\" (UID: \"83f05402-452a-4fe4-b250-fb4c7575b53c\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94" Apr 16 20:34:11.972790 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:11.972769 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94" Apr 16 20:34:12.100232 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:12.100206 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94"] Apr 16 20:34:12.102850 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:34:12.102825 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83f05402_452a_4fe4_b250_fb4c7575b53c.slice/crio-a279455307d8cbd93dffa0dad49e5b58b008afa19a45828cf9f2191efdf6e0c4 WatchSource:0}: Error finding container a279455307d8cbd93dffa0dad49e5b58b008afa19a45828cf9f2191efdf6e0c4: Status 404 returned error can't find the container with id a279455307d8cbd93dffa0dad49e5b58b008afa19a45828cf9f2191efdf6e0c4 Apr 16 20:34:12.840117 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:12.840071 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94" event={"ID":"83f05402-452a-4fe4-b250-fb4c7575b53c","Type":"ContainerStarted","Data":"a279455307d8cbd93dffa0dad49e5b58b008afa19a45828cf9f2191efdf6e0c4"} Apr 16 20:34:14.847586 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:14.847553 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94" 
event={"ID":"83f05402-452a-4fe4-b250-fb4c7575b53c","Type":"ContainerStarted","Data":"f50b818191f3784402ebe6651c87b2b9badc796fdd906e8bb5f6172f4afd072d"} Apr 16 20:34:14.848009 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:14.847778 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94" Apr 16 20:34:14.873672 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:14.873618 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94" podStartSLOduration=1.4444170889999999 podStartE2EDuration="3.873601725s" podCreationTimestamp="2026-04-16 20:34:11 +0000 UTC" firstStartedPulling="2026-04-16 20:34:12.104510602 +0000 UTC m=+407.076095569" lastFinishedPulling="2026-04-16 20:34:14.533695237 +0000 UTC m=+409.505280205" observedRunningTime="2026-04-16 20:34:14.871284755 +0000 UTC m=+409.842869748" watchObservedRunningTime="2026-04-16 20:34:14.873601725 +0000 UTC m=+409.845186716" Apr 16 20:34:25.851644 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:25.851608 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-fkh94" Apr 16 20:34:31.507736 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.507702 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-86666bf97-6kwjn"] Apr 16 20:34:31.512216 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.512197 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-86666bf97-6kwjn" Apr 16 20:34:31.516077 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.516043 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 20:34:31.516077 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.516058 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-7kbg2\"" Apr 16 20:34:31.516267 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.516063 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 20:34:31.516267 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.516057 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 20:34:31.516267 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.516101 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 20:34:31.520362 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.520344 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-86666bf97-6kwjn"] Apr 16 20:34:31.529226 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.529208 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fe0099e-dd6d-420b-aba3-0e68ed045212-tls-certs\") pod \"kube-auth-proxy-86666bf97-6kwjn\" (UID: \"9fe0099e-dd6d-420b-aba3-0e68ed045212\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-6kwjn" Apr 16 20:34:31.529319 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.529235 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvjxh\" (UniqueName: 
\"kubernetes.io/projected/9fe0099e-dd6d-420b-aba3-0e68ed045212-kube-api-access-fvjxh\") pod \"kube-auth-proxy-86666bf97-6kwjn\" (UID: \"9fe0099e-dd6d-420b-aba3-0e68ed045212\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-6kwjn" Apr 16 20:34:31.529319 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.529255 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fe0099e-dd6d-420b-aba3-0e68ed045212-tmp\") pod \"kube-auth-proxy-86666bf97-6kwjn\" (UID: \"9fe0099e-dd6d-420b-aba3-0e68ed045212\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-6kwjn" Apr 16 20:34:31.630407 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.630382 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fe0099e-dd6d-420b-aba3-0e68ed045212-tls-certs\") pod \"kube-auth-proxy-86666bf97-6kwjn\" (UID: \"9fe0099e-dd6d-420b-aba3-0e68ed045212\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-6kwjn" Apr 16 20:34:31.630407 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.630414 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvjxh\" (UniqueName: \"kubernetes.io/projected/9fe0099e-dd6d-420b-aba3-0e68ed045212-kube-api-access-fvjxh\") pod \"kube-auth-proxy-86666bf97-6kwjn\" (UID: \"9fe0099e-dd6d-420b-aba3-0e68ed045212\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-6kwjn" Apr 16 20:34:31.630624 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.630437 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fe0099e-dd6d-420b-aba3-0e68ed045212-tmp\") pod \"kube-auth-proxy-86666bf97-6kwjn\" (UID: \"9fe0099e-dd6d-420b-aba3-0e68ed045212\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-6kwjn" Apr 16 20:34:31.632686 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.632653 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fe0099e-dd6d-420b-aba3-0e68ed045212-tmp\") pod \"kube-auth-proxy-86666bf97-6kwjn\" (UID: \"9fe0099e-dd6d-420b-aba3-0e68ed045212\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-6kwjn" Apr 16 20:34:31.632915 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.632894 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fe0099e-dd6d-420b-aba3-0e68ed045212-tls-certs\") pod \"kube-auth-proxy-86666bf97-6kwjn\" (UID: \"9fe0099e-dd6d-420b-aba3-0e68ed045212\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-6kwjn" Apr 16 20:34:31.642147 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.642124 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvjxh\" (UniqueName: \"kubernetes.io/projected/9fe0099e-dd6d-420b-aba3-0e68ed045212-kube-api-access-fvjxh\") pod \"kube-auth-proxy-86666bf97-6kwjn\" (UID: \"9fe0099e-dd6d-420b-aba3-0e68ed045212\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-6kwjn" Apr 16 20:34:31.821818 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.821762 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-86666bf97-6kwjn" Apr 16 20:34:31.939214 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:31.939181 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-86666bf97-6kwjn"] Apr 16 20:34:31.942512 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:34:31.942486 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fe0099e_dd6d_420b_aba3_0e68ed045212.slice/crio-eab794bf071df791262dd603bc1b7de21de998d4fd7a959604cddb2ff162481a WatchSource:0}: Error finding container eab794bf071df791262dd603bc1b7de21de998d4fd7a959604cddb2ff162481a: Status 404 returned error can't find the container with id eab794bf071df791262dd603bc1b7de21de998d4fd7a959604cddb2ff162481a Apr 16 20:34:32.898589 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:32.898547 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-86666bf97-6kwjn" event={"ID":"9fe0099e-dd6d-420b-aba3-0e68ed045212","Type":"ContainerStarted","Data":"eab794bf071df791262dd603bc1b7de21de998d4fd7a959604cddb2ff162481a"} Apr 16 20:34:34.090525 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:34.090493 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-czrzg"] Apr 16 20:34:34.092463 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:34.092445 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" Apr 16 20:34:34.094818 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:34.094795 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 16 20:34:34.094958 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:34.094847 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-d262w\"" Apr 16 20:34:34.100358 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:34.100338 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-czrzg"] Apr 16 20:34:34.151332 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:34.151291 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdgh5\" (UniqueName: \"kubernetes.io/projected/7ec644c2-6575-433d-bff7-4602a99b9453-kube-api-access-rdgh5\") pod \"odh-model-controller-858dbf95b8-czrzg\" (UID: \"7ec644c2-6575-433d-bff7-4602a99b9453\") " pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" Apr 16 20:34:34.151477 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:34.151385 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ec644c2-6575-433d-bff7-4602a99b9453-cert\") pod \"odh-model-controller-858dbf95b8-czrzg\" (UID: \"7ec644c2-6575-433d-bff7-4602a99b9453\") " pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" Apr 16 20:34:34.251990 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:34.251930 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdgh5\" (UniqueName: \"kubernetes.io/projected/7ec644c2-6575-433d-bff7-4602a99b9453-kube-api-access-rdgh5\") pod \"odh-model-controller-858dbf95b8-czrzg\" (UID: \"7ec644c2-6575-433d-bff7-4602a99b9453\") " 
pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" Apr 16 20:34:34.252173 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:34.252029 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ec644c2-6575-433d-bff7-4602a99b9453-cert\") pod \"odh-model-controller-858dbf95b8-czrzg\" (UID: \"7ec644c2-6575-433d-bff7-4602a99b9453\") " pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" Apr 16 20:34:34.252173 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:34:34.252163 2578 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 20:34:34.252287 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:34:34.252254 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ec644c2-6575-433d-bff7-4602a99b9453-cert podName:7ec644c2-6575-433d-bff7-4602a99b9453 nodeName:}" failed. No retries permitted until 2026-04-16 20:34:34.752234121 +0000 UTC m=+429.723819092 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ec644c2-6575-433d-bff7-4602a99b9453-cert") pod "odh-model-controller-858dbf95b8-czrzg" (UID: "7ec644c2-6575-433d-bff7-4602a99b9453") : secret "odh-model-controller-webhook-cert" not found Apr 16 20:34:34.260802 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:34.260778 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdgh5\" (UniqueName: \"kubernetes.io/projected/7ec644c2-6575-433d-bff7-4602a99b9453-kube-api-access-rdgh5\") pod \"odh-model-controller-858dbf95b8-czrzg\" (UID: \"7ec644c2-6575-433d-bff7-4602a99b9453\") " pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" Apr 16 20:34:34.756854 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:34.756818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ec644c2-6575-433d-bff7-4602a99b9453-cert\") pod \"odh-model-controller-858dbf95b8-czrzg\" (UID: \"7ec644c2-6575-433d-bff7-4602a99b9453\") " pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" Apr 16 20:34:34.759913 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:34.759862 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ec644c2-6575-433d-bff7-4602a99b9453-cert\") pod \"odh-model-controller-858dbf95b8-czrzg\" (UID: \"7ec644c2-6575-433d-bff7-4602a99b9453\") " pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" Apr 16 20:34:34.904956 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:34.904919 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-86666bf97-6kwjn" event={"ID":"9fe0099e-dd6d-420b-aba3-0e68ed045212","Type":"ContainerStarted","Data":"862196245b6dc1cfb92975755e1135c4b4e9f2961a7081f9e2a634f76251366f"} Apr 16 20:34:34.921496 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:34.921456 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-ingress/kube-auth-proxy-86666bf97-6kwjn" podStartSLOduration=1.033684525 podStartE2EDuration="3.921444543s" podCreationTimestamp="2026-04-16 20:34:31 +0000 UTC" firstStartedPulling="2026-04-16 20:34:31.944121508 +0000 UTC m=+426.915706477" lastFinishedPulling="2026-04-16 20:34:34.831881517 +0000 UTC m=+429.803466495" observedRunningTime="2026-04-16 20:34:34.92024461 +0000 UTC m=+429.891829602" watchObservedRunningTime="2026-04-16 20:34:34.921444543 +0000 UTC m=+429.893029532" Apr 16 20:34:35.002419 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:35.002392 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" Apr 16 20:34:35.125363 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:35.125333 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-czrzg"] Apr 16 20:34:35.128245 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:34:35.128217 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ec644c2_6575_433d_bff7_4602a99b9453.slice/crio-daa3734d3454b95bc57b5d4d0b0af9f0ffc002d1dccb79a35769bdf6870b3915 WatchSource:0}: Error finding container daa3734d3454b95bc57b5d4d0b0af9f0ffc002d1dccb79a35769bdf6870b3915: Status 404 returned error can't find the container with id daa3734d3454b95bc57b5d4d0b0af9f0ffc002d1dccb79a35769bdf6870b3915 Apr 16 20:34:35.909502 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:35.909467 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" event={"ID":"7ec644c2-6575-433d-bff7-4602a99b9453","Type":"ContainerStarted","Data":"daa3734d3454b95bc57b5d4d0b0af9f0ffc002d1dccb79a35769bdf6870b3915"} Apr 16 20:34:37.916350 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:37.916286 2578 generic.go:358] "Generic (PLEG): container finished" podID="7ec644c2-6575-433d-bff7-4602a99b9453" 
containerID="6aa571b40eb0675fd16c7ae2c82fabbe43b5affaca872891b3a8190ca4c65ee9" exitCode=1 Apr 16 20:34:37.916633 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:37.916361 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" event={"ID":"7ec644c2-6575-433d-bff7-4602a99b9453","Type":"ContainerDied","Data":"6aa571b40eb0675fd16c7ae2c82fabbe43b5affaca872891b3a8190ca4c65ee9"} Apr 16 20:34:37.916633 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:37.916519 2578 scope.go:117] "RemoveContainer" containerID="6aa571b40eb0675fd16c7ae2c82fabbe43b5affaca872891b3a8190ca4c65ee9" Apr 16 20:34:38.920593 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:38.920558 2578 generic.go:358] "Generic (PLEG): container finished" podID="7ec644c2-6575-433d-bff7-4602a99b9453" containerID="daae7592de4a9696fe0c8faecb5fcdb3d846f86e4fae6b01f0f415e18e96bdd2" exitCode=1 Apr 16 20:34:38.921016 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:38.920647 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" event={"ID":"7ec644c2-6575-433d-bff7-4602a99b9453","Type":"ContainerDied","Data":"daae7592de4a9696fe0c8faecb5fcdb3d846f86e4fae6b01f0f415e18e96bdd2"} Apr 16 20:34:38.921016 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:38.920684 2578 scope.go:117] "RemoveContainer" containerID="6aa571b40eb0675fd16c7ae2c82fabbe43b5affaca872891b3a8190ca4c65ee9" Apr 16 20:34:38.921016 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:38.920831 2578 scope.go:117] "RemoveContainer" containerID="daae7592de4a9696fe0c8faecb5fcdb3d846f86e4fae6b01f0f415e18e96bdd2" Apr 16 20:34:38.921123 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:34:38.921055 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-czrzg_opendatahub(7ec644c2-6575-433d-bff7-4602a99b9453)\"" 
pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" podUID="7ec644c2-6575-433d-bff7-4602a99b9453" Apr 16 20:34:39.770589 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:39.770565 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-cqtn2"] Apr 16 20:34:39.772565 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:39.772550 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-cqtn2" Apr 16 20:34:39.776162 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:39.776134 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-gqfl2\"" Apr 16 20:34:39.776816 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:39.776794 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 16 20:34:39.782597 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:39.782575 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-cqtn2"] Apr 16 20:34:39.793428 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:39.793399 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c16e2366-e1de-43df-9748-293440238351-cert\") pod \"kserve-controller-manager-856948b99f-cqtn2\" (UID: \"c16e2366-e1de-43df-9748-293440238351\") " pod="opendatahub/kserve-controller-manager-856948b99f-cqtn2" Apr 16 20:34:39.793524 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:39.793467 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdm4w\" (UniqueName: \"kubernetes.io/projected/c16e2366-e1de-43df-9748-293440238351-kube-api-access-cdm4w\") pod \"kserve-controller-manager-856948b99f-cqtn2\" (UID: \"c16e2366-e1de-43df-9748-293440238351\") " 
pod="opendatahub/kserve-controller-manager-856948b99f-cqtn2" Apr 16 20:34:39.894571 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:39.894546 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c16e2366-e1de-43df-9748-293440238351-cert\") pod \"kserve-controller-manager-856948b99f-cqtn2\" (UID: \"c16e2366-e1de-43df-9748-293440238351\") " pod="opendatahub/kserve-controller-manager-856948b99f-cqtn2" Apr 16 20:34:39.894664 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:39.894599 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdm4w\" (UniqueName: \"kubernetes.io/projected/c16e2366-e1de-43df-9748-293440238351-kube-api-access-cdm4w\") pod \"kserve-controller-manager-856948b99f-cqtn2\" (UID: \"c16e2366-e1de-43df-9748-293440238351\") " pod="opendatahub/kserve-controller-manager-856948b99f-cqtn2" Apr 16 20:34:39.894714 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:34:39.894677 2578 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 20:34:39.894757 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:34:39.894732 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c16e2366-e1de-43df-9748-293440238351-cert podName:c16e2366-e1de-43df-9748-293440238351 nodeName:}" failed. No retries permitted until 2026-04-16 20:34:40.39471669 +0000 UTC m=+435.366301658 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c16e2366-e1de-43df-9748-293440238351-cert") pod "kserve-controller-manager-856948b99f-cqtn2" (UID: "c16e2366-e1de-43df-9748-293440238351") : secret "kserve-webhook-server-cert" not found Apr 16 20:34:39.905655 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:39.905637 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdm4w\" (UniqueName: \"kubernetes.io/projected/c16e2366-e1de-43df-9748-293440238351-kube-api-access-cdm4w\") pod \"kserve-controller-manager-856948b99f-cqtn2\" (UID: \"c16e2366-e1de-43df-9748-293440238351\") " pod="opendatahub/kserve-controller-manager-856948b99f-cqtn2" Apr 16 20:34:39.925090 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:39.925071 2578 scope.go:117] "RemoveContainer" containerID="daae7592de4a9696fe0c8faecb5fcdb3d846f86e4fae6b01f0f415e18e96bdd2" Apr 16 20:34:39.925363 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:34:39.925262 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-czrzg_opendatahub(7ec644c2-6575-433d-bff7-4602a99b9453)\"" pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" podUID="7ec644c2-6575-433d-bff7-4602a99b9453" Apr 16 20:34:40.398848 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:40.398822 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c16e2366-e1de-43df-9748-293440238351-cert\") pod \"kserve-controller-manager-856948b99f-cqtn2\" (UID: \"c16e2366-e1de-43df-9748-293440238351\") " pod="opendatahub/kserve-controller-manager-856948b99f-cqtn2" Apr 16 20:34:40.401055 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:40.401036 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/c16e2366-e1de-43df-9748-293440238351-cert\") pod \"kserve-controller-manager-856948b99f-cqtn2\" (UID: \"c16e2366-e1de-43df-9748-293440238351\") " pod="opendatahub/kserve-controller-manager-856948b99f-cqtn2" Apr 16 20:34:40.684448 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:40.684385 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-cqtn2" Apr 16 20:34:40.798257 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:40.798212 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-cqtn2"] Apr 16 20:34:40.800716 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:34:40.800691 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc16e2366_e1de_43df_9748_293440238351.slice/crio-10459eda1e99bf1e612a61a45b05d1fe56a534b39c7ddd765bfaebad7f7d2187 WatchSource:0}: Error finding container 10459eda1e99bf1e612a61a45b05d1fe56a534b39c7ddd765bfaebad7f7d2187: Status 404 returned error can't find the container with id 10459eda1e99bf1e612a61a45b05d1fe56a534b39c7ddd765bfaebad7f7d2187 Apr 16 20:34:40.928492 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:40.928466 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-cqtn2" event={"ID":"c16e2366-e1de-43df-9748-293440238351","Type":"ContainerStarted","Data":"10459eda1e99bf1e612a61a45b05d1fe56a534b39c7ddd765bfaebad7f7d2187"} Apr 16 20:34:43.938584 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:43.938548 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-cqtn2" event={"ID":"c16e2366-e1de-43df-9748-293440238351","Type":"ContainerStarted","Data":"487de23230894069082a86f74f4d00824f0cb55434af8288668899562083d379"} Apr 16 20:34:43.938968 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:43.938743 2578 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-cqtn2" Apr 16 20:34:44.009161 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:44.009116 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-cqtn2" podStartSLOduration=2.673056036 podStartE2EDuration="5.009103355s" podCreationTimestamp="2026-04-16 20:34:39 +0000 UTC" firstStartedPulling="2026-04-16 20:34:40.802283571 +0000 UTC m=+435.773868552" lastFinishedPulling="2026-04-16 20:34:43.138330898 +0000 UTC m=+438.109915871" observedRunningTime="2026-04-16 20:34:44.009067488 +0000 UTC m=+438.980652647" watchObservedRunningTime="2026-04-16 20:34:44.009103355 +0000 UTC m=+438.980688344" Apr 16 20:34:45.002600 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:45.002569 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" Apr 16 20:34:45.002988 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:45.002914 2578 scope.go:117] "RemoveContainer" containerID="daae7592de4a9696fe0c8faecb5fcdb3d846f86e4fae6b01f0f415e18e96bdd2" Apr 16 20:34:45.003092 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:34:45.003074 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-czrzg_opendatahub(7ec644c2-6575-433d-bff7-4602a99b9453)\"" pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" podUID="7ec644c2-6575-433d-bff7-4602a99b9453" Apr 16 20:34:45.784413 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:45.784382 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-hw9fp"] Apr 16 20:34:45.787299 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:45.787283 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hw9fp" Apr 16 20:34:45.790214 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:45.790194 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 20:34:45.790317 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:45.790198 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 20:34:45.790317 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:45.790244 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-psghl\"" Apr 16 20:34:45.807554 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:45.807533 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-hw9fp"] Apr 16 20:34:45.834554 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:45.834533 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/2fa379b2-87d5-44fc-bb3b-ebfb8b83da02-operator-config\") pod \"servicemesh-operator3-55f49c5f94-hw9fp\" (UID: \"2fa379b2-87d5-44fc-bb3b-ebfb8b83da02\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hw9fp" Apr 16 20:34:45.834653 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:45.834568 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm8bw\" (UniqueName: \"kubernetes.io/projected/2fa379b2-87d5-44fc-bb3b-ebfb8b83da02-kube-api-access-qm8bw\") pod \"servicemesh-operator3-55f49c5f94-hw9fp\" (UID: \"2fa379b2-87d5-44fc-bb3b-ebfb8b83da02\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hw9fp" Apr 16 20:34:45.935811 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:45.935788 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/2fa379b2-87d5-44fc-bb3b-ebfb8b83da02-operator-config\") pod \"servicemesh-operator3-55f49c5f94-hw9fp\" (UID: \"2fa379b2-87d5-44fc-bb3b-ebfb8b83da02\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hw9fp" Apr 16 20:34:45.935911 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:45.935835 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qm8bw\" (UniqueName: \"kubernetes.io/projected/2fa379b2-87d5-44fc-bb3b-ebfb8b83da02-kube-api-access-qm8bw\") pod \"servicemesh-operator3-55f49c5f94-hw9fp\" (UID: \"2fa379b2-87d5-44fc-bb3b-ebfb8b83da02\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hw9fp" Apr 16 20:34:45.938350 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:45.938325 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/2fa379b2-87d5-44fc-bb3b-ebfb8b83da02-operator-config\") pod \"servicemesh-operator3-55f49c5f94-hw9fp\" (UID: \"2fa379b2-87d5-44fc-bb3b-ebfb8b83da02\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hw9fp" Apr 16 20:34:45.945305 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:45.945284 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm8bw\" (UniqueName: \"kubernetes.io/projected/2fa379b2-87d5-44fc-bb3b-ebfb8b83da02-kube-api-access-qm8bw\") pod \"servicemesh-operator3-55f49c5f94-hw9fp\" (UID: \"2fa379b2-87d5-44fc-bb3b-ebfb8b83da02\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hw9fp" Apr 16 20:34:46.095289 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:46.095264 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hw9fp" Apr 16 20:34:46.221859 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:46.221835 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-hw9fp"] Apr 16 20:34:46.224550 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:34:46.224518 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fa379b2_87d5_44fc_bb3b_ebfb8b83da02.slice/crio-7b6636c6fa1db4988bccb3520ec4f2757c4424bfac8c91c7a8cfc0241d15ced6 WatchSource:0}: Error finding container 7b6636c6fa1db4988bccb3520ec4f2757c4424bfac8c91c7a8cfc0241d15ced6: Status 404 returned error can't find the container with id 7b6636c6fa1db4988bccb3520ec4f2757c4424bfac8c91c7a8cfc0241d15ced6 Apr 16 20:34:46.949087 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:46.949050 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hw9fp" event={"ID":"2fa379b2-87d5-44fc-bb3b-ebfb8b83da02","Type":"ContainerStarted","Data":"7b6636c6fa1db4988bccb3520ec4f2757c4424bfac8c91c7a8cfc0241d15ced6"} Apr 16 20:34:50.965077 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:50.965040 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hw9fp" event={"ID":"2fa379b2-87d5-44fc-bb3b-ebfb8b83da02","Type":"ContainerStarted","Data":"60c9abdb25fc3a52f7d7b6a708266d9c86766fb77a0d58d648d7ca7be437d953"} Apr 16 20:34:50.965504 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:50.965165 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hw9fp" Apr 16 20:34:50.986481 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:50.986437 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hw9fp" 
podStartSLOduration=1.692267038 podStartE2EDuration="5.986423695s" podCreationTimestamp="2026-04-16 20:34:45 +0000 UTC" firstStartedPulling="2026-04-16 20:34:46.227147474 +0000 UTC m=+441.198732441" lastFinishedPulling="2026-04-16 20:34:50.521304127 +0000 UTC m=+445.492889098" observedRunningTime="2026-04-16 20:34:50.984494644 +0000 UTC m=+445.956079633" watchObservedRunningTime="2026-04-16 20:34:50.986423695 +0000 UTC m=+445.958008684" Apr 16 20:34:55.003004 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:55.002966 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" Apr 16 20:34:55.003411 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:55.003335 2578 scope.go:117] "RemoveContainer" containerID="daae7592de4a9696fe0c8faecb5fcdb3d846f86e4fae6b01f0f415e18e96bdd2" Apr 16 20:34:55.981996 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:55.981964 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" event={"ID":"7ec644c2-6575-433d-bff7-4602a99b9453","Type":"ContainerStarted","Data":"28e7d534bc0e4d3f8a81f7d98c355d81df739f0c98ef8c3285fd2d1be3038a47"} Apr 16 20:34:55.982184 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:55.982165 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" Apr 16 20:34:55.997733 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:55.997692 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" podStartSLOduration=1.860972443 podStartE2EDuration="21.997681474s" podCreationTimestamp="2026-04-16 20:34:34 +0000 UTC" firstStartedPulling="2026-04-16 20:34:35.129580025 +0000 UTC m=+430.101164992" lastFinishedPulling="2026-04-16 20:34:55.266289055 +0000 UTC m=+450.237874023" observedRunningTime="2026-04-16 20:34:55.997081279 +0000 UTC m=+450.968666276" 
watchObservedRunningTime="2026-04-16 20:34:55.997681474 +0000 UTC m=+450.969266488" Apr 16 20:34:56.744309 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.744281 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n"] Apr 16 20:34:56.746883 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.746864 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.749154 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.749136 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 16 20:34:56.749436 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.749416 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 20:34:56.749521 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.749502 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 16 20:34:56.749789 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.749696 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-vvvnf\"" Apr 16 20:34:56.750382 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.750364 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 16 20:34:56.759430 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.759405 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n"] Apr 16 20:34:56.811059 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.811034 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: 
\"kubernetes.io/secret/03faef76-6068-411c-8010-ca4f5dcfafe0-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.811159 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.811062 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/03faef76-6068-411c-8010-ca4f5dcfafe0-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.811159 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.811082 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/03faef76-6068-411c-8010-ca4f5dcfafe0-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.811159 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.811100 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/03faef76-6068-411c-8010-ca4f5dcfafe0-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.811255 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.811163 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww54v\" (UniqueName: \"kubernetes.io/projected/03faef76-6068-411c-8010-ca4f5dcfafe0-kube-api-access-ww54v\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: 
\"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.811255 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.811191 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/03faef76-6068-411c-8010-ca4f5dcfafe0-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.811255 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.811223 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/03faef76-6068-411c-8010-ca4f5dcfafe0-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.912520 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.912494 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/03faef76-6068-411c-8010-ca4f5dcfafe0-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.912632 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.912534 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/03faef76-6068-411c-8010-ca4f5dcfafe0-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.912632 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.912562 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/03faef76-6068-411c-8010-ca4f5dcfafe0-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.912632 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.912587 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/03faef76-6068-411c-8010-ca4f5dcfafe0-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.912632 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.912626 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ww54v\" (UniqueName: \"kubernetes.io/projected/03faef76-6068-411c-8010-ca4f5dcfafe0-kube-api-access-ww54v\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.912837 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.912665 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/03faef76-6068-411c-8010-ca4f5dcfafe0-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.912837 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.912705 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/03faef76-6068-411c-8010-ca4f5dcfafe0-istio-kubeconfig\") pod 
\"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.913371 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.913343 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/03faef76-6068-411c-8010-ca4f5dcfafe0-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.914983 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.914962 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/03faef76-6068-411c-8010-ca4f5dcfafe0-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.915183 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.915162 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/03faef76-6068-411c-8010-ca4f5dcfafe0-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.915297 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.915277 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/03faef76-6068-411c-8010-ca4f5dcfafe0-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.915426 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.915412 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/03faef76-6068-411c-8010-ca4f5dcfafe0-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.920072 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.920039 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/03faef76-6068-411c-8010-ca4f5dcfafe0-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:56.920165 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:56.920082 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww54v\" (UniqueName: \"kubernetes.io/projected/03faef76-6068-411c-8010-ca4f5dcfafe0-kube-api-access-ww54v\") pod \"istiod-openshift-gateway-55ff986f96-72z8n\" (UID: \"03faef76-6068-411c-8010-ca4f5dcfafe0\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:57.057351 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:57.057259 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:34:57.183993 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:57.183966 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n"] Apr 16 20:34:57.185610 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:34:57.185584 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03faef76_6068_411c_8010_ca4f5dcfafe0.slice/crio-8c181fc47b632f75e0f631a30c2e5f9c5028d2263cd479a7b3302f412696de5d WatchSource:0}: Error finding container 8c181fc47b632f75e0f631a30c2e5f9c5028d2263cd479a7b3302f412696de5d: Status 404 returned error can't find the container with id 8c181fc47b632f75e0f631a30c2e5f9c5028d2263cd479a7b3302f412696de5d Apr 16 20:34:57.995095 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:34:57.995052 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" event={"ID":"03faef76-6068-411c-8010-ca4f5dcfafe0","Type":"ContainerStarted","Data":"8c181fc47b632f75e0f631a30c2e5f9c5028d2263cd479a7b3302f412696de5d"} Apr 16 20:35:00.169814 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:35:00.169776 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 20:35:00.170134 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:35:00.169848 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 20:35:01.010885 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:35:01.010855 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" 
event={"ID":"03faef76-6068-411c-8010-ca4f5dcfafe0","Type":"ContainerStarted","Data":"dbf0f7b713095e519ed2df46e940d6c7689efe19310c19dec491fe2bfe50760c"} Apr 16 20:35:01.011092 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:35:01.011074 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:35:01.012703 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:35:01.012672 2578 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-72z8n container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 16 20:35:01.012821 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:35:01.012729 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" podUID="03faef76-6068-411c-8010-ca4f5dcfafe0" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:35:01.032608 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:35:01.032559 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" podStartSLOduration=2.050555974 podStartE2EDuration="5.032547916s" podCreationTimestamp="2026-04-16 20:34:56 +0000 UTC" firstStartedPulling="2026-04-16 20:34:57.18757036 +0000 UTC m=+452.159155327" lastFinishedPulling="2026-04-16 20:35:00.169562301 +0000 UTC m=+455.141147269" observedRunningTime="2026-04-16 20:35:01.030953997 +0000 UTC m=+456.002538978" watchObservedRunningTime="2026-04-16 20:35:01.032547916 +0000 UTC m=+456.004132900" Apr 16 20:35:01.970875 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:35:01.970843 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hw9fp" Apr 16 20:35:02.015916 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:35:02.015879 2578 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-72z8n" Apr 16 20:35:06.990425 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:35:06.990398 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-czrzg" Apr 16 20:35:14.946589 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:35:14.946562 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-cqtn2" Apr 16 20:36:10.904104 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:10.904066 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2"] Apr 16 20:36:10.906480 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:10.906460 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" Apr 16 20:36:10.909113 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:10.909084 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-797v8\"" Apr 16 20:36:10.909113 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:10.909094 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 20:36:10.909257 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:10.909094 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 20:36:10.916638 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:10.916617 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2"] Apr 16 20:36:11.012728 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:11.012700 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e82adce6-96ad-41f7-b850-5d0c495f0d70-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-m8pj2\" (UID: \"e82adce6-96ad-41f7-b850-5d0c495f0d70\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" Apr 16 20:36:11.012853 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:11.012735 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vspvr\" (UniqueName: \"kubernetes.io/projected/e82adce6-96ad-41f7-b850-5d0c495f0d70-kube-api-access-vspvr\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-m8pj2\" (UID: \"e82adce6-96ad-41f7-b850-5d0c495f0d70\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" Apr 16 20:36:11.113088 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:11.113067 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e82adce6-96ad-41f7-b850-5d0c495f0d70-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-m8pj2\" (UID: \"e82adce6-96ad-41f7-b850-5d0c495f0d70\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" Apr 16 20:36:11.113214 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:11.113098 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vspvr\" (UniqueName: \"kubernetes.io/projected/e82adce6-96ad-41f7-b850-5d0c495f0d70-kube-api-access-vspvr\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-m8pj2\" (UID: \"e82adce6-96ad-41f7-b850-5d0c495f0d70\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" Apr 16 20:36:11.113444 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:11.113424 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e82adce6-96ad-41f7-b850-5d0c495f0d70-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-m8pj2\" (UID: \"e82adce6-96ad-41f7-b850-5d0c495f0d70\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" Apr 16 20:36:11.121194 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:11.121177 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vspvr\" (UniqueName: \"kubernetes.io/projected/e82adce6-96ad-41f7-b850-5d0c495f0d70-kube-api-access-vspvr\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-m8pj2\" (UID: \"e82adce6-96ad-41f7-b850-5d0c495f0d70\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" Apr 16 20:36:11.216823 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:11.216774 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" Apr 16 20:36:11.378607 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:11.378554 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2"] Apr 16 20:36:11.381922 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:36:11.381899 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode82adce6_96ad_41f7_b850_5d0c495f0d70.slice/crio-f45037ba3e00915c7f495e91c4676cf069b7807ae5d9554fc1e94bfa7772aaf9 WatchSource:0}: Error finding container f45037ba3e00915c7f495e91c4676cf069b7807ae5d9554fc1e94bfa7772aaf9: Status 404 returned error can't find the container with id f45037ba3e00915c7f495e91c4676cf069b7807ae5d9554fc1e94bfa7772aaf9 Apr 16 20:36:12.247492 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:12.247460 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" event={"ID":"e82adce6-96ad-41f7-b850-5d0c495f0d70","Type":"ContainerStarted","Data":"f45037ba3e00915c7f495e91c4676cf069b7807ae5d9554fc1e94bfa7772aaf9"} Apr 16 20:36:17.266277 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:17.266197 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" event={"ID":"e82adce6-96ad-41f7-b850-5d0c495f0d70","Type":"ContainerStarted","Data":"59322a4c73615ef8945d0f2eb78759c69ac8fa0596de792583871c2e00eaa24d"} Apr 16 20:36:17.266277 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:17.266254 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" Apr 16 20:36:17.284882 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:17.284839 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" podStartSLOduration=1.687063749 podStartE2EDuration="7.284827573s" podCreationTimestamp="2026-04-16 20:36:10 +0000 UTC" firstStartedPulling="2026-04-16 20:36:11.384170303 +0000 UTC m=+526.355755271" lastFinishedPulling="2026-04-16 20:36:16.981934109 +0000 UTC m=+531.953519095" observedRunningTime="2026-04-16 20:36:17.283713277 +0000 UTC m=+532.255298281" watchObservedRunningTime="2026-04-16 20:36:17.284827573 +0000 UTC m=+532.256412563" Apr 16 20:36:28.271496 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:28.271468 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" Apr 16 20:36:29.262142 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.262107 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2"] Apr 16 20:36:29.262375 ip-10-0-137-53 
kubenswrapper[2578]: I0416 20:36:29.262351 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" podUID="e82adce6-96ad-41f7-b850-5d0c495f0d70" containerName="manager" containerID="cri-o://59322a4c73615ef8945d0f2eb78759c69ac8fa0596de792583871c2e00eaa24d" gracePeriod=2 Apr 16 20:36:29.271722 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.271685 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2"] Apr 16 20:36:29.316498 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.316470 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn"] Apr 16 20:36:29.316736 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.316725 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e82adce6-96ad-41f7-b850-5d0c495f0d70" containerName="manager" Apr 16 20:36:29.316782 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.316737 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82adce6-96ad-41f7-b850-5d0c495f0d70" containerName="manager" Apr 16 20:36:29.316815 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.316782 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e82adce6-96ad-41f7-b850-5d0c495f0d70" containerName="manager" Apr 16 20:36:29.318537 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.318523 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn" Apr 16 20:36:29.340842 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.340794 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e0e5188-734f-4283-b72f-2b5f27c1a610-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-96dxn\" (UID: \"9e0e5188-734f-4283-b72f-2b5f27c1a610\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn" Apr 16 20:36:29.341028 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.340998 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfwfw\" (UniqueName: \"kubernetes.io/projected/9e0e5188-734f-4283-b72f-2b5f27c1a610-kube-api-access-mfwfw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-96dxn\" (UID: \"9e0e5188-734f-4283-b72f-2b5f27c1a610\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn" Apr 16 20:36:29.345918 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.345889 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn"] Apr 16 20:36:29.394743 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.394710 2578 status_manager.go:895] "Failed to get status for pod" podUID="e82adce6-96ad-41f7-b850-5d0c495f0d70" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-m8pj2\" is forbidden: User \"system:node:ip-10-0-137-53.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-53.ec2.internal' and this object" Apr 16 20:36:29.441525 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.441497 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mfwfw\" (UniqueName: \"kubernetes.io/projected/9e0e5188-734f-4283-b72f-2b5f27c1a610-kube-api-access-mfwfw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-96dxn\" (UID: \"9e0e5188-734f-4283-b72f-2b5f27c1a610\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn" Apr 16 20:36:29.441654 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.441558 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e0e5188-734f-4283-b72f-2b5f27c1a610-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-96dxn\" (UID: \"9e0e5188-734f-4283-b72f-2b5f27c1a610\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn" Apr 16 20:36:29.441868 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.441851 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e0e5188-734f-4283-b72f-2b5f27c1a610-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-96dxn\" (UID: \"9e0e5188-734f-4283-b72f-2b5f27c1a610\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn" Apr 16 20:36:29.471407 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.471380 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfwfw\" (UniqueName: \"kubernetes.io/projected/9e0e5188-734f-4283-b72f-2b5f27c1a610-kube-api-access-mfwfw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-96dxn\" (UID: \"9e0e5188-734f-4283-b72f-2b5f27c1a610\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn" Apr 16 20:36:29.488477 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.488456 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" Apr 16 20:36:29.495907 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.495883 2578 status_manager.go:895] "Failed to get status for pod" podUID="e82adce6-96ad-41f7-b850-5d0c495f0d70" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-m8pj2\" is forbidden: User \"system:node:ip-10-0-137-53.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-53.ec2.internal' and this object" Apr 16 20:36:29.542233 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.542188 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vspvr\" (UniqueName: \"kubernetes.io/projected/e82adce6-96ad-41f7-b850-5d0c495f0d70-kube-api-access-vspvr\") pod \"e82adce6-96ad-41f7-b850-5d0c495f0d70\" (UID: \"e82adce6-96ad-41f7-b850-5d0c495f0d70\") " Apr 16 20:36:29.542310 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.542236 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e82adce6-96ad-41f7-b850-5d0c495f0d70-extensions-socket-volume\") pod \"e82adce6-96ad-41f7-b850-5d0c495f0d70\" (UID: \"e82adce6-96ad-41f7-b850-5d0c495f0d70\") " Apr 16 20:36:29.542615 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.542593 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e82adce6-96ad-41f7-b850-5d0c495f0d70-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "e82adce6-96ad-41f7-b850-5d0c495f0d70" (UID: "e82adce6-96ad-41f7-b850-5d0c495f0d70"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:36:29.544241 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.544222 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e82adce6-96ad-41f7-b850-5d0c495f0d70-kube-api-access-vspvr" (OuterVolumeSpecName: "kube-api-access-vspvr") pod "e82adce6-96ad-41f7-b850-5d0c495f0d70" (UID: "e82adce6-96ad-41f7-b850-5d0c495f0d70"). InnerVolumeSpecName "kube-api-access-vspvr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:36:29.604995 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.604972 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e82adce6-96ad-41f7-b850-5d0c495f0d70" path="/var/lib/kubelet/pods/e82adce6-96ad-41f7-b850-5d0c495f0d70/volumes" Apr 16 20:36:29.642769 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.642747 2578 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e82adce6-96ad-41f7-b850-5d0c495f0d70-extensions-socket-volume\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\"" Apr 16 20:36:29.642769 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.642766 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vspvr\" (UniqueName: \"kubernetes.io/projected/e82adce6-96ad-41f7-b850-5d0c495f0d70-kube-api-access-vspvr\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\"" Apr 16 20:36:29.651771 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.651749 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn" Apr 16 20:36:29.783065 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:29.782974 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn"] Apr 16 20:36:29.785278 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:36:29.785253 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e0e5188_734f_4283_b72f_2b5f27c1a610.slice/crio-2d17f14c68fabd41bb94458f20e6a57ba4ced807d85469ff15b06573efd15209 WatchSource:0}: Error finding container 2d17f14c68fabd41bb94458f20e6a57ba4ced807d85469ff15b06573efd15209: Status 404 returned error can't find the container with id 2d17f14c68fabd41bb94458f20e6a57ba4ced807d85469ff15b06573efd15209 Apr 16 20:36:30.306332 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.306298 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn" event={"ID":"9e0e5188-734f-4283-b72f-2b5f27c1a610","Type":"ContainerStarted","Data":"7baa0f3a56095160c8f5ba9b93eea69303fee75f222da3c561b9a8ed1e116e48"} Apr 16 20:36:30.306772 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.306340 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn" event={"ID":"9e0e5188-734f-4283-b72f-2b5f27c1a610","Type":"ContainerStarted","Data":"2d17f14c68fabd41bb94458f20e6a57ba4ced807d85469ff15b06573efd15209"} Apr 16 20:36:30.306772 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.306399 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn" Apr 16 20:36:30.307585 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.307563 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="e82adce6-96ad-41f7-b850-5d0c495f0d70" containerID="59322a4c73615ef8945d0f2eb78759c69ac8fa0596de792583871c2e00eaa24d" exitCode=0 Apr 16 20:36:30.307707 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.307618 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-m8pj2" Apr 16 20:36:30.307707 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.307605 2578 scope.go:117] "RemoveContainer" containerID="59322a4c73615ef8945d0f2eb78759c69ac8fa0596de792583871c2e00eaa24d" Apr 16 20:36:30.315585 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.315568 2578 scope.go:117] "RemoveContainer" containerID="59322a4c73615ef8945d0f2eb78759c69ac8fa0596de792583871c2e00eaa24d" Apr 16 20:36:30.315822 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:36:30.315806 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59322a4c73615ef8945d0f2eb78759c69ac8fa0596de792583871c2e00eaa24d\": container with ID starting with 59322a4c73615ef8945d0f2eb78759c69ac8fa0596de792583871c2e00eaa24d not found: ID does not exist" containerID="59322a4c73615ef8945d0f2eb78759c69ac8fa0596de792583871c2e00eaa24d" Apr 16 20:36:30.315868 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.315828 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59322a4c73615ef8945d0f2eb78759c69ac8fa0596de792583871c2e00eaa24d"} err="failed to get container status \"59322a4c73615ef8945d0f2eb78759c69ac8fa0596de792583871c2e00eaa24d\": rpc error: code = NotFound desc = could not find container \"59322a4c73615ef8945d0f2eb78759c69ac8fa0596de792583871c2e00eaa24d\": container with ID starting with 59322a4c73615ef8945d0f2eb78759c69ac8fa0596de792583871c2e00eaa24d not found: ID does not exist" Apr 16 20:36:30.326567 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.326532 2578 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn" podStartSLOduration=1.326522148 podStartE2EDuration="1.326522148s" podCreationTimestamp="2026-04-16 20:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:36:30.326060793 +0000 UTC m=+545.297645783" watchObservedRunningTime="2026-04-16 20:36:30.326522148 +0000 UTC m=+545.298107138" Apr 16 20:36:30.464587 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.464560 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs"] Apr 16 20:36:30.467346 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.467332 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" Apr 16 20:36:30.484602 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.484581 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs"] Apr 16 20:36:30.549606 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.549579 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a319fb83-b031-424e-bd0c-33be18ea0e0f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-52lbs\" (UID: \"a319fb83-b031-424e-bd0c-33be18ea0e0f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" Apr 16 20:36:30.549719 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.549630 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcxqr\" (UniqueName: \"kubernetes.io/projected/a319fb83-b031-424e-bd0c-33be18ea0e0f-kube-api-access-pcxqr\") pod 
\"kuadrant-operator-controller-manager-6bc9f4c76f-52lbs\" (UID: \"a319fb83-b031-424e-bd0c-33be18ea0e0f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" Apr 16 20:36:30.650314 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.650289 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcxqr\" (UniqueName: \"kubernetes.io/projected/a319fb83-b031-424e-bd0c-33be18ea0e0f-kube-api-access-pcxqr\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-52lbs\" (UID: \"a319fb83-b031-424e-bd0c-33be18ea0e0f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" Apr 16 20:36:30.650416 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.650336 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a319fb83-b031-424e-bd0c-33be18ea0e0f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-52lbs\" (UID: \"a319fb83-b031-424e-bd0c-33be18ea0e0f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" Apr 16 20:36:30.650618 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.650603 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a319fb83-b031-424e-bd0c-33be18ea0e0f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-52lbs\" (UID: \"a319fb83-b031-424e-bd0c-33be18ea0e0f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" Apr 16 20:36:30.661343 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.661322 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcxqr\" (UniqueName: \"kubernetes.io/projected/a319fb83-b031-424e-bd0c-33be18ea0e0f-kube-api-access-pcxqr\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-52lbs\" (UID: 
\"a319fb83-b031-424e-bd0c-33be18ea0e0f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" Apr 16 20:36:30.776369 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.776345 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" Apr 16 20:36:30.895454 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:30.895328 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs"] Apr 16 20:36:30.898532 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:36:30.898501 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda319fb83_b031_424e_bd0c_33be18ea0e0f.slice/crio-cead85bdbd4ed56835662d2fae022e9c59b1f164d3980a2c87523af18fcd549e WatchSource:0}: Error finding container cead85bdbd4ed56835662d2fae022e9c59b1f164d3980a2c87523af18fcd549e: Status 404 returned error can't find the container with id cead85bdbd4ed56835662d2fae022e9c59b1f164d3980a2c87523af18fcd549e Apr 16 20:36:31.313670 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:31.313505 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" event={"ID":"a319fb83-b031-424e-bd0c-33be18ea0e0f","Type":"ContainerStarted","Data":"addbf3f75464f6400527505f0c6a6d84ee33a8b04d9ea19758c8f0b1e57b0bb6"} Apr 16 20:36:31.313670 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:31.313560 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" event={"ID":"a319fb83-b031-424e-bd0c-33be18ea0e0f","Type":"ContainerStarted","Data":"cead85bdbd4ed56835662d2fae022e9c59b1f164d3980a2c87523af18fcd549e"} Apr 16 20:36:31.313670 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:31.313651 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" Apr 16 20:36:31.329911 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:31.329875 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" podStartSLOduration=1.329864845 podStartE2EDuration="1.329864845s" podCreationTimestamp="2026-04-16 20:36:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:36:31.329181883 +0000 UTC m=+546.300766875" watchObservedRunningTime="2026-04-16 20:36:31.329864845 +0000 UTC m=+546.301449835" Apr 16 20:36:41.316576 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:41.316548 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn" Apr 16 20:36:42.319415 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:42.319390 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" Apr 16 20:36:42.368785 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:42.368759 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn"] Apr 16 20:36:42.368997 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:42.368974 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn" podUID="9e0e5188-734f-4283-b72f-2b5f27c1a610" containerName="manager" containerID="cri-o://7baa0f3a56095160c8f5ba9b93eea69303fee75f222da3c561b9a8ed1e116e48" gracePeriod=10 Apr 16 20:36:42.601436 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:42.601418 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn" Apr 16 20:36:42.636879 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:42.636859 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e0e5188-734f-4283-b72f-2b5f27c1a610-extensions-socket-volume\") pod \"9e0e5188-734f-4283-b72f-2b5f27c1a610\" (UID: \"9e0e5188-734f-4283-b72f-2b5f27c1a610\") " Apr 16 20:36:42.637033 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:42.636886 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfwfw\" (UniqueName: \"kubernetes.io/projected/9e0e5188-734f-4283-b72f-2b5f27c1a610-kube-api-access-mfwfw\") pod \"9e0e5188-734f-4283-b72f-2b5f27c1a610\" (UID: \"9e0e5188-734f-4283-b72f-2b5f27c1a610\") " Apr 16 20:36:42.637318 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:42.637280 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e0e5188-734f-4283-b72f-2b5f27c1a610-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "9e0e5188-734f-4283-b72f-2b5f27c1a610" (UID: "9e0e5188-734f-4283-b72f-2b5f27c1a610"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:36:42.639090 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:42.639066 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e0e5188-734f-4283-b72f-2b5f27c1a610-kube-api-access-mfwfw" (OuterVolumeSpecName: "kube-api-access-mfwfw") pod "9e0e5188-734f-4283-b72f-2b5f27c1a610" (UID: "9e0e5188-734f-4283-b72f-2b5f27c1a610"). InnerVolumeSpecName "kube-api-access-mfwfw". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:36:42.738290 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:42.738270 2578 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e0e5188-734f-4283-b72f-2b5f27c1a610-extensions-socket-volume\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\""
Apr 16 20:36:42.738290 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:42.738289 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mfwfw\" (UniqueName: \"kubernetes.io/projected/9e0e5188-734f-4283-b72f-2b5f27c1a610-kube-api-access-mfwfw\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\""
Apr 16 20:36:43.356765 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:43.356736 2578 generic.go:358] "Generic (PLEG): container finished" podID="9e0e5188-734f-4283-b72f-2b5f27c1a610" containerID="7baa0f3a56095160c8f5ba9b93eea69303fee75f222da3c561b9a8ed1e116e48" exitCode=0
Apr 16 20:36:43.357149 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:43.356799 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn"
Apr 16 20:36:43.357149 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:43.356823 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn" event={"ID":"9e0e5188-734f-4283-b72f-2b5f27c1a610","Type":"ContainerDied","Data":"7baa0f3a56095160c8f5ba9b93eea69303fee75f222da3c561b9a8ed1e116e48"}
Apr 16 20:36:43.357149 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:43.356861 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn" event={"ID":"9e0e5188-734f-4283-b72f-2b5f27c1a610","Type":"ContainerDied","Data":"2d17f14c68fabd41bb94458f20e6a57ba4ced807d85469ff15b06573efd15209"}
Apr 16 20:36:43.357149 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:43.356877 2578 scope.go:117] "RemoveContainer" containerID="7baa0f3a56095160c8f5ba9b93eea69303fee75f222da3c561b9a8ed1e116e48"
Apr 16 20:36:43.365153 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:43.365137 2578 scope.go:117] "RemoveContainer" containerID="7baa0f3a56095160c8f5ba9b93eea69303fee75f222da3c561b9a8ed1e116e48"
Apr 16 20:36:43.365431 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:36:43.365414 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7baa0f3a56095160c8f5ba9b93eea69303fee75f222da3c561b9a8ed1e116e48\": container with ID starting with 7baa0f3a56095160c8f5ba9b93eea69303fee75f222da3c561b9a8ed1e116e48 not found: ID does not exist" containerID="7baa0f3a56095160c8f5ba9b93eea69303fee75f222da3c561b9a8ed1e116e48"
Apr 16 20:36:43.365475 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:43.365438 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7baa0f3a56095160c8f5ba9b93eea69303fee75f222da3c561b9a8ed1e116e48"} err="failed to get container status \"7baa0f3a56095160c8f5ba9b93eea69303fee75f222da3c561b9a8ed1e116e48\": rpc error: code = NotFound desc = could not find container \"7baa0f3a56095160c8f5ba9b93eea69303fee75f222da3c561b9a8ed1e116e48\": container with ID starting with 7baa0f3a56095160c8f5ba9b93eea69303fee75f222da3c561b9a8ed1e116e48 not found: ID does not exist"
Apr 16 20:36:43.379330 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:43.379301 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn"]
Apr 16 20:36:43.387062 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:43.387029 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-96dxn"]
Apr 16 20:36:43.605398 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:36:43.605370 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e0e5188-734f-4283-b72f-2b5f27c1a610" path="/var/lib/kubelet/pods/9e0e5188-734f-4283-b72f-2b5f27c1a610/volumes"
Apr 16 20:37:02.939687 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:02.939654 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6785d"]
Apr 16 20:37:02.940097 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:02.939986 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e0e5188-734f-4283-b72f-2b5f27c1a610" containerName="manager"
Apr 16 20:37:02.940097 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:02.939998 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e0e5188-734f-4283-b72f-2b5f27c1a610" containerName="manager"
Apr 16 20:37:02.940097 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:02.940069 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e0e5188-734f-4283-b72f-2b5f27c1a610" containerName="manager"
Apr 16 20:37:02.943842 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:02.943825 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-6785d"
Apr 16 20:37:02.946348 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:02.946322 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-w2krl\""
Apr 16 20:37:02.946348 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:02.946338 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 16 20:37:02.955224 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:02.955202 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6785d"]
Apr 16 20:37:02.987645 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:02.987619 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f57939c9-6ef2-407c-856c-576e6684c506-config-file\") pod \"limitador-limitador-7d549b5b-6785d\" (UID: \"f57939c9-6ef2-407c-856c-576e6684c506\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6785d"
Apr 16 20:37:02.987773 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:02.987683 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rcgb\" (UniqueName: \"kubernetes.io/projected/f57939c9-6ef2-407c-856c-576e6684c506-kube-api-access-7rcgb\") pod \"limitador-limitador-7d549b5b-6785d\" (UID: \"f57939c9-6ef2-407c-856c-576e6684c506\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6785d"
Apr 16 20:37:03.001749 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:03.001721 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6785d"]
Apr 16 20:37:03.088453 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:03.088424 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rcgb\" (UniqueName: \"kubernetes.io/projected/f57939c9-6ef2-407c-856c-576e6684c506-kube-api-access-7rcgb\") pod \"limitador-limitador-7d549b5b-6785d\" (UID: \"f57939c9-6ef2-407c-856c-576e6684c506\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6785d"
Apr 16 20:37:03.088615 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:03.088472 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f57939c9-6ef2-407c-856c-576e6684c506-config-file\") pod \"limitador-limitador-7d549b5b-6785d\" (UID: \"f57939c9-6ef2-407c-856c-576e6684c506\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6785d"
Apr 16 20:37:03.089059 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:03.089042 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f57939c9-6ef2-407c-856c-576e6684c506-config-file\") pod \"limitador-limitador-7d549b5b-6785d\" (UID: \"f57939c9-6ef2-407c-856c-576e6684c506\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6785d"
Apr 16 20:37:03.099022 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:03.098997 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rcgb\" (UniqueName: \"kubernetes.io/projected/f57939c9-6ef2-407c-856c-576e6684c506-kube-api-access-7rcgb\") pod \"limitador-limitador-7d549b5b-6785d\" (UID: \"f57939c9-6ef2-407c-856c-576e6684c506\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6785d"
Apr 16 20:37:03.253741 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:03.253657 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-6785d"
Apr 16 20:37:03.382187 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:03.382162 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6785d"]
Apr 16 20:37:03.384124 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:37:03.384095 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf57939c9_6ef2_407c_856c_576e6684c506.slice/crio-824243004c1dcc0470b90acbeff674709c0420d48bf9c064ed905cd0e4d472eb WatchSource:0}: Error finding container 824243004c1dcc0470b90acbeff674709c0420d48bf9c064ed905cd0e4d472eb: Status 404 returned error can't find the container with id 824243004c1dcc0470b90acbeff674709c0420d48bf9c064ed905cd0e4d472eb
Apr 16 20:37:03.421737 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:03.421704 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-6785d" event={"ID":"f57939c9-6ef2-407c-856c-576e6684c506","Type":"ContainerStarted","Data":"824243004c1dcc0470b90acbeff674709c0420d48bf9c064ed905cd0e4d472eb"}
Apr 16 20:37:06.434088 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:06.434053 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-6785d" event={"ID":"f57939c9-6ef2-407c-856c-576e6684c506","Type":"ContainerStarted","Data":"5c82c3825ad518774edc4c07a1e2b139cbd04b89963aba841d9a2967ac81ae73"}
Apr 16 20:37:06.434553 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:06.434189 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-6785d"
Apr 16 20:37:06.449755 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:06.449711 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-6785d" podStartSLOduration=1.784168457 podStartE2EDuration="4.449696261s" podCreationTimestamp="2026-04-16 20:37:02 +0000 UTC" firstStartedPulling="2026-04-16 20:37:03.385841356 +0000 UTC m=+578.357426324" lastFinishedPulling="2026-04-16 20:37:06.05136916 +0000 UTC m=+581.022954128" observedRunningTime="2026-04-16 20:37:06.448691897 +0000 UTC m=+581.420276910" watchObservedRunningTime="2026-04-16 20:37:06.449696261 +0000 UTC m=+581.421281252"
Apr 16 20:37:17.438675 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:17.438644 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-6785d"
Apr 16 20:37:18.221189 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:18.221154 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6785d"]
Apr 16 20:37:18.221391 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:18.221355 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-6785d" podUID="f57939c9-6ef2-407c-856c-576e6684c506" containerName="limitador" containerID="cri-o://5c82c3825ad518774edc4c07a1e2b139cbd04b89963aba841d9a2967ac81ae73" gracePeriod=30
Apr 16 20:37:18.748037 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:18.748016 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-6785d"
Apr 16 20:37:18.905788 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:18.905761 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rcgb\" (UniqueName: \"kubernetes.io/projected/f57939c9-6ef2-407c-856c-576e6684c506-kube-api-access-7rcgb\") pod \"f57939c9-6ef2-407c-856c-576e6684c506\" (UID: \"f57939c9-6ef2-407c-856c-576e6684c506\") "
Apr 16 20:37:18.905935 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:18.905801 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f57939c9-6ef2-407c-856c-576e6684c506-config-file\") pod \"f57939c9-6ef2-407c-856c-576e6684c506\" (UID: \"f57939c9-6ef2-407c-856c-576e6684c506\") "
Apr 16 20:37:18.906205 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:18.906176 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f57939c9-6ef2-407c-856c-576e6684c506-config-file" (OuterVolumeSpecName: "config-file") pod "f57939c9-6ef2-407c-856c-576e6684c506" (UID: "f57939c9-6ef2-407c-856c-576e6684c506"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:37:18.907795 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:18.907764 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f57939c9-6ef2-407c-856c-576e6684c506-kube-api-access-7rcgb" (OuterVolumeSpecName: "kube-api-access-7rcgb") pod "f57939c9-6ef2-407c-856c-576e6684c506" (UID: "f57939c9-6ef2-407c-856c-576e6684c506"). InnerVolumeSpecName "kube-api-access-7rcgb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:37:19.006471 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:19.006448 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7rcgb\" (UniqueName: \"kubernetes.io/projected/f57939c9-6ef2-407c-856c-576e6684c506-kube-api-access-7rcgb\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\""
Apr 16 20:37:19.006471 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:19.006469 2578 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f57939c9-6ef2-407c-856c-576e6684c506-config-file\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\""
Apr 16 20:37:19.480548 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:19.480513 2578 generic.go:358] "Generic (PLEG): container finished" podID="f57939c9-6ef2-407c-856c-576e6684c506" containerID="5c82c3825ad518774edc4c07a1e2b139cbd04b89963aba841d9a2967ac81ae73" exitCode=0
Apr 16 20:37:19.480705 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:19.480580 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-6785d"
Apr 16 20:37:19.480705 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:19.480600 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-6785d" event={"ID":"f57939c9-6ef2-407c-856c-576e6684c506","Type":"ContainerDied","Data":"5c82c3825ad518774edc4c07a1e2b139cbd04b89963aba841d9a2967ac81ae73"}
Apr 16 20:37:19.480705 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:19.480637 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-6785d" event={"ID":"f57939c9-6ef2-407c-856c-576e6684c506","Type":"ContainerDied","Data":"824243004c1dcc0470b90acbeff674709c0420d48bf9c064ed905cd0e4d472eb"}
Apr 16 20:37:19.480705 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:19.480658 2578 scope.go:117] "RemoveContainer" containerID="5c82c3825ad518774edc4c07a1e2b139cbd04b89963aba841d9a2967ac81ae73"
Apr 16 20:37:19.489167 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:19.489152 2578 scope.go:117] "RemoveContainer" containerID="5c82c3825ad518774edc4c07a1e2b139cbd04b89963aba841d9a2967ac81ae73"
Apr 16 20:37:19.489389 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:37:19.489373 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c82c3825ad518774edc4c07a1e2b139cbd04b89963aba841d9a2967ac81ae73\": container with ID starting with 5c82c3825ad518774edc4c07a1e2b139cbd04b89963aba841d9a2967ac81ae73 not found: ID does not exist" containerID="5c82c3825ad518774edc4c07a1e2b139cbd04b89963aba841d9a2967ac81ae73"
Apr 16 20:37:19.489436 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:19.489396 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c82c3825ad518774edc4c07a1e2b139cbd04b89963aba841d9a2967ac81ae73"} err="failed to get container status \"5c82c3825ad518774edc4c07a1e2b139cbd04b89963aba841d9a2967ac81ae73\": rpc error: code = NotFound desc = could not find container \"5c82c3825ad518774edc4c07a1e2b139cbd04b89963aba841d9a2967ac81ae73\": container with ID starting with 5c82c3825ad518774edc4c07a1e2b139cbd04b89963aba841d9a2967ac81ae73 not found: ID does not exist"
Apr 16 20:37:19.500008 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:19.499985 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6785d"]
Apr 16 20:37:19.504064 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:19.504035 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6785d"]
Apr 16 20:37:19.605045 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:19.605024 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f57939c9-6ef2-407c-856c-576e6684c506" path="/var/lib/kubelet/pods/f57939c9-6ef2-407c-856c-576e6684c506/volumes"
Apr 16 20:37:24.014080 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:24.014048 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-n7bdg"]
Apr 16 20:37:24.014438 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:24.014336 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f57939c9-6ef2-407c-856c-576e6684c506" containerName="limitador"
Apr 16 20:37:24.014438 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:24.014347 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f57939c9-6ef2-407c-856c-576e6684c506" containerName="limitador"
Apr 16 20:37:24.014438 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:24.014397 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f57939c9-6ef2-407c-856c-576e6684c506" containerName="limitador"
Apr 16 20:37:24.019008 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:24.018985 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-n7bdg"
Apr 16 20:37:24.021509 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:24.021487 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 16 20:37:24.021625 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:24.021487 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-98dcb\""
Apr 16 20:37:24.025360 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:24.025336 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-n7bdg"]
Apr 16 20:37:24.145133 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:24.145101 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzhpx\" (UniqueName: \"kubernetes.io/projected/59849ccb-da40-46da-bade-0bf06395a88f-kube-api-access-pzhpx\") pod \"postgres-868db5846d-n7bdg\" (UID: \"59849ccb-da40-46da-bade-0bf06395a88f\") " pod="opendatahub/postgres-868db5846d-n7bdg"
Apr 16 20:37:24.145290 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:24.145143 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/59849ccb-da40-46da-bade-0bf06395a88f-data\") pod \"postgres-868db5846d-n7bdg\" (UID: \"59849ccb-da40-46da-bade-0bf06395a88f\") " pod="opendatahub/postgres-868db5846d-n7bdg"
Apr 16 20:37:24.245706 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:24.245664 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/59849ccb-da40-46da-bade-0bf06395a88f-data\") pod \"postgres-868db5846d-n7bdg\" (UID: \"59849ccb-da40-46da-bade-0bf06395a88f\") " pod="opendatahub/postgres-868db5846d-n7bdg"
Apr 16 20:37:24.245901 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:24.245753 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzhpx\" (UniqueName: \"kubernetes.io/projected/59849ccb-da40-46da-bade-0bf06395a88f-kube-api-access-pzhpx\") pod \"postgres-868db5846d-n7bdg\" (UID: \"59849ccb-da40-46da-bade-0bf06395a88f\") " pod="opendatahub/postgres-868db5846d-n7bdg"
Apr 16 20:37:24.246122 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:24.246101 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/59849ccb-da40-46da-bade-0bf06395a88f-data\") pod \"postgres-868db5846d-n7bdg\" (UID: \"59849ccb-da40-46da-bade-0bf06395a88f\") " pod="opendatahub/postgres-868db5846d-n7bdg"
Apr 16 20:37:24.254126 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:24.254101 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzhpx\" (UniqueName: \"kubernetes.io/projected/59849ccb-da40-46da-bade-0bf06395a88f-kube-api-access-pzhpx\") pod \"postgres-868db5846d-n7bdg\" (UID: \"59849ccb-da40-46da-bade-0bf06395a88f\") " pod="opendatahub/postgres-868db5846d-n7bdg"
Apr 16 20:37:24.331703 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:24.331676 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-n7bdg"
Apr 16 20:37:24.451440 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:24.451416 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-n7bdg"]
Apr 16 20:37:24.454166 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:37:24.454144 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59849ccb_da40_46da_bade_0bf06395a88f.slice/crio-d207075953d6d4517cc3b513ba8ebafcfadb094fcb55062d3564cbb344454cfc WatchSource:0}: Error finding container d207075953d6d4517cc3b513ba8ebafcfadb094fcb55062d3564cbb344454cfc: Status 404 returned error can't find the container with id d207075953d6d4517cc3b513ba8ebafcfadb094fcb55062d3564cbb344454cfc
Apr 16 20:37:24.498752 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:24.498722 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-n7bdg" event={"ID":"59849ccb-da40-46da-bade-0bf06395a88f","Type":"ContainerStarted","Data":"d207075953d6d4517cc3b513ba8ebafcfadb094fcb55062d3564cbb344454cfc"}
Apr 16 20:37:29.555655 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:29.555633 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 16 20:37:30.519443 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:30.519407 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-n7bdg" event={"ID":"59849ccb-da40-46da-bade-0bf06395a88f","Type":"ContainerStarted","Data":"2863d473467ceff653976bc53e3478bfa4d051cb600d2d56120e6879b74c0132"}
Apr 16 20:37:30.519599 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:30.519458 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-n7bdg"
Apr 16 20:37:30.537430 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:30.537383 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-n7bdg" podStartSLOduration=2.440284793 podStartE2EDuration="7.537369069s" podCreationTimestamp="2026-04-16 20:37:23 +0000 UTC" firstStartedPulling="2026-04-16 20:37:24.455887762 +0000 UTC m=+599.427472729" lastFinishedPulling="2026-04-16 20:37:29.552972037 +0000 UTC m=+604.524557005" observedRunningTime="2026-04-16 20:37:30.535426554 +0000 UTC m=+605.507011556" watchObservedRunningTime="2026-04-16 20:37:30.537369069 +0000 UTC m=+605.508954109"
Apr 16 20:37:36.552388 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:36.552353 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-n7bdg"
Apr 16 20:37:37.322593 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.322563 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-74cc6fcf45-fhbp8"]
Apr 16 20:37:37.326172 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.326155 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-74cc6fcf45-fhbp8"
Apr 16 20:37:37.328471 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.328441 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 16 20:37:37.328588 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.328524 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 16 20:37:37.328588 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.328524 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-6dqgw\""
Apr 16 20:37:37.334217 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.334196 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-74cc6fcf45-fhbp8"]
Apr 16 20:37:37.347209 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.347185 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-54dcbffbf4-9cxbz"]
Apr 16 20:37:37.349254 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.349234 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-54dcbffbf4-9cxbz"
Apr 16 20:37:37.351675 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.351655 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-5kf7h\""
Apr 16 20:37:37.359170 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.359151 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-54dcbffbf4-9cxbz"]
Apr 16 20:37:37.444442 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.444416 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtw9g\" (UniqueName: \"kubernetes.io/projected/6be97c07-ec80-40e6-919b-f1e2a353b5b5-kube-api-access-xtw9g\") pod \"maas-controller-54dcbffbf4-9cxbz\" (UID: \"6be97c07-ec80-40e6-919b-f1e2a353b5b5\") " pod="opendatahub/maas-controller-54dcbffbf4-9cxbz"
Apr 16 20:37:37.444563 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.444471 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3fb2b337-d89d-430e-8919-ca848b54ab0e-maas-api-tls\") pod \"maas-api-74cc6fcf45-fhbp8\" (UID: \"3fb2b337-d89d-430e-8919-ca848b54ab0e\") " pod="opendatahub/maas-api-74cc6fcf45-fhbp8"
Apr 16 20:37:37.444563 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.444539 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmj6w\" (UniqueName: \"kubernetes.io/projected/3fb2b337-d89d-430e-8919-ca848b54ab0e-kube-api-access-zmj6w\") pod \"maas-api-74cc6fcf45-fhbp8\" (UID: \"3fb2b337-d89d-430e-8919-ca848b54ab0e\") " pod="opendatahub/maas-api-74cc6fcf45-fhbp8"
Apr 16 20:37:37.547597 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.547569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmj6w\" (UniqueName: \"kubernetes.io/projected/3fb2b337-d89d-430e-8919-ca848b54ab0e-kube-api-access-zmj6w\") pod \"maas-api-74cc6fcf45-fhbp8\" (UID: \"3fb2b337-d89d-430e-8919-ca848b54ab0e\") " pod="opendatahub/maas-api-74cc6fcf45-fhbp8"
Apr 16 20:37:37.547735 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.547605 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xtw9g\" (UniqueName: \"kubernetes.io/projected/6be97c07-ec80-40e6-919b-f1e2a353b5b5-kube-api-access-xtw9g\") pod \"maas-controller-54dcbffbf4-9cxbz\" (UID: \"6be97c07-ec80-40e6-919b-f1e2a353b5b5\") " pod="opendatahub/maas-controller-54dcbffbf4-9cxbz"
Apr 16 20:37:37.547735 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.547650 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3fb2b337-d89d-430e-8919-ca848b54ab0e-maas-api-tls\") pod \"maas-api-74cc6fcf45-fhbp8\" (UID: \"3fb2b337-d89d-430e-8919-ca848b54ab0e\") " pod="opendatahub/maas-api-74cc6fcf45-fhbp8"
Apr 16 20:37:37.547848 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:37:37.547738 2578 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found
Apr 16 20:37:37.547848 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:37:37.547799 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fb2b337-d89d-430e-8919-ca848b54ab0e-maas-api-tls podName:3fb2b337-d89d-430e-8919-ca848b54ab0e nodeName:}" failed. No retries permitted until 2026-04-16 20:37:38.047781763 +0000 UTC m=+613.019366747 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/3fb2b337-d89d-430e-8919-ca848b54ab0e-maas-api-tls") pod "maas-api-74cc6fcf45-fhbp8" (UID: "3fb2b337-d89d-430e-8919-ca848b54ab0e") : secret "maas-api-serving-cert" not found
Apr 16 20:37:37.556164 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.556137 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtw9g\" (UniqueName: \"kubernetes.io/projected/6be97c07-ec80-40e6-919b-f1e2a353b5b5-kube-api-access-xtw9g\") pod \"maas-controller-54dcbffbf4-9cxbz\" (UID: \"6be97c07-ec80-40e6-919b-f1e2a353b5b5\") " pod="opendatahub/maas-controller-54dcbffbf4-9cxbz"
Apr 16 20:37:37.556455 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.556266 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmj6w\" (UniqueName: \"kubernetes.io/projected/3fb2b337-d89d-430e-8919-ca848b54ab0e-kube-api-access-zmj6w\") pod \"maas-api-74cc6fcf45-fhbp8\" (UID: \"3fb2b337-d89d-430e-8919-ca848b54ab0e\") " pod="opendatahub/maas-api-74cc6fcf45-fhbp8"
Apr 16 20:37:37.660795 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.660774 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-54dcbffbf4-9cxbz"
Apr 16 20:37:37.779420 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:37.779396 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-54dcbffbf4-9cxbz"]
Apr 16 20:37:37.781785 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:37:37.781757 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6be97c07_ec80_40e6_919b_f1e2a353b5b5.slice/crio-982da99f3d00fc8b2048740b02c29820015ff12d1aca38df2d3209535d628578 WatchSource:0}: Error finding container 982da99f3d00fc8b2048740b02c29820015ff12d1aca38df2d3209535d628578: Status 404 returned error can't find the container with id 982da99f3d00fc8b2048740b02c29820015ff12d1aca38df2d3209535d628578
Apr 16 20:37:38.052987 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:38.052903 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3fb2b337-d89d-430e-8919-ca848b54ab0e-maas-api-tls\") pod \"maas-api-74cc6fcf45-fhbp8\" (UID: \"3fb2b337-d89d-430e-8919-ca848b54ab0e\") " pod="opendatahub/maas-api-74cc6fcf45-fhbp8"
Apr 16 20:37:38.055186 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:38.055160 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3fb2b337-d89d-430e-8919-ca848b54ab0e-maas-api-tls\") pod \"maas-api-74cc6fcf45-fhbp8\" (UID: \"3fb2b337-d89d-430e-8919-ca848b54ab0e\") " pod="opendatahub/maas-api-74cc6fcf45-fhbp8"
Apr 16 20:37:38.187142 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:38.187113 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-664d5d4d98-jtpgq"]
Apr 16 20:37:38.189815 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:38.189797 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-664d5d4d98-jtpgq"
Apr 16 20:37:38.197125 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:38.196976 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-664d5d4d98-jtpgq"]
Apr 16 20:37:38.236739 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:38.236720 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-74cc6fcf45-fhbp8"
Apr 16 20:37:38.355649 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:38.355594 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/e473e5bc-9222-4108-ab8f-d7caeffcfb06-maas-api-tls\") pod \"maas-api-664d5d4d98-jtpgq\" (UID: \"e473e5bc-9222-4108-ab8f-d7caeffcfb06\") " pod="opendatahub/maas-api-664d5d4d98-jtpgq"
Apr 16 20:37:38.355649 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:38.355645 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgdzd\" (UniqueName: \"kubernetes.io/projected/e473e5bc-9222-4108-ab8f-d7caeffcfb06-kube-api-access-lgdzd\") pod \"maas-api-664d5d4d98-jtpgq\" (UID: \"e473e5bc-9222-4108-ab8f-d7caeffcfb06\") " pod="opendatahub/maas-api-664d5d4d98-jtpgq"
Apr 16 20:37:38.368876 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:38.368848 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-74cc6fcf45-fhbp8"]
Apr 16 20:37:38.372658 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:37:38.372628 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fb2b337_d89d_430e_8919_ca848b54ab0e.slice/crio-28209c2c7921836fafffa95edd7f7384d36af1eeb1b5bcbb39f9ba2e6811e720 WatchSource:0}: Error finding container 28209c2c7921836fafffa95edd7f7384d36af1eeb1b5bcbb39f9ba2e6811e720: Status 404 returned error can't find the container with id 28209c2c7921836fafffa95edd7f7384d36af1eeb1b5bcbb39f9ba2e6811e720
Apr 16 20:37:38.457228 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:38.457072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/e473e5bc-9222-4108-ab8f-d7caeffcfb06-maas-api-tls\") pod \"maas-api-664d5d4d98-jtpgq\" (UID: \"e473e5bc-9222-4108-ab8f-d7caeffcfb06\") " pod="opendatahub/maas-api-664d5d4d98-jtpgq"
Apr 16 20:37:38.457228 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:38.457124 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgdzd\" (UniqueName: \"kubernetes.io/projected/e473e5bc-9222-4108-ab8f-d7caeffcfb06-kube-api-access-lgdzd\") pod \"maas-api-664d5d4d98-jtpgq\" (UID: \"e473e5bc-9222-4108-ab8f-d7caeffcfb06\") " pod="opendatahub/maas-api-664d5d4d98-jtpgq"
Apr 16 20:37:38.460134 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:38.460087 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/e473e5bc-9222-4108-ab8f-d7caeffcfb06-maas-api-tls\") pod \"maas-api-664d5d4d98-jtpgq\" (UID: \"e473e5bc-9222-4108-ab8f-d7caeffcfb06\") " pod="opendatahub/maas-api-664d5d4d98-jtpgq"
Apr 16 20:37:38.467725 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:38.467703 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgdzd\" (UniqueName: \"kubernetes.io/projected/e473e5bc-9222-4108-ab8f-d7caeffcfb06-kube-api-access-lgdzd\") pod \"maas-api-664d5d4d98-jtpgq\" (UID: \"e473e5bc-9222-4108-ab8f-d7caeffcfb06\") " pod="opendatahub/maas-api-664d5d4d98-jtpgq"
Apr 16 20:37:38.501610 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:38.501587 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-664d5d4d98-jtpgq"
Apr 16 20:37:38.555524 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:38.555463 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-74cc6fcf45-fhbp8" event={"ID":"3fb2b337-d89d-430e-8919-ca848b54ab0e","Type":"ContainerStarted","Data":"28209c2c7921836fafffa95edd7f7384d36af1eeb1b5bcbb39f9ba2e6811e720"}
Apr 16 20:37:38.558192 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:38.558131 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-54dcbffbf4-9cxbz" event={"ID":"6be97c07-ec80-40e6-919b-f1e2a353b5b5","Type":"ContainerStarted","Data":"982da99f3d00fc8b2048740b02c29820015ff12d1aca38df2d3209535d628578"}
Apr 16 20:37:38.665550 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:38.665525 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-664d5d4d98-jtpgq"]
Apr 16 20:37:38.668307 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:37:38.668274 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode473e5bc_9222_4108_ab8f_d7caeffcfb06.slice/crio-3f709a8096ce20b1a045abd792b71530812088d9e4ebcc611c7dffac44d88770 WatchSource:0}: Error finding container 3f709a8096ce20b1a045abd792b71530812088d9e4ebcc611c7dffac44d88770: Status 404 returned error can't find the container with id 3f709a8096ce20b1a045abd792b71530812088d9e4ebcc611c7dffac44d88770
Apr 16 20:37:39.565532 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:39.565482 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-664d5d4d98-jtpgq" event={"ID":"e473e5bc-9222-4108-ab8f-d7caeffcfb06","Type":"ContainerStarted","Data":"3f709a8096ce20b1a045abd792b71530812088d9e4ebcc611c7dffac44d88770"}
Apr 16 20:37:41.573205 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:41.573116 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-74cc6fcf45-fhbp8" event={"ID":"3fb2b337-d89d-430e-8919-ca848b54ab0e","Type":"ContainerStarted","Data":"e3e536e4df8ff7891eb3c4392e89e0ed607b671416801feed61cecc6473f13f7"}
Apr 16 20:37:41.573634 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:41.573272 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-74cc6fcf45-fhbp8"
Apr 16 20:37:41.574547 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:41.574517 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-54dcbffbf4-9cxbz" event={"ID":"6be97c07-ec80-40e6-919b-f1e2a353b5b5","Type":"ContainerStarted","Data":"591ba9fea6456bbc54d071a48b655922ced196947dd0fb66eb902ff265ba5b36"}
Apr 16 20:37:41.574682 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:41.574593 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-54dcbffbf4-9cxbz"
Apr 16 20:37:41.575661 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:41.575639 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-664d5d4d98-jtpgq" event={"ID":"e473e5bc-9222-4108-ab8f-d7caeffcfb06","Type":"ContainerStarted","Data":"9ee4b795643258f933ef03bbcdb75ab444236db146119d374e935217bea01889"}
Apr 16 20:37:41.575760 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:41.575751 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-664d5d4d98-jtpgq"
Apr 16 20:37:41.597513 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:41.597453 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-74cc6fcf45-fhbp8" podStartSLOduration=1.675684712 podStartE2EDuration="4.597437997s" podCreationTimestamp="2026-04-16 20:37:37 +0000 UTC" firstStartedPulling="2026-04-16 20:37:38.374180364 +0000 UTC m=+613.345765331" lastFinishedPulling="2026-04-16 20:37:41.295933636 +0000 UTC m=+616.267518616" observedRunningTime="2026-04-16 20:37:41.596629083 +0000 UTC m=+616.568214075"
watchObservedRunningTime="2026-04-16 20:37:41.597437997 +0000 UTC m=+616.569022987" Apr 16 20:37:41.613091 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:41.613046 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-54dcbffbf4-9cxbz" podStartSLOduration=1.100192313 podStartE2EDuration="4.613036455s" podCreationTimestamp="2026-04-16 20:37:37 +0000 UTC" firstStartedPulling="2026-04-16 20:37:37.783396757 +0000 UTC m=+612.754981725" lastFinishedPulling="2026-04-16 20:37:41.296240885 +0000 UTC m=+616.267825867" observedRunningTime="2026-04-16 20:37:41.611684951 +0000 UTC m=+616.583269944" watchObservedRunningTime="2026-04-16 20:37:41.613036455 +0000 UTC m=+616.584621445" Apr 16 20:37:41.628520 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:41.628483 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-664d5d4d98-jtpgq" podStartSLOduration=0.99649467 podStartE2EDuration="3.628472456s" podCreationTimestamp="2026-04-16 20:37:38 +0000 UTC" firstStartedPulling="2026-04-16 20:37:38.670455997 +0000 UTC m=+613.642040971" lastFinishedPulling="2026-04-16 20:37:41.302433785 +0000 UTC m=+616.274018757" observedRunningTime="2026-04-16 20:37:41.626666443 +0000 UTC m=+616.598251434" watchObservedRunningTime="2026-04-16 20:37:41.628472456 +0000 UTC m=+616.600057445" Apr 16 20:37:47.585115 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:47.585035 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-74cc6fcf45-fhbp8" Apr 16 20:37:47.585873 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:47.585855 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-664d5d4d98-jtpgq" Apr 16 20:37:47.638918 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:47.638887 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-74cc6fcf45-fhbp8"] Apr 16 20:37:47.639498 ip-10-0-137-53 
kubenswrapper[2578]: I0416 20:37:47.639402 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-74cc6fcf45-fhbp8" podUID="3fb2b337-d89d-430e-8919-ca848b54ab0e" containerName="maas-api" containerID="cri-o://e3e536e4df8ff7891eb3c4392e89e0ed607b671416801feed61cecc6473f13f7" gracePeriod=30 Apr 16 20:37:47.866442 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:47.866423 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-74cc6fcf45-fhbp8" Apr 16 20:37:48.021174 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:48.021148 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3fb2b337-d89d-430e-8919-ca848b54ab0e-maas-api-tls\") pod \"3fb2b337-d89d-430e-8919-ca848b54ab0e\" (UID: \"3fb2b337-d89d-430e-8919-ca848b54ab0e\") " Apr 16 20:37:48.021300 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:48.021224 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmj6w\" (UniqueName: \"kubernetes.io/projected/3fb2b337-d89d-430e-8919-ca848b54ab0e-kube-api-access-zmj6w\") pod \"3fb2b337-d89d-430e-8919-ca848b54ab0e\" (UID: \"3fb2b337-d89d-430e-8919-ca848b54ab0e\") " Apr 16 20:37:48.023309 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:48.023283 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fb2b337-d89d-430e-8919-ca848b54ab0e-kube-api-access-zmj6w" (OuterVolumeSpecName: "kube-api-access-zmj6w") pod "3fb2b337-d89d-430e-8919-ca848b54ab0e" (UID: "3fb2b337-d89d-430e-8919-ca848b54ab0e"). InnerVolumeSpecName "kube-api-access-zmj6w". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:37:48.023425 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:48.023325 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fb2b337-d89d-430e-8919-ca848b54ab0e-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "3fb2b337-d89d-430e-8919-ca848b54ab0e" (UID: "3fb2b337-d89d-430e-8919-ca848b54ab0e"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:37:48.121858 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:48.121807 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zmj6w\" (UniqueName: \"kubernetes.io/projected/3fb2b337-d89d-430e-8919-ca848b54ab0e-kube-api-access-zmj6w\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\"" Apr 16 20:37:48.121858 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:48.121828 2578 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3fb2b337-d89d-430e-8919-ca848b54ab0e-maas-api-tls\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\"" Apr 16 20:37:48.600045 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:48.600006 2578 generic.go:358] "Generic (PLEG): container finished" podID="3fb2b337-d89d-430e-8919-ca848b54ab0e" containerID="e3e536e4df8ff7891eb3c4392e89e0ed607b671416801feed61cecc6473f13f7" exitCode=0 Apr 16 20:37:48.600502 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:48.600085 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-74cc6fcf45-fhbp8" Apr 16 20:37:48.600502 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:48.600100 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-74cc6fcf45-fhbp8" event={"ID":"3fb2b337-d89d-430e-8919-ca848b54ab0e","Type":"ContainerDied","Data":"e3e536e4df8ff7891eb3c4392e89e0ed607b671416801feed61cecc6473f13f7"} Apr 16 20:37:48.600502 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:48.600158 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-74cc6fcf45-fhbp8" event={"ID":"3fb2b337-d89d-430e-8919-ca848b54ab0e","Type":"ContainerDied","Data":"28209c2c7921836fafffa95edd7f7384d36af1eeb1b5bcbb39f9ba2e6811e720"} Apr 16 20:37:48.600502 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:48.600196 2578 scope.go:117] "RemoveContainer" containerID="e3e536e4df8ff7891eb3c4392e89e0ed607b671416801feed61cecc6473f13f7" Apr 16 20:37:48.609582 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:48.609561 2578 scope.go:117] "RemoveContainer" containerID="e3e536e4df8ff7891eb3c4392e89e0ed607b671416801feed61cecc6473f13f7" Apr 16 20:37:48.609918 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:37:48.609894 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3e536e4df8ff7891eb3c4392e89e0ed607b671416801feed61cecc6473f13f7\": container with ID starting with e3e536e4df8ff7891eb3c4392e89e0ed607b671416801feed61cecc6473f13f7 not found: ID does not exist" containerID="e3e536e4df8ff7891eb3c4392e89e0ed607b671416801feed61cecc6473f13f7" Apr 16 20:37:48.610043 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:48.609925 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3e536e4df8ff7891eb3c4392e89e0ed607b671416801feed61cecc6473f13f7"} err="failed to get container status \"e3e536e4df8ff7891eb3c4392e89e0ed607b671416801feed61cecc6473f13f7\": rpc error: code = NotFound desc = could 
not find container \"e3e536e4df8ff7891eb3c4392e89e0ed607b671416801feed61cecc6473f13f7\": container with ID starting with e3e536e4df8ff7891eb3c4392e89e0ed607b671416801feed61cecc6473f13f7 not found: ID does not exist" Apr 16 20:37:48.621586 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:48.621565 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-74cc6fcf45-fhbp8"] Apr 16 20:37:48.624934 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:48.624898 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-74cc6fcf45-fhbp8"] Apr 16 20:37:49.606421 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:49.606389 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fb2b337-d89d-430e-8919-ca848b54ab0e" path="/var/lib/kubelet/pods/3fb2b337-d89d-430e-8919-ca848b54ab0e/volumes" Apr 16 20:37:52.584526 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:52.584496 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-54dcbffbf4-9cxbz" Apr 16 20:37:52.877665 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:52.877595 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-96d5c496f-6kfhh"] Apr 16 20:37:52.877963 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:52.877923 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fb2b337-d89d-430e-8919-ca848b54ab0e" containerName="maas-api" Apr 16 20:37:52.877963 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:52.877957 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fb2b337-d89d-430e-8919-ca848b54ab0e" containerName="maas-api" Apr 16 20:37:52.878056 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:52.878012 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3fb2b337-d89d-430e-8919-ca848b54ab0e" containerName="maas-api" Apr 16 20:37:52.880737 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:52.880715 2578 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="opendatahub/maas-controller-96d5c496f-6kfhh" Apr 16 20:37:52.889644 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:52.889620 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-96d5c496f-6kfhh"] Apr 16 20:37:52.955982 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:52.955936 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmr8n\" (UniqueName: \"kubernetes.io/projected/7d116cc3-ef4b-42c4-9230-5b79198281f3-kube-api-access-jmr8n\") pod \"maas-controller-96d5c496f-6kfhh\" (UID: \"7d116cc3-ef4b-42c4-9230-5b79198281f3\") " pod="opendatahub/maas-controller-96d5c496f-6kfhh" Apr 16 20:37:53.056760 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:53.056735 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmr8n\" (UniqueName: \"kubernetes.io/projected/7d116cc3-ef4b-42c4-9230-5b79198281f3-kube-api-access-jmr8n\") pod \"maas-controller-96d5c496f-6kfhh\" (UID: \"7d116cc3-ef4b-42c4-9230-5b79198281f3\") " pod="opendatahub/maas-controller-96d5c496f-6kfhh" Apr 16 20:37:53.065040 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:53.065007 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmr8n\" (UniqueName: \"kubernetes.io/projected/7d116cc3-ef4b-42c4-9230-5b79198281f3-kube-api-access-jmr8n\") pod \"maas-controller-96d5c496f-6kfhh\" (UID: \"7d116cc3-ef4b-42c4-9230-5b79198281f3\") " pod="opendatahub/maas-controller-96d5c496f-6kfhh" Apr 16 20:37:53.191643 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:53.191570 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-96d5c496f-6kfhh" Apr 16 20:37:53.326641 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:53.326608 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-96d5c496f-6kfhh"] Apr 16 20:37:53.329376 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:37:53.329345 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d116cc3_ef4b_42c4_9230_5b79198281f3.slice/crio-9ea09f376e07e5f7769d996130452e110b82d90ceaab6643f091012be5cbc9d6 WatchSource:0}: Error finding container 9ea09f376e07e5f7769d996130452e110b82d90ceaab6643f091012be5cbc9d6: Status 404 returned error can't find the container with id 9ea09f376e07e5f7769d996130452e110b82d90ceaab6643f091012be5cbc9d6 Apr 16 20:37:53.619264 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:53.619235 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-96d5c496f-6kfhh" event={"ID":"7d116cc3-ef4b-42c4-9230-5b79198281f3","Type":"ContainerStarted","Data":"9ea09f376e07e5f7769d996130452e110b82d90ceaab6643f091012be5cbc9d6"} Apr 16 20:37:54.624014 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:54.623979 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-96d5c496f-6kfhh" event={"ID":"7d116cc3-ef4b-42c4-9230-5b79198281f3","Type":"ContainerStarted","Data":"348a6501f5267e8ac9f7e812f22211401ecdb5c2ebe146d5362474b0dcc7cd7d"} Apr 16 20:37:54.624428 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:54.624039 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-96d5c496f-6kfhh" Apr 16 20:37:54.640452 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:37:54.640398 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-96d5c496f-6kfhh" podStartSLOduration=2.3067824740000002 podStartE2EDuration="2.640382931s" 
podCreationTimestamp="2026-04-16 20:37:52 +0000 UTC" firstStartedPulling="2026-04-16 20:37:53.330562679 +0000 UTC m=+628.302147647" lastFinishedPulling="2026-04-16 20:37:53.664163133 +0000 UTC m=+628.635748104" observedRunningTime="2026-04-16 20:37:54.638974101 +0000 UTC m=+629.610559081" watchObservedRunningTime="2026-04-16 20:37:54.640382931 +0000 UTC m=+629.611967921" Apr 16 20:38:05.632688 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:05.632650 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-96d5c496f-6kfhh" Apr 16 20:38:05.673794 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:05.673762 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-54dcbffbf4-9cxbz"] Apr 16 20:38:05.674018 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:05.673980 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-54dcbffbf4-9cxbz" podUID="6be97c07-ec80-40e6-919b-f1e2a353b5b5" containerName="manager" containerID="cri-o://591ba9fea6456bbc54d071a48b655922ced196947dd0fb66eb902ff265ba5b36" gracePeriod=10 Apr 16 20:38:05.912223 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:05.912203 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-54dcbffbf4-9cxbz" Apr 16 20:38:05.937684 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:05.937659 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtw9g\" (UniqueName: \"kubernetes.io/projected/6be97c07-ec80-40e6-919b-f1e2a353b5b5-kube-api-access-xtw9g\") pod \"6be97c07-ec80-40e6-919b-f1e2a353b5b5\" (UID: \"6be97c07-ec80-40e6-919b-f1e2a353b5b5\") " Apr 16 20:38:05.939748 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:05.939723 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6be97c07-ec80-40e6-919b-f1e2a353b5b5-kube-api-access-xtw9g" (OuterVolumeSpecName: "kube-api-access-xtw9g") pod "6be97c07-ec80-40e6-919b-f1e2a353b5b5" (UID: "6be97c07-ec80-40e6-919b-f1e2a353b5b5"). InnerVolumeSpecName "kube-api-access-xtw9g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:38:06.038332 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:06.038309 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xtw9g\" (UniqueName: \"kubernetes.io/projected/6be97c07-ec80-40e6-919b-f1e2a353b5b5-kube-api-access-xtw9g\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\"" Apr 16 20:38:06.665995 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:06.665925 2578 generic.go:358] "Generic (PLEG): container finished" podID="6be97c07-ec80-40e6-919b-f1e2a353b5b5" containerID="591ba9fea6456bbc54d071a48b655922ced196947dd0fb66eb902ff265ba5b36" exitCode=0 Apr 16 20:38:06.666415 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:06.665988 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-54dcbffbf4-9cxbz" event={"ID":"6be97c07-ec80-40e6-919b-f1e2a353b5b5","Type":"ContainerDied","Data":"591ba9fea6456bbc54d071a48b655922ced196947dd0fb66eb902ff265ba5b36"} Apr 16 20:38:06.666415 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:06.666039 2578 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="opendatahub/maas-controller-54dcbffbf4-9cxbz" event={"ID":"6be97c07-ec80-40e6-919b-f1e2a353b5b5","Type":"ContainerDied","Data":"982da99f3d00fc8b2048740b02c29820015ff12d1aca38df2d3209535d628578"} Apr 16 20:38:06.666415 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:06.666049 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-54dcbffbf4-9cxbz" Apr 16 20:38:06.666415 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:06.666058 2578 scope.go:117] "RemoveContainer" containerID="591ba9fea6456bbc54d071a48b655922ced196947dd0fb66eb902ff265ba5b36" Apr 16 20:38:06.674209 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:06.674186 2578 scope.go:117] "RemoveContainer" containerID="591ba9fea6456bbc54d071a48b655922ced196947dd0fb66eb902ff265ba5b36" Apr 16 20:38:06.674531 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:38:06.674507 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"591ba9fea6456bbc54d071a48b655922ced196947dd0fb66eb902ff265ba5b36\": container with ID starting with 591ba9fea6456bbc54d071a48b655922ced196947dd0fb66eb902ff265ba5b36 not found: ID does not exist" containerID="591ba9fea6456bbc54d071a48b655922ced196947dd0fb66eb902ff265ba5b36" Apr 16 20:38:06.674597 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:06.674541 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"591ba9fea6456bbc54d071a48b655922ced196947dd0fb66eb902ff265ba5b36"} err="failed to get container status \"591ba9fea6456bbc54d071a48b655922ced196947dd0fb66eb902ff265ba5b36\": rpc error: code = NotFound desc = could not find container \"591ba9fea6456bbc54d071a48b655922ced196947dd0fb66eb902ff265ba5b36\": container with ID starting with 591ba9fea6456bbc54d071a48b655922ced196947dd0fb66eb902ff265ba5b36 not found: ID does not exist" Apr 16 20:38:06.690856 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:06.690833 
2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-54dcbffbf4-9cxbz"] Apr 16 20:38:06.696336 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:06.696309 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-54dcbffbf4-9cxbz"] Apr 16 20:38:07.605054 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:07.605015 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6be97c07-ec80-40e6-919b-f1e2a353b5b5" path="/var/lib/kubelet/pods/6be97c07-ec80-40e6-919b-f1e2a353b5b5/volumes" Apr 16 20:38:18.101895 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.101860 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq"] Apr 16 20:38:18.102334 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.102210 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6be97c07-ec80-40e6-919b-f1e2a353b5b5" containerName="manager" Apr 16 20:38:18.102334 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.102224 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be97c07-ec80-40e6-919b-f1e2a353b5b5" containerName="manager" Apr 16 20:38:18.102334 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.102299 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6be97c07-ec80-40e6-919b-f1e2a353b5b5" containerName="manager" Apr 16 20:38:18.104478 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.104463 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.107842 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.107818 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-gmzfc\"" Apr 16 20:38:18.107988 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.107841 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 16 20:38:18.107988 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.107820 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 16 20:38:18.107988 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.107965 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 16 20:38:18.115705 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.115682 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq"] Apr 16 20:38:18.122459 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.122434 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: \"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.122563 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.122487 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: 
\"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.122563 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.122546 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: \"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.122633 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.122573 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prx59\" (UniqueName: \"kubernetes.io/projected/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-kube-api-access-prx59\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: \"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.122633 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.122607 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: \"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.122695 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.122635 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: 
\"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.223751 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.223724 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: \"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.223884 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.223759 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: \"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.223884 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.223792 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: \"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.223884 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.223816 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prx59\" (UniqueName: \"kubernetes.io/projected/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-kube-api-access-prx59\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: \"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.224102 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.223923 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: \"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.224102 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.223986 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: \"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.224215 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.224191 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: \"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.224272 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.224248 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: \"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.224362 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.224342 
2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: \"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.226149 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.226129 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: \"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.226149 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.226142 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: \"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.233065 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.233038 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prx59\" (UniqueName: \"kubernetes.io/projected/76ccf9ea-e5b1-4459-a8c0-7ec36631d056-kube-api-access-prx59\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq\" (UID: \"76ccf9ea-e5b1-4459-a8c0-7ec36631d056\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.421172 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.421107 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:18.544604 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.544580 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq"] Apr 16 20:38:18.547092 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:38:18.547066 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76ccf9ea_e5b1_4459_a8c0_7ec36631d056.slice/crio-aec09dc7e80ea11197b576c3d0c2ecfe526947e4fbae9515cad9fa78f403ae27 WatchSource:0}: Error finding container aec09dc7e80ea11197b576c3d0c2ecfe526947e4fbae9515cad9fa78f403ae27: Status 404 returned error can't find the container with id aec09dc7e80ea11197b576c3d0c2ecfe526947e4fbae9515cad9fa78f403ae27 Apr 16 20:38:18.702987 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:18.702904 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" event={"ID":"76ccf9ea-e5b1-4459-a8c0-7ec36631d056","Type":"ContainerStarted","Data":"aec09dc7e80ea11197b576c3d0c2ecfe526947e4fbae9515cad9fa78f403ae27"} Apr 16 20:38:24.723766 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:24.723726 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" event={"ID":"76ccf9ea-e5b1-4459-a8c0-7ec36631d056","Type":"ContainerStarted","Data":"6344eb19edaf79d44b03b32af28186458f3bfba09de195906710819a2bd9414e"} Apr 16 20:38:30.375976 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.375919 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd"] Apr 16 20:38:30.378833 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.378813 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.381294 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.381272 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 16 20:38:30.387141 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.387120 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd"] Apr 16 20:38:30.419544 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.419521 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8cc0b816-a429-4048-bf65-225f17afeec5-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.419627 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.419553 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8cc0b816-a429-4048-bf65-225f17afeec5-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.419627 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.419577 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8cc0b816-a429-4048-bf65-225f17afeec5-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.419627 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.419594 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8cc0b816-a429-4048-bf65-225f17afeec5-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.419729 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.419678 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cc0b816-a429-4048-bf65-225f17afeec5-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.419729 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.419716 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmxz2\" (UniqueName: \"kubernetes.io/projected/8cc0b816-a429-4048-bf65-225f17afeec5-kube-api-access-bmxz2\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.520853 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.520826 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cc0b816-a429-4048-bf65-225f17afeec5-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.520853 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.520855 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmxz2\" (UniqueName: 
\"kubernetes.io/projected/8cc0b816-a429-4048-bf65-225f17afeec5-kube-api-access-bmxz2\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.521052 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.520897 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8cc0b816-a429-4048-bf65-225f17afeec5-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.521052 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.520918 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8cc0b816-a429-4048-bf65-225f17afeec5-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.521052 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.520960 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8cc0b816-a429-4048-bf65-225f17afeec5-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.521052 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.520988 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8cc0b816-a429-4048-bf65-225f17afeec5-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 
20:38:30.521279 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.521230 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cc0b816-a429-4048-bf65-225f17afeec5-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.521339 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.521308 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8cc0b816-a429-4048-bf65-225f17afeec5-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.521398 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.521380 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8cc0b816-a429-4048-bf65-225f17afeec5-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.523105 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.523086 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8cc0b816-a429-4048-bf65-225f17afeec5-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.523355 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.523338 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8cc0b816-a429-4048-bf65-225f17afeec5-tls-certs\") pod 
\"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.530048 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.530025 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmxz2\" (UniqueName: \"kubernetes.io/projected/8cc0b816-a429-4048-bf65-225f17afeec5-kube-api-access-bmxz2\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd\" (UID: \"8cc0b816-a429-4048-bf65-225f17afeec5\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.690695 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.690644 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:30.748088 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.748046 2578 generic.go:358] "Generic (PLEG): container finished" podID="76ccf9ea-e5b1-4459-a8c0-7ec36631d056" containerID="6344eb19edaf79d44b03b32af28186458f3bfba09de195906710819a2bd9414e" exitCode=0 Apr 16 20:38:30.748202 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.748135 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" event={"ID":"76ccf9ea-e5b1-4459-a8c0-7ec36631d056","Type":"ContainerDied","Data":"6344eb19edaf79d44b03b32af28186458f3bfba09de195906710819a2bd9414e"} Apr 16 20:38:30.819110 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:30.818982 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd"] Apr 16 20:38:30.821335 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:38:30.821307 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cc0b816_a429_4048_bf65_225f17afeec5.slice/crio-39dd22d5af53f625989f0098bdc1f6374682f97e0c55d19f6985ec6c5fac1834 
WatchSource:0}: Error finding container 39dd22d5af53f625989f0098bdc1f6374682f97e0c55d19f6985ec6c5fac1834: Status 404 returned error can't find the container with id 39dd22d5af53f625989f0098bdc1f6374682f97e0c55d19f6985ec6c5fac1834 Apr 16 20:38:31.755102 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:31.754932 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" event={"ID":"8cc0b816-a429-4048-bf65-225f17afeec5","Type":"ContainerStarted","Data":"0e168c49099285a64cb61104dcb8b51675b21ec40e0e8a05797e491189e91412"} Apr 16 20:38:31.755102 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:31.755007 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" event={"ID":"8cc0b816-a429-4048-bf65-225f17afeec5","Type":"ContainerStarted","Data":"39dd22d5af53f625989f0098bdc1f6374682f97e0c55d19f6985ec6c5fac1834"} Apr 16 20:38:35.771189 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:35.771144 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" event={"ID":"76ccf9ea-e5b1-4459-a8c0-7ec36631d056","Type":"ContainerStarted","Data":"d23dae660648269b5b75dce54a3cd7fbb86353fe39d0f9616ef4f19b235a7fcf"} Apr 16 20:38:35.771609 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:35.771399 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:35.789507 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:35.789468 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" podStartSLOduration=1.274969274 podStartE2EDuration="17.789454169s" podCreationTimestamp="2026-04-16 20:38:18 +0000 UTC" firstStartedPulling="2026-04-16 20:38:18.548819375 +0000 UTC m=+653.520404343" lastFinishedPulling="2026-04-16 
20:38:35.06330427 +0000 UTC m=+670.034889238" observedRunningTime="2026-04-16 20:38:35.787819184 +0000 UTC m=+670.759404173" watchObservedRunningTime="2026-04-16 20:38:35.789454169 +0000 UTC m=+670.761039158" Apr 16 20:38:37.778716 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:37.778682 2578 generic.go:358] "Generic (PLEG): container finished" podID="8cc0b816-a429-4048-bf65-225f17afeec5" containerID="0e168c49099285a64cb61104dcb8b51675b21ec40e0e8a05797e491189e91412" exitCode=0 Apr 16 20:38:37.779089 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:37.778751 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" event={"ID":"8cc0b816-a429-4048-bf65-225f17afeec5","Type":"ContainerDied","Data":"0e168c49099285a64cb61104dcb8b51675b21ec40e0e8a05797e491189e91412"} Apr 16 20:38:38.783325 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:38.783297 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" event={"ID":"8cc0b816-a429-4048-bf65-225f17afeec5","Type":"ContainerStarted","Data":"ed0bd855650b715320275d127571ddacd9dd20288488ffa2d2853634883abbfb"} Apr 16 20:38:38.783713 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:38.783496 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:38.802206 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:38.802162 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" podStartSLOduration=8.61870736 podStartE2EDuration="8.802149807s" podCreationTimestamp="2026-04-16 20:38:30 +0000 UTC" firstStartedPulling="2026-04-16 20:38:37.779308118 +0000 UTC m=+672.750893086" lastFinishedPulling="2026-04-16 20:38:37.962750564 +0000 UTC m=+672.934335533" observedRunningTime="2026-04-16 20:38:38.800342463 +0000 UTC m=+673.771927487" 
watchObservedRunningTime="2026-04-16 20:38:38.802149807 +0000 UTC m=+673.773734856" Apr 16 20:38:46.792404 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:46.792374 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq" Apr 16 20:38:49.800250 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:49.800221 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd" Apr 16 20:38:50.677198 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.677159 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr"] Apr 16 20:38:50.679805 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.679781 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.682178 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.682157 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 16 20:38:50.690526 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.690506 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr"] Apr 16 20:38:50.784144 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.784118 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6636e968-8530-4940-b604-5f950383d282-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: \"6636e968-8530-4940-b604-5f950383d282\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.784284 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.784159 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/6636e968-8530-4940-b604-5f950383d282-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: \"6636e968-8530-4940-b604-5f950383d282\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.784284 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.784189 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4xph\" (UniqueName: \"kubernetes.io/projected/6636e968-8530-4940-b604-5f950383d282-kube-api-access-h4xph\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: \"6636e968-8530-4940-b604-5f950383d282\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.784284 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.784223 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6636e968-8530-4940-b604-5f950383d282-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: \"6636e968-8530-4940-b604-5f950383d282\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.784284 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.784250 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6636e968-8530-4940-b604-5f950383d282-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: \"6636e968-8530-4940-b604-5f950383d282\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.784284 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.784268 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6636e968-8530-4940-b604-5f950383d282-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: 
\"6636e968-8530-4940-b604-5f950383d282\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.884864 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.884786 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6636e968-8530-4940-b604-5f950383d282-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: \"6636e968-8530-4940-b604-5f950383d282\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.885274 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.884912 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4xph\" (UniqueName: \"kubernetes.io/projected/6636e968-8530-4940-b604-5f950383d282-kube-api-access-h4xph\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: \"6636e968-8530-4940-b604-5f950383d282\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.885274 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.884961 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6636e968-8530-4940-b604-5f950383d282-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: \"6636e968-8530-4940-b604-5f950383d282\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.885274 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.884999 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6636e968-8530-4940-b604-5f950383d282-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: \"6636e968-8530-4940-b604-5f950383d282\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.885274 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.885035 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6636e968-8530-4940-b604-5f950383d282-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: \"6636e968-8530-4940-b604-5f950383d282\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.885274 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.885100 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6636e968-8530-4940-b604-5f950383d282-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: \"6636e968-8530-4940-b604-5f950383d282\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.885562 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.885528 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6636e968-8530-4940-b604-5f950383d282-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: \"6636e968-8530-4940-b604-5f950383d282\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.885562 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.885543 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6636e968-8530-4940-b604-5f950383d282-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: \"6636e968-8530-4940-b604-5f950383d282\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.885730 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.885709 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6636e968-8530-4940-b604-5f950383d282-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: \"6636e968-8530-4940-b604-5f950383d282\") " 
pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.887120 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.887103 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6636e968-8530-4940-b604-5f950383d282-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: \"6636e968-8530-4940-b604-5f950383d282\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.887310 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.887293 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6636e968-8530-4940-b604-5f950383d282-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: \"6636e968-8530-4940-b604-5f950383d282\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.892682 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.892656 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4xph\" (UniqueName: \"kubernetes.io/projected/6636e968-8530-4940-b604-5f950383d282-kube-api-access-h4xph\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-p8mzr\" (UID: \"6636e968-8530-4940-b604-5f950383d282\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:50.990135 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:50.990082 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:51.130750 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:51.130710 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr"] Apr 16 20:38:51.132496 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:51.132479 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:38:51.826467 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:51.826434 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" event={"ID":"6636e968-8530-4940-b604-5f950383d282","Type":"ContainerStarted","Data":"fd7ae229750a992c6282a7dac8a756b73af9f173f8cb35a1bfad7b7b60763c53"} Apr 16 20:38:51.826631 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:51.826475 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" event={"ID":"6636e968-8530-4940-b604-5f950383d282","Type":"ContainerStarted","Data":"50e6f79335308c40c105181ba95d04c94a4aee26a4ad079cf76c70c2b9c1f19f"} Apr 16 20:38:56.572051 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.572021 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5"] Apr 16 20:38:56.580201 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.580028 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.582998 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.582761 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 16 20:38:56.584028 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.583984 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5"] Apr 16 20:38:56.738669 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.738642 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6a360e9-c092-45d7-912b-af887cdbe6a1-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.738869 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.738847 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6a360e9-c092-45d7-912b-af887cdbe6a1-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.739088 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.739068 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6a360e9-c092-45d7-912b-af887cdbe6a1-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.739233 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.739216 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6a360e9-c092-45d7-912b-af887cdbe6a1-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.739357 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.739340 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8pq7\" (UniqueName: \"kubernetes.io/projected/d6a360e9-c092-45d7-912b-af887cdbe6a1-kube-api-access-z8pq7\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.739542 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.739525 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d6a360e9-c092-45d7-912b-af887cdbe6a1-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.840229 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.840200 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8pq7\" (UniqueName: \"kubernetes.io/projected/d6a360e9-c092-45d7-912b-af887cdbe6a1-kube-api-access-z8pq7\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.840373 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.840249 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d6a360e9-c092-45d7-912b-af887cdbe6a1-model-cache\") pod 
\"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.840373 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.840317 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6a360e9-c092-45d7-912b-af887cdbe6a1-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.840373 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.840348 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6a360e9-c092-45d7-912b-af887cdbe6a1-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.840531 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.840393 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6a360e9-c092-45d7-912b-af887cdbe6a1-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.840531 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.840418 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6a360e9-c092-45d7-912b-af887cdbe6a1-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.840924 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.840894 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6a360e9-c092-45d7-912b-af887cdbe6a1-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.841240 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.841220 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6a360e9-c092-45d7-912b-af887cdbe6a1-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.841966 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.841931 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d6a360e9-c092-45d7-912b-af887cdbe6a1-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.843013 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.842989 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6a360e9-c092-45d7-912b-af887cdbe6a1-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.843615 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.843596 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6a360e9-c092-45d7-912b-af887cdbe6a1-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " 
pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.845430 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.845399 2578 generic.go:358] "Generic (PLEG): container finished" podID="6636e968-8530-4940-b604-5f950383d282" containerID="fd7ae229750a992c6282a7dac8a756b73af9f173f8cb35a1bfad7b7b60763c53" exitCode=0 Apr 16 20:38:56.845556 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.845537 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" event={"ID":"6636e968-8530-4940-b604-5f950383d282","Type":"ContainerDied","Data":"fd7ae229750a992c6282a7dac8a756b73af9f173f8cb35a1bfad7b7b60763c53"} Apr 16 20:38:56.848334 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.848314 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8pq7\" (UniqueName: \"kubernetes.io/projected/d6a360e9-c092-45d7-912b-af887cdbe6a1-kube-api-access-z8pq7\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5\" (UID: \"d6a360e9-c092-45d7-912b-af887cdbe6a1\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:56.947468 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:56.947445 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:38:57.083803 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:57.083777 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5"] Apr 16 20:38:57.086675 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:38:57.086644 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6a360e9_c092_45d7_912b_af887cdbe6a1.slice/crio-ac5be722e77f777467de5bb22c2596eb5a7c443d730d90e4094cccbf4dcf8811 WatchSource:0}: Error finding container ac5be722e77f777467de5bb22c2596eb5a7c443d730d90e4094cccbf4dcf8811: Status 404 returned error can't find the container with id ac5be722e77f777467de5bb22c2596eb5a7c443d730d90e4094cccbf4dcf8811 Apr 16 20:38:57.850844 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:57.850809 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" event={"ID":"6636e968-8530-4940-b604-5f950383d282","Type":"ContainerStarted","Data":"99ad0a1f92a7bc4f64b465a86254e3b5c4d0785abc274f589a63276de2031a1e"} Apr 16 20:38:57.851292 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:57.851030 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:38:57.852294 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:57.852274 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" event={"ID":"d6a360e9-c092-45d7-912b-af887cdbe6a1","Type":"ContainerStarted","Data":"df3961583088b75ae3df2e6f5a73c618937a707024e7271fd6b4f0aa22720d65"} Apr 16 20:38:57.852403 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:57.852297 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" 
event={"ID":"d6a360e9-c092-45d7-912b-af887cdbe6a1","Type":"ContainerStarted","Data":"ac5be722e77f777467de5bb22c2596eb5a7c443d730d90e4094cccbf4dcf8811"} Apr 16 20:38:57.870649 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:38:57.870602 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" podStartSLOduration=7.648633171 podStartE2EDuration="7.870590135s" podCreationTimestamp="2026-04-16 20:38:50 +0000 UTC" firstStartedPulling="2026-04-16 20:38:56.846302074 +0000 UTC m=+691.817887056" lastFinishedPulling="2026-04-16 20:38:57.068259052 +0000 UTC m=+692.039844020" observedRunningTime="2026-04-16 20:38:57.869304717 +0000 UTC m=+692.840889707" watchObservedRunningTime="2026-04-16 20:38:57.870590135 +0000 UTC m=+692.842175124" Apr 16 20:39:04.875020 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:04.874990 2578 generic.go:358] "Generic (PLEG): container finished" podID="d6a360e9-c092-45d7-912b-af887cdbe6a1" containerID="df3961583088b75ae3df2e6f5a73c618937a707024e7271fd6b4f0aa22720d65" exitCode=0 Apr 16 20:39:04.875376 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:04.875070 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" event={"ID":"d6a360e9-c092-45d7-912b-af887cdbe6a1","Type":"ContainerDied","Data":"df3961583088b75ae3df2e6f5a73c618937a707024e7271fd6b4f0aa22720d65"} Apr 16 20:39:05.878849 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:05.878819 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" event={"ID":"d6a360e9-c092-45d7-912b-af887cdbe6a1","Type":"ContainerStarted","Data":"42e8814fd8669422a2c38af5530228e1ed0be182554df359bf66ecdae90bcebd"} Apr 16 20:39:05.879210 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:05.879009 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 
20:39:05.897609 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:05.897559 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" podStartSLOduration=9.661801373 podStartE2EDuration="9.897547165s" podCreationTimestamp="2026-04-16 20:38:56 +0000 UTC" firstStartedPulling="2026-04-16 20:39:04.875788952 +0000 UTC m=+699.847373924" lastFinishedPulling="2026-04-16 20:39:05.111534737 +0000 UTC m=+700.083119716" observedRunningTime="2026-04-16 20:39:05.895396288 +0000 UTC m=+700.866981277" watchObservedRunningTime="2026-04-16 20:39:05.897547165 +0000 UTC m=+700.869132154" Apr 16 20:39:08.868833 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:08.868804 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-p8mzr" Apr 16 20:39:16.901760 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:16.901681 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5" Apr 16 20:39:29.384258 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.384218 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg"] Apr 16 20:39:29.388752 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.388730 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.391250 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.391230 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 16 20:39:29.395106 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.395085 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg"] Apr 16 20:39:29.476031 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.476002 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.476171 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.476033 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.476171 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.476061 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.476171 
ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.476098 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5vv7\" (UniqueName: \"kubernetes.io/projected/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-kube-api-access-m5vv7\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.476289 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.476177 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.476289 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.476230 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.577186 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.577162 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.577307 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.577204 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.577307 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.577233 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.577307 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.577275 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.577465 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.577307 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5vv7\" (UniqueName: \"kubernetes.io/projected/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-kube-api-access-m5vv7\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.577465 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.577355 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-dshm\") pod 
\"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.577582 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.577559 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.577696 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.577604 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.577696 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.577628 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.579444 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.579424 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.579742 
ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.579712 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.585535 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.585514 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5vv7\" (UniqueName: \"kubernetes.io/projected/3b4faad2-bf5f-4a3b-97ab-945ac37edd2a-kube-api-access-m5vv7\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-czkvg\" (UID: \"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.698512 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.698449 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:29.828506 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.828482 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg"] Apr 16 20:39:29.830470 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:39:29.830440 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b4faad2_bf5f_4a3b_97ab_945ac37edd2a.slice/crio-7a675b3e67aef9bcdd959b4e4a4d5033eadee4809cb152229c7325b2f09cb2c2 WatchSource:0}: Error finding container 7a675b3e67aef9bcdd959b4e4a4d5033eadee4809cb152229c7325b2f09cb2c2: Status 404 returned error can't find the container with id 7a675b3e67aef9bcdd959b4e4a4d5033eadee4809cb152229c7325b2f09cb2c2 Apr 16 20:39:29.967730 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.967668 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" event={"ID":"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a","Type":"ContainerStarted","Data":"3a20acfbee2632699fa2a49417e8ebf4c4338dd4693da140c9058e13053e7b65"} Apr 16 20:39:29.967730 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:29.967698 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" event={"ID":"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a","Type":"ContainerStarted","Data":"7a675b3e67aef9bcdd959b4e4a4d5033eadee4809cb152229c7325b2f09cb2c2"} Apr 16 20:39:31.196865 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.196828 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g"] Apr 16 20:39:31.200376 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.200353 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.202658 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.202637 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 16 20:39:31.217180 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.211196 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g"] Apr 16 20:39:31.291344 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.291318 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/daec2562-b4b1-407a-8355-609c9a4b14bd-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.291504 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.291357 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/daec2562-b4b1-407a-8355-609c9a4b14bd-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.291504 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.291379 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/daec2562-b4b1-407a-8355-609c9a4b14bd-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.291504 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.291454 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slk4k\" (UniqueName: \"kubernetes.io/projected/daec2562-b4b1-407a-8355-609c9a4b14bd-kube-api-access-slk4k\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.291504 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.291485 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/daec2562-b4b1-407a-8355-609c9a4b14bd-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.291504 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.291503 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/daec2562-b4b1-407a-8355-609c9a4b14bd-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.392723 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.392688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-slk4k\" (UniqueName: \"kubernetes.io/projected/daec2562-b4b1-407a-8355-609c9a4b14bd-kube-api-access-slk4k\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.392879 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.392729 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/daec2562-b4b1-407a-8355-609c9a4b14bd-dshm\") pod 
\"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.392879 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.392749 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/daec2562-b4b1-407a-8355-609c9a4b14bd-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.392879 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.392830 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/daec2562-b4b1-407a-8355-609c9a4b14bd-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.392879 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.392875 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/daec2562-b4b1-407a-8355-609c9a4b14bd-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.393117 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.392905 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/daec2562-b4b1-407a-8355-609c9a4b14bd-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.393293 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.393266 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/daec2562-b4b1-407a-8355-609c9a4b14bd-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.393293 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.393278 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/daec2562-b4b1-407a-8355-609c9a4b14bd-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.393489 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.393462 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/daec2562-b4b1-407a-8355-609c9a4b14bd-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.395329 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.395304 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/daec2562-b4b1-407a-8355-609c9a4b14bd-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.395425 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.395400 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/daec2562-b4b1-407a-8355-609c9a4b14bd-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " 
pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.400378 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.400356 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-slk4k\" (UniqueName: \"kubernetes.io/projected/daec2562-b4b1-407a-8355-609c9a4b14bd-kube-api-access-slk4k\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-6nz9g\" (UID: \"daec2562-b4b1-407a-8355-609c9a4b14bd\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.512487 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.512423 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:31.647288 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.647263 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g"] Apr 16 20:39:31.649808 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:39:31.649765 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaec2562_b4b1_407a_8355_609c9a4b14bd.slice/crio-c403c471b0f1b752a2c001ccd49992bdc216fc033fa07544f3e6dcaf15b8692d WatchSource:0}: Error finding container c403c471b0f1b752a2c001ccd49992bdc216fc033fa07544f3e6dcaf15b8692d: Status 404 returned error can't find the container with id c403c471b0f1b752a2c001ccd49992bdc216fc033fa07544f3e6dcaf15b8692d Apr 16 20:39:31.975275 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.975243 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" event={"ID":"daec2562-b4b1-407a-8355-609c9a4b14bd","Type":"ContainerStarted","Data":"92898511e19bbed95609a227f93f94b22d8faab269b40a98a1a0759e48effd7c"} Apr 16 20:39:31.975275 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:31.975280 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" event={"ID":"daec2562-b4b1-407a-8355-609c9a4b14bd","Type":"ContainerStarted","Data":"c403c471b0f1b752a2c001ccd49992bdc216fc033fa07544f3e6dcaf15b8692d"} Apr 16 20:39:38.000029 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:37.999998 2578 generic.go:358] "Generic (PLEG): container finished" podID="3b4faad2-bf5f-4a3b-97ab-945ac37edd2a" containerID="3a20acfbee2632699fa2a49417e8ebf4c4338dd4693da140c9058e13053e7b65" exitCode=0 Apr 16 20:39:38.000404 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:38.000077 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" event={"ID":"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a","Type":"ContainerDied","Data":"3a20acfbee2632699fa2a49417e8ebf4c4338dd4693da140c9058e13053e7b65"} Apr 16 20:39:38.001623 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:38.001601 2578 generic.go:358] "Generic (PLEG): container finished" podID="daec2562-b4b1-407a-8355-609c9a4b14bd" containerID="92898511e19bbed95609a227f93f94b22d8faab269b40a98a1a0759e48effd7c" exitCode=0 Apr 16 20:39:38.001686 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:38.001654 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" event={"ID":"daec2562-b4b1-407a-8355-609c9a4b14bd","Type":"ContainerDied","Data":"92898511e19bbed95609a227f93f94b22d8faab269b40a98a1a0759e48effd7c"} Apr 16 20:39:39.006187 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:39.006150 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" event={"ID":"3b4faad2-bf5f-4a3b-97ab-945ac37edd2a","Type":"ContainerStarted","Data":"851fa3d398c29b75000288c1f706cde7b43017d1e3e0064d41f6dabd07767871"} Apr 16 20:39:39.006630 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:39.006357 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:39.010302 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:39.010274 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" event={"ID":"daec2562-b4b1-407a-8355-609c9a4b14bd","Type":"ContainerStarted","Data":"067c9ec00c70ebf61fc83d4c072d663e2c0993c14ab898f31e10c4d74e8fdc91"} Apr 16 20:39:39.010516 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:39.010484 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:39:39.026127 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:39.026086 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" podStartSLOduration=9.802023618 podStartE2EDuration="10.026073136s" podCreationTimestamp="2026-04-16 20:39:29 +0000 UTC" firstStartedPulling="2026-04-16 20:39:38.000854127 +0000 UTC m=+732.972439106" lastFinishedPulling="2026-04-16 20:39:38.224903653 +0000 UTC m=+733.196488624" observedRunningTime="2026-04-16 20:39:39.024535723 +0000 UTC m=+733.996120714" watchObservedRunningTime="2026-04-16 20:39:39.026073136 +0000 UTC m=+733.997658125" Apr 16 20:39:39.043760 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:39.043716 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" podStartSLOduration=7.846049251 podStartE2EDuration="8.043702095s" podCreationTimestamp="2026-04-16 20:39:31 +0000 UTC" firstStartedPulling="2026-04-16 20:39:38.002192724 +0000 UTC m=+732.973777691" lastFinishedPulling="2026-04-16 20:39:38.199845568 +0000 UTC m=+733.171430535" observedRunningTime="2026-04-16 20:39:39.043296318 +0000 UTC m=+734.014881309" watchObservedRunningTime="2026-04-16 20:39:39.043702095 +0000 UTC m=+734.015287085" Apr 16 20:39:50.026554 ip-10-0-137-53 
kubenswrapper[2578]: I0416 20:39:50.026526 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-czkvg" Apr 16 20:39:50.027369 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:39:50.027350 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-6nz9g" Apr 16 20:41:22.993827 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:41:22.993710 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-96d5c496f-6kfhh"] Apr 16 20:41:22.994465 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:41:22.994052 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-96d5c496f-6kfhh" podUID="7d116cc3-ef4b-42c4-9230-5b79198281f3" containerName="manager" containerID="cri-o://348a6501f5267e8ac9f7e812f22211401ecdb5c2ebe146d5362474b0dcc7cd7d" gracePeriod=10 Apr 16 20:41:23.233862 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:41:23.233841 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-96d5c496f-6kfhh" Apr 16 20:41:23.321866 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:41:23.321795 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmr8n\" (UniqueName: \"kubernetes.io/projected/7d116cc3-ef4b-42c4-9230-5b79198281f3-kube-api-access-jmr8n\") pod \"7d116cc3-ef4b-42c4-9230-5b79198281f3\" (UID: \"7d116cc3-ef4b-42c4-9230-5b79198281f3\") " Apr 16 20:41:23.323668 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:41:23.323646 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d116cc3-ef4b-42c4-9230-5b79198281f3-kube-api-access-jmr8n" (OuterVolumeSpecName: "kube-api-access-jmr8n") pod "7d116cc3-ef4b-42c4-9230-5b79198281f3" (UID: "7d116cc3-ef4b-42c4-9230-5b79198281f3"). InnerVolumeSpecName "kube-api-access-jmr8n". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:41:23.362191 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:41:23.362164 2578 generic.go:358] "Generic (PLEG): container finished" podID="7d116cc3-ef4b-42c4-9230-5b79198281f3" containerID="348a6501f5267e8ac9f7e812f22211401ecdb5c2ebe146d5362474b0dcc7cd7d" exitCode=0 Apr 16 20:41:23.362303 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:41:23.362224 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-96d5c496f-6kfhh" Apr 16 20:41:23.362303 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:41:23.362234 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-96d5c496f-6kfhh" event={"ID":"7d116cc3-ef4b-42c4-9230-5b79198281f3","Type":"ContainerDied","Data":"348a6501f5267e8ac9f7e812f22211401ecdb5c2ebe146d5362474b0dcc7cd7d"} Apr 16 20:41:23.362303 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:41:23.362278 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-96d5c496f-6kfhh" event={"ID":"7d116cc3-ef4b-42c4-9230-5b79198281f3","Type":"ContainerDied","Data":"9ea09f376e07e5f7769d996130452e110b82d90ceaab6643f091012be5cbc9d6"} Apr 16 20:41:23.362303 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:41:23.362298 2578 scope.go:117] "RemoveContainer" containerID="348a6501f5267e8ac9f7e812f22211401ecdb5c2ebe146d5362474b0dcc7cd7d" Apr 16 20:41:23.371096 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:41:23.371077 2578 scope.go:117] "RemoveContainer" containerID="348a6501f5267e8ac9f7e812f22211401ecdb5c2ebe146d5362474b0dcc7cd7d" Apr 16 20:41:23.371338 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:41:23.371320 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"348a6501f5267e8ac9f7e812f22211401ecdb5c2ebe146d5362474b0dcc7cd7d\": container with ID starting with 348a6501f5267e8ac9f7e812f22211401ecdb5c2ebe146d5362474b0dcc7cd7d not found: ID does 
not exist" containerID="348a6501f5267e8ac9f7e812f22211401ecdb5c2ebe146d5362474b0dcc7cd7d" Apr 16 20:41:23.371400 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:41:23.371345 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"348a6501f5267e8ac9f7e812f22211401ecdb5c2ebe146d5362474b0dcc7cd7d"} err="failed to get container status \"348a6501f5267e8ac9f7e812f22211401ecdb5c2ebe146d5362474b0dcc7cd7d\": rpc error: code = NotFound desc = could not find container \"348a6501f5267e8ac9f7e812f22211401ecdb5c2ebe146d5362474b0dcc7cd7d\": container with ID starting with 348a6501f5267e8ac9f7e812f22211401ecdb5c2ebe146d5362474b0dcc7cd7d not found: ID does not exist" Apr 16 20:41:23.388824 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:41:23.388659 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-96d5c496f-6kfhh"] Apr 16 20:41:23.388824 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:41:23.388704 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-96d5c496f-6kfhh"] Apr 16 20:41:23.422525 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:41:23.422504 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jmr8n\" (UniqueName: \"kubernetes.io/projected/7d116cc3-ef4b-42c4-9230-5b79198281f3-kube-api-access-jmr8n\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\"" Apr 16 20:41:23.605534 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:41:23.605506 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d116cc3-ef4b-42c4-9230-5b79198281f3" path="/var/lib/kubelet/pods/7d116cc3-ef4b-42c4-9230-5b79198281f3/volumes" Apr 16 20:51:35.913881 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:35.913797 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs"] Apr 16 20:51:35.916248 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:35.914112 2578 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" podUID="a319fb83-b031-424e-bd0c-33be18ea0e0f" containerName="manager" containerID="cri-o://addbf3f75464f6400527505f0c6a6d84ee33a8b04d9ea19758c8f0b1e57b0bb6" gracePeriod=10 Apr 16 20:51:36.156485 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:36.156465 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" Apr 16 20:51:36.298139 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:36.298068 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxqr\" (UniqueName: \"kubernetes.io/projected/a319fb83-b031-424e-bd0c-33be18ea0e0f-kube-api-access-pcxqr\") pod \"a319fb83-b031-424e-bd0c-33be18ea0e0f\" (UID: \"a319fb83-b031-424e-bd0c-33be18ea0e0f\") " Apr 16 20:51:36.298273 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:36.298153 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a319fb83-b031-424e-bd0c-33be18ea0e0f-extensions-socket-volume\") pod \"a319fb83-b031-424e-bd0c-33be18ea0e0f\" (UID: \"a319fb83-b031-424e-bd0c-33be18ea0e0f\") " Apr 16 20:51:36.298479 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:36.298455 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a319fb83-b031-424e-bd0c-33be18ea0e0f-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "a319fb83-b031-424e-bd0c-33be18ea0e0f" (UID: "a319fb83-b031-424e-bd0c-33be18ea0e0f"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:51:36.300089 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:36.300062 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a319fb83-b031-424e-bd0c-33be18ea0e0f-kube-api-access-pcxqr" (OuterVolumeSpecName: "kube-api-access-pcxqr") pod "a319fb83-b031-424e-bd0c-33be18ea0e0f" (UID: "a319fb83-b031-424e-bd0c-33be18ea0e0f"). InnerVolumeSpecName "kube-api-access-pcxqr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:51:36.399332 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:36.399310 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pcxqr\" (UniqueName: \"kubernetes.io/projected/a319fb83-b031-424e-bd0c-33be18ea0e0f-kube-api-access-pcxqr\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\"" Apr 16 20:51:36.399332 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:36.399330 2578 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a319fb83-b031-424e-bd0c-33be18ea0e0f-extensions-socket-volume\") on node \"ip-10-0-137-53.ec2.internal\" DevicePath \"\"" Apr 16 20:51:36.403162 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:36.403136 2578 generic.go:358] "Generic (PLEG): container finished" podID="a319fb83-b031-424e-bd0c-33be18ea0e0f" containerID="addbf3f75464f6400527505f0c6a6d84ee33a8b04d9ea19758c8f0b1e57b0bb6" exitCode=0 Apr 16 20:51:36.403274 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:36.403195 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" Apr 16 20:51:36.403274 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:36.403203 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" event={"ID":"a319fb83-b031-424e-bd0c-33be18ea0e0f","Type":"ContainerDied","Data":"addbf3f75464f6400527505f0c6a6d84ee33a8b04d9ea19758c8f0b1e57b0bb6"} Apr 16 20:51:36.403274 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:36.403239 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs" event={"ID":"a319fb83-b031-424e-bd0c-33be18ea0e0f","Type":"ContainerDied","Data":"cead85bdbd4ed56835662d2fae022e9c59b1f164d3980a2c87523af18fcd549e"} Apr 16 20:51:36.403274 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:36.403260 2578 scope.go:117] "RemoveContainer" containerID="addbf3f75464f6400527505f0c6a6d84ee33a8b04d9ea19758c8f0b1e57b0bb6" Apr 16 20:51:36.411661 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:36.411646 2578 scope.go:117] "RemoveContainer" containerID="addbf3f75464f6400527505f0c6a6d84ee33a8b04d9ea19758c8f0b1e57b0bb6" Apr 16 20:51:36.411879 ip-10-0-137-53 kubenswrapper[2578]: E0416 20:51:36.411863 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"addbf3f75464f6400527505f0c6a6d84ee33a8b04d9ea19758c8f0b1e57b0bb6\": container with ID starting with addbf3f75464f6400527505f0c6a6d84ee33a8b04d9ea19758c8f0b1e57b0bb6 not found: ID does not exist" containerID="addbf3f75464f6400527505f0c6a6d84ee33a8b04d9ea19758c8f0b1e57b0bb6" Apr 16 20:51:36.411932 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:36.411889 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"addbf3f75464f6400527505f0c6a6d84ee33a8b04d9ea19758c8f0b1e57b0bb6"} err="failed to get container status 
\"addbf3f75464f6400527505f0c6a6d84ee33a8b04d9ea19758c8f0b1e57b0bb6\": rpc error: code = NotFound desc = could not find container \"addbf3f75464f6400527505f0c6a6d84ee33a8b04d9ea19758c8f0b1e57b0bb6\": container with ID starting with addbf3f75464f6400527505f0c6a6d84ee33a8b04d9ea19758c8f0b1e57b0bb6 not found: ID does not exist" Apr 16 20:51:36.428293 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:36.428266 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs"] Apr 16 20:51:36.433090 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:36.433069 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-52lbs"] Apr 16 20:51:37.606278 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:51:37.606249 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a319fb83-b031-424e-bd0c-33be18ea0e0f" path="/var/lib/kubelet/pods/a319fb83-b031-424e-bd0c-33be18ea0e0f/volumes" Apr 16 20:52:42.028303 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.028267 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4"] Apr 16 20:52:42.028723 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.028583 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d116cc3-ef4b-42c4-9230-5b79198281f3" containerName="manager" Apr 16 20:52:42.028723 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.028594 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d116cc3-ef4b-42c4-9230-5b79198281f3" containerName="manager" Apr 16 20:52:42.028723 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.028606 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a319fb83-b031-424e-bd0c-33be18ea0e0f" containerName="manager" Apr 16 20:52:42.028723 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.028632 2578 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="a319fb83-b031-424e-bd0c-33be18ea0e0f" containerName="manager" Apr 16 20:52:42.028723 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.028698 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a319fb83-b031-424e-bd0c-33be18ea0e0f" containerName="manager" Apr 16 20:52:42.028723 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.028707 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d116cc3-ef4b-42c4-9230-5b79198281f3" containerName="manager" Apr 16 20:52:42.031584 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.031568 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4" Apr 16 20:52:42.034695 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.034670 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 20:52:42.034823 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.034670 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-797v8\"" Apr 16 20:52:42.034823 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.034671 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 20:52:42.048361 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.048331 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4"] Apr 16 20:52:42.105161 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.105135 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96jl5\" (UniqueName: \"kubernetes.io/projected/2e6d4498-3852-46fc-8435-fa28a21be0a4-kube-api-access-96jl5\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4\" (UID: 
\"2e6d4498-3852-46fc-8435-fa28a21be0a4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4" Apr 16 20:52:42.105253 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.105171 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2e6d4498-3852-46fc-8435-fa28a21be0a4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4\" (UID: \"2e6d4498-3852-46fc-8435-fa28a21be0a4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4" Apr 16 20:52:42.206027 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.206004 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96jl5\" (UniqueName: \"kubernetes.io/projected/2e6d4498-3852-46fc-8435-fa28a21be0a4-kube-api-access-96jl5\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4\" (UID: \"2e6d4498-3852-46fc-8435-fa28a21be0a4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4" Apr 16 20:52:42.206140 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.206038 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2e6d4498-3852-46fc-8435-fa28a21be0a4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4\" (UID: \"2e6d4498-3852-46fc-8435-fa28a21be0a4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4" Apr 16 20:52:42.206386 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.206364 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2e6d4498-3852-46fc-8435-fa28a21be0a4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4\" (UID: \"2e6d4498-3852-46fc-8435-fa28a21be0a4\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4" Apr 16 20:52:42.222037 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.222016 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96jl5\" (UniqueName: \"kubernetes.io/projected/2e6d4498-3852-46fc-8435-fa28a21be0a4-kube-api-access-96jl5\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4\" (UID: \"2e6d4498-3852-46fc-8435-fa28a21be0a4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4" Apr 16 20:52:42.342671 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.342651 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4" Apr 16 20:52:42.466612 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.466583 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4"] Apr 16 20:52:42.468880 ip-10-0-137-53 kubenswrapper[2578]: W0416 20:52:42.468848 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e6d4498_3852_46fc_8435_fa28a21be0a4.slice/crio-e7591260ffdb34c8ec2857e23565cc1fbfe40b6b4ad74995385651263c67c8df WatchSource:0}: Error finding container e7591260ffdb34c8ec2857e23565cc1fbfe40b6b4ad74995385651263c67c8df: Status 404 returned error can't find the container with id e7591260ffdb34c8ec2857e23565cc1fbfe40b6b4ad74995385651263c67c8df Apr 16 20:52:42.471163 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.471147 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:52:42.633151 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.633082 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4" 
event={"ID":"2e6d4498-3852-46fc-8435-fa28a21be0a4","Type":"ContainerStarted","Data":"b90f48a31edd14a374dd34750234465aaadea50563afda2dc1c956a0178a1f01"} Apr 16 20:52:42.633151 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.633118 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4" event={"ID":"2e6d4498-3852-46fc-8435-fa28a21be0a4","Type":"ContainerStarted","Data":"e7591260ffdb34c8ec2857e23565cc1fbfe40b6b4ad74995385651263c67c8df"} Apr 16 20:52:42.633311 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.633240 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4" Apr 16 20:52:42.654260 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:42.654222 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4" podStartSLOduration=0.654208452 podStartE2EDuration="654.208452ms" podCreationTimestamp="2026-04-16 20:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:52:42.653304835 +0000 UTC m=+1517.624889837" watchObservedRunningTime="2026-04-16 20:52:42.654208452 +0000 UTC m=+1517.625793442" Apr 16 20:52:53.638924 ip-10-0-137-53 kubenswrapper[2578]: I0416 20:52:53.638886 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4" Apr 16 21:02:27.185602 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:27.185569 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-cqtn2_c16e2366-e1de-43df-9748-293440238351/manager/0.log" Apr 16 21:02:27.303372 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:27.303344 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_maas-api-664d5d4d98-jtpgq_e473e5bc-9222-4108-ab8f-d7caeffcfb06/maas-api/0.log" Apr 16 21:02:27.536430 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:27.536365 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-czrzg_7ec644c2-6575-433d-bff7-4602a99b9453/manager/2.log" Apr 16 21:02:27.943957 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:27.943913 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cc777b675-fkh94_83f05402-452a-4fe4-b250-fb4c7575b53c/manager/0.log" Apr 16 21:02:28.125986 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:28.125937 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-n7bdg_59849ccb-da40-46da-bade-0bf06395a88f/postgres/0.log" Apr 16 21:02:30.029786 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:30.029754 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4_2e6d4498-3852-46fc-8435-fa28a21be0a4/manager/0.log" Apr 16 21:02:30.778125 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:30.778095 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-72z8n_03faef76-6068-411c-8010-ca4f5dcfafe0/discovery/0.log" Apr 16 21:02:30.893775 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:30.893751 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-86666bf97-6kwjn_9fe0099e-dd6d-420b-aba3-0e68ed045212/kube-auth-proxy/0.log" Apr 16 21:02:31.564399 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:31.564369 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5_d6a360e9-c092-45d7-912b-af887cdbe6a1/storage-initializer/0.log" Apr 16 21:02:31.573742 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:31.573724 2578 log.go:25] 
"Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-ql8b5_d6a360e9-c092-45d7-912b-af887cdbe6a1/main/0.log" Apr 16 21:02:31.681405 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:31.681382 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-p8mzr_6636e968-8530-4940-b604-5f950383d282/storage-initializer/0.log" Apr 16 21:02:31.693984 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:31.693964 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-p8mzr_6636e968-8530-4940-b604-5f950383d282/main/0.log" Apr 16 21:02:31.801993 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:31.801967 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-6nz9g_daec2562-b4b1-407a-8355-609c9a4b14bd/storage-initializer/0.log" Apr 16 21:02:31.810780 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:31.810762 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-6nz9g_daec2562-b4b1-407a-8355-609c9a4b14bd/main/0.log" Apr 16 21:02:31.916556 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:31.916535 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq_76ccf9ea-e5b1-4459-a8c0-7ec36631d056/storage-initializer/0.log" Apr 16 21:02:31.925522 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:31.925499 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc8g5wq_76ccf9ea-e5b1-4459-a8c0-7ec36631d056/main/0.log" Apr 16 21:02:32.041580 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:32.041559 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd_8cc0b816-a429-4048-bf65-225f17afeec5/main/0.log" Apr 16 21:02:32.054671 ip-10-0-137-53 
kubenswrapper[2578]: I0416 21:02:32.054647 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-kcxpd_8cc0b816-a429-4048-bf65-225f17afeec5/storage-initializer/0.log" Apr 16 21:02:32.177732 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:32.177665 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-czkvg_3b4faad2-bf5f-4a3b-97ab-945ac37edd2a/storage-initializer/0.log" Apr 16 21:02:32.190430 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:32.190406 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-czkvg_3b4faad2-bf5f-4a3b-97ab-945ac37edd2a/main/0.log" Apr 16 21:02:39.428458 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:39.428419 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-25wtt_469d4528-df9f-44e0-a586-1d85f9f8a444/global-pull-secret-syncer/0.log" Apr 16 21:02:39.630355 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:39.630325 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ks4qt_42e24607-0fa8-4092-8d9d-a523e82990ff/konnectivity-agent/0.log" Apr 16 21:02:39.707734 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:39.707667 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-53.ec2.internal_dbf50ce40dd2f1fe9119f12e04feaaf5/haproxy/0.log" Apr 16 21:02:44.266885 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:44.266842 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-4kwx4_2e6d4498-3852-46fc-8435-fa28a21be0a4/manager/0.log" Apr 16 21:02:46.286076 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:46.286054 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-9h69d_f3252ab7-2575-4775-9f42-d15f54edfc88/node-exporter/0.log" Apr 16 21:02:46.304351 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:46.304327 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9h69d_f3252ab7-2575-4775-9f42-d15f54edfc88/kube-rbac-proxy/0.log" Apr 16 21:02:46.326360 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:46.326342 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9h69d_f3252ab7-2575-4775-9f42-d15f54edfc88/init-textfile/0.log" Apr 16 21:02:47.945721 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:47.945685 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"] Apr 16 21:02:47.949670 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:47.949646 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st" Apr 16 21:02:47.952327 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:47.952306 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hdxxb\"/\"openshift-service-ca.crt\"" Apr 16 21:02:47.952447 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:47.952341 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-hdxxb\"/\"default-dockercfg-brd7j\"" Apr 16 21:02:47.953277 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:47.953255 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hdxxb\"/\"kube-root-ca.crt\"" Apr 16 21:02:47.955655 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:47.955636 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"] Apr 16 21:02:47.958393 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:47.958372 2578 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-8s6gb_35d37cfc-3882-4be0-ac36-d1be745ae717/networking-console-plugin/0.log"
Apr 16 21:02:48.024599 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.024574 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c44db84-ea9d-4bb7-804d-7d8c29fc2b75-sys\") pod \"perf-node-gather-daemonset-hz7st\" (UID: \"1c44db84-ea9d-4bb7-804d-7d8c29fc2b75\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:48.024695 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.024626 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1c44db84-ea9d-4bb7-804d-7d8c29fc2b75-podres\") pod \"perf-node-gather-daemonset-hz7st\" (UID: \"1c44db84-ea9d-4bb7-804d-7d8c29fc2b75\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:48.024695 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.024656 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2jvb\" (UniqueName: \"kubernetes.io/projected/1c44db84-ea9d-4bb7-804d-7d8c29fc2b75-kube-api-access-t2jvb\") pod \"perf-node-gather-daemonset-hz7st\" (UID: \"1c44db84-ea9d-4bb7-804d-7d8c29fc2b75\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:48.024695 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.024673 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1c44db84-ea9d-4bb7-804d-7d8c29fc2b75-proc\") pod \"perf-node-gather-daemonset-hz7st\" (UID: \"1c44db84-ea9d-4bb7-804d-7d8c29fc2b75\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:48.024695 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.024691 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c44db84-ea9d-4bb7-804d-7d8c29fc2b75-lib-modules\") pod \"perf-node-gather-daemonset-hz7st\" (UID: \"1c44db84-ea9d-4bb7-804d-7d8c29fc2b75\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:48.125735 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.125710 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1c44db84-ea9d-4bb7-804d-7d8c29fc2b75-podres\") pod \"perf-node-gather-daemonset-hz7st\" (UID: \"1c44db84-ea9d-4bb7-804d-7d8c29fc2b75\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:48.125827 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.125755 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2jvb\" (UniqueName: \"kubernetes.io/projected/1c44db84-ea9d-4bb7-804d-7d8c29fc2b75-kube-api-access-t2jvb\") pod \"perf-node-gather-daemonset-hz7st\" (UID: \"1c44db84-ea9d-4bb7-804d-7d8c29fc2b75\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:48.125827 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.125780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1c44db84-ea9d-4bb7-804d-7d8c29fc2b75-proc\") pod \"perf-node-gather-daemonset-hz7st\" (UID: \"1c44db84-ea9d-4bb7-804d-7d8c29fc2b75\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:48.125827 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.125810 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c44db84-ea9d-4bb7-804d-7d8c29fc2b75-lib-modules\") pod \"perf-node-gather-daemonset-hz7st\" (UID: \"1c44db84-ea9d-4bb7-804d-7d8c29fc2b75\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:48.125969 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.125849 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c44db84-ea9d-4bb7-804d-7d8c29fc2b75-sys\") pod \"perf-node-gather-daemonset-hz7st\" (UID: \"1c44db84-ea9d-4bb7-804d-7d8c29fc2b75\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:48.125969 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.125870 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1c44db84-ea9d-4bb7-804d-7d8c29fc2b75-podres\") pod \"perf-node-gather-daemonset-hz7st\" (UID: \"1c44db84-ea9d-4bb7-804d-7d8c29fc2b75\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:48.125969 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.125905 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1c44db84-ea9d-4bb7-804d-7d8c29fc2b75-proc\") pod \"perf-node-gather-daemonset-hz7st\" (UID: \"1c44db84-ea9d-4bb7-804d-7d8c29fc2b75\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:48.125969 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.125930 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c44db84-ea9d-4bb7-804d-7d8c29fc2b75-sys\") pod \"perf-node-gather-daemonset-hz7st\" (UID: \"1c44db84-ea9d-4bb7-804d-7d8c29fc2b75\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:48.126098 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.125972 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c44db84-ea9d-4bb7-804d-7d8c29fc2b75-lib-modules\") pod \"perf-node-gather-daemonset-hz7st\" (UID: \"1c44db84-ea9d-4bb7-804d-7d8c29fc2b75\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:48.134126 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.134099 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2jvb\" (UniqueName: \"kubernetes.io/projected/1c44db84-ea9d-4bb7-804d-7d8c29fc2b75-kube-api-access-t2jvb\") pod \"perf-node-gather-daemonset-hz7st\" (UID: \"1c44db84-ea9d-4bb7-804d-7d8c29fc2b75\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:48.261168 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.261109 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:48.386003 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.385980 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"]
Apr 16 21:02:48.388481 ip-10-0-137-53 kubenswrapper[2578]: W0416 21:02:48.388452 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1c44db84_ea9d_4bb7_804d_7d8c29fc2b75.slice/crio-9423cb8d4258b495b88c2cc9b60bb845fb8d7ed0872d327dc2552763a37c9b35 WatchSource:0}: Error finding container 9423cb8d4258b495b88c2cc9b60bb845fb8d7ed0872d327dc2552763a37c9b35: Status 404 returned error can't find the container with id 9423cb8d4258b495b88c2cc9b60bb845fb8d7ed0872d327dc2552763a37c9b35
Apr 16 21:02:48.390422 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.390400 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 21:02:48.659523 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.659481 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st" event={"ID":"1c44db84-ea9d-4bb7-804d-7d8c29fc2b75","Type":"ContainerStarted","Data":"8cc489b7ba58dc8ee154c086543d94d7d3e0db1c486f09962de955ff6577a693"}
Apr 16 21:02:48.659523 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.659526 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st" event={"ID":"1c44db84-ea9d-4bb7-804d-7d8c29fc2b75","Type":"ContainerStarted","Data":"9423cb8d4258b495b88c2cc9b60bb845fb8d7ed0872d327dc2552763a37c9b35"}
Apr 16 21:02:48.659722 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.659627 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:48.676365 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:48.676326 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st" podStartSLOduration=1.6763120900000001 podStartE2EDuration="1.67631209s" podCreationTimestamp="2026-04-16 21:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:02:48.675357413 +0000 UTC m=+2123.646942403" watchObservedRunningTime="2026-04-16 21:02:48.67631209 +0000 UTC m=+2123.647897080"
Apr 16 21:02:50.221329 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:50.221304 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-s64xd_8171246b-7193-44d0-9905-129be29085dd/dns/0.log"
Apr 16 21:02:50.239818 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:50.239793 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-s64xd_8171246b-7193-44d0-9905-129be29085dd/kube-rbac-proxy/0.log"
Apr 16 21:02:50.351824 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:50.349970 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ltdrj_10df2057-e291-4486-a0e1-89f26c4dbee9/dns-node-resolver/0.log"
Apr 16 21:02:50.832581 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:50.832556 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l6t5h_ae461ccf-29e2-4d21-9a23-605ab7aac065/node-ca/0.log"
Apr 16 21:02:51.899707 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:51.899682 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-72z8n_03faef76-6068-411c-8010-ca4f5dcfafe0/discovery/0.log"
Apr 16 21:02:51.918904 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:51.918881 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-86666bf97-6kwjn_9fe0099e-dd6d-420b-aba3-0e68ed045212/kube-auth-proxy/0.log"
Apr 16 21:02:52.570225 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:52.570198 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fn9dd_45d5c33b-d7b8-47d4-a44d-4e9b49120004/serve-healthcheck-canary/0.log"
Apr 16 21:02:53.162519 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:53.162495 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m9lzt_a3949667-fdd7-46b8-822b-3b5fcf7c291e/kube-rbac-proxy/0.log"
Apr 16 21:02:53.183738 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:53.183706 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m9lzt_a3949667-fdd7-46b8-822b-3b5fcf7c291e/exporter/0.log"
Apr 16 21:02:53.205293 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:53.205271 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m9lzt_a3949667-fdd7-46b8-822b-3b5fcf7c291e/extractor/0.log"
Apr 16 21:02:54.672877 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:54.672848 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-hz7st"
Apr 16 21:02:55.212005 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:55.211982 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-cqtn2_c16e2366-e1de-43df-9748-293440238351/manager/0.log"
Apr 16 21:02:55.253694 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:55.253668 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-664d5d4d98-jtpgq_e473e5bc-9222-4108-ab8f-d7caeffcfb06/maas-api/0.log"
Apr 16 21:02:55.327385 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:55.327366 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-czrzg_7ec644c2-6575-433d-bff7-4602a99b9453/manager/1.log"
Apr 16 21:02:55.349423 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:55.349402 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-czrzg_7ec644c2-6575-433d-bff7-4602a99b9453/manager/2.log"
Apr 16 21:02:55.454202 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:55.454183 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cc777b675-fkh94_83f05402-452a-4fe4-b250-fb4c7575b53c/manager/0.log"
Apr 16 21:02:55.478388 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:02:55.478306 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-n7bdg_59849ccb-da40-46da-bade-0bf06395a88f/postgres/0.log"
Apr 16 21:03:02.697319 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:02.697286 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8v2qr_8ab3b913-84db-4ba1-b661-7f31e898d4c1/kube-multus/0.log"
Apr 16 21:03:02.888404 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:02.888381 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v5lrp_151f9e87-8756-4668-8ff7-d2417f6d4658/kube-multus-additional-cni-plugins/0.log"
Apr 16 21:03:02.907845 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:02.907818 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v5lrp_151f9e87-8756-4668-8ff7-d2417f6d4658/egress-router-binary-copy/0.log"
Apr 16 21:03:02.930503 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:02.930485 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v5lrp_151f9e87-8756-4668-8ff7-d2417f6d4658/cni-plugins/0.log"
Apr 16 21:03:02.955093 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:02.955036 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v5lrp_151f9e87-8756-4668-8ff7-d2417f6d4658/bond-cni-plugin/0.log"
Apr 16 21:03:02.975993 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:02.975973 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v5lrp_151f9e87-8756-4668-8ff7-d2417f6d4658/routeoverride-cni/0.log"
Apr 16 21:03:02.998683 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:02.998666 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v5lrp_151f9e87-8756-4668-8ff7-d2417f6d4658/whereabouts-cni-bincopy/0.log"
Apr 16 21:03:03.019843 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:03.019824 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v5lrp_151f9e87-8756-4668-8ff7-d2417f6d4658/whereabouts-cni/0.log"
Apr 16 21:03:03.347775 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:03.347734 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tjgd4_931ff401-6150-4e87-828a-2e3a9242e1bc/network-metrics-daemon/0.log"
Apr 16 21:03:03.367090 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:03.367050 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tjgd4_931ff401-6150-4e87-828a-2e3a9242e1bc/kube-rbac-proxy/0.log"
Apr 16 21:03:04.755504 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:04.755461 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsr25_e4de5fe5-38a8-4fca-b8aa-9b340811b2f5/ovn-controller/0.log"
Apr 16 21:03:04.793024 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:04.792999 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsr25_e4de5fe5-38a8-4fca-b8aa-9b340811b2f5/ovn-acl-logging/0.log"
Apr 16 21:03:04.815687 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:04.815667 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsr25_e4de5fe5-38a8-4fca-b8aa-9b340811b2f5/kube-rbac-proxy-node/0.log"
Apr 16 21:03:04.836596 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:04.836564 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsr25_e4de5fe5-38a8-4fca-b8aa-9b340811b2f5/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 21:03:04.860256 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:04.860225 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsr25_e4de5fe5-38a8-4fca-b8aa-9b340811b2f5/northd/0.log"
Apr 16 21:03:04.878892 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:04.878873 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsr25_e4de5fe5-38a8-4fca-b8aa-9b340811b2f5/nbdb/0.log"
Apr 16 21:03:04.898953 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:04.898927 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsr25_e4de5fe5-38a8-4fca-b8aa-9b340811b2f5/sbdb/0.log"
Apr 16 21:03:05.060599 ip-10-0-137-53 kubenswrapper[2578]: I0416 21:03:05.060540 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsr25_e4de5fe5-38a8-4fca-b8aa-9b340811b2f5/ovnkube-controller/0.log"